by Maiclaire Bolton
Recent changes in earthquake hazard science in California are transforming the way we think about managing earthquake risk. The Uniform California Earthquake Rupture Forecast (UCERF) is a multi-year project by consortia of leading scientists, known as the Working Group on California Earthquake Probabilities (WGCEP), which offers a consensus-based view of earthquake hazard across California.
The latest iteration of UCERF, version 3 (UCERF3), was released in 2014 (a time-independent view of hazard) and 2015 (a time-dependent view of hazard). A key objective of UCERF3 was to define the frequency and distribution of earthquakes, both on and off seismically active faults, across California.
Understanding what has changed
While re-evaluating the magnitude-frequency distribution of earthquakes in California, the UCERF3 team of experts sought a distribution that matched the historical record more closely, or what's known in the seismological community as a pure Gutenberg-Richter relationship.
In the previous version of the model (UCERF2), the magnitude-frequency distribution deviated from a Gutenberg-Richter relationship, particularly within the magnitude 6.5 to 7.5 range. There was a notable increase in the frequency of these magnitudes, forming a bulge in the magnitude-frequency distribution curve. The UCERF3 expert committee aimed to reduce this bulge while maintaining the same total seismic moment, or total energy release, across all potential earthquakes. The only way to achieve this was to relax fault segmentation, meaning they had to allow earthquakes to rupture across multiple segments of a fault, or across multiple faults in close proximity to each other.
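To make the Gutenberg-Richter relationship concrete, here is a minimal sketch of how annual earthquake rates fall off with magnitude under that relationship. The a- and b-values below are illustrative placeholders, not UCERF parameters.

```python
def gr_annual_rate(magnitude, a=4.0, b=1.0):
    """Annual rate of earthquakes at or above a given magnitude under
    the Gutenberg-Richter relationship: log10(N) = a - b * M.
    The a- and b-values are illustrative, not calibrated to California."""
    return 10 ** (a - b * magnitude)

# Each half-unit step in magnitude cuts the annual rate by ~10^0.5,
# which is the smooth decay the UCERF3 bulge deviated from.
for m in (6.5, 7.0, 7.5):
    print(f"M >= {m}: {gr_annual_rate(m):.5f} events/year")
```

With b = 1, every whole unit of magnitude corresponds to a tenfold drop in frequency, so a surplus of magnitude 6.5 to 7.5 events shows up clearly as a bulge above this smooth curve.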
What prompted the change?
It’s important to understand why UCERF3 introduced the possibility of multiple-segment and multiple-fault ruptures. The change is based on lessons learned from recent damaging earthquakes around the world. The fundamental lesson from these earthquakes is that faults are more interconnected at depth, in terms of their rupture and seismic potential, than previously thought.
This lesson was emphasized, very tragically, by the 2011 Tohoku-oki earthquake in Japan. Prior to this event, the largest earthquake thought to be possible was a magnitude 8.2 that would rupture only the shallowest part of the mega-thrust fault plane off the northeast coast of Honshu. However, the earthquake that actually occurred cascaded through six different segments of the fault plane, and this multiple-segment rupture resulted in an unprecedented magnitude 9.0 earthquake, the largest ever recorded in Japan.
What it means in terms of insured losses
From an insured loss perspective, the decrease in frequency of magnitude 6.5 to 7.5 events lowers modeled losses at short return periods (i.e., those associated with 50- to 100-year loss levels). Conversely, the increase in frequency of large events, magnitude 7.5 and greater, dramatically raises losses at longer return periods, meaning the largest events have the potential to be much more damaging than previously thought.
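The link between return periods and loss probabilities can be sketched as follows, assuming Poisson (time-independent) occurrence. The function names and the 250-year example are illustrative, not output from any catastrophe model.

```python
import math

def annual_exceedance_prob(return_period_years):
    """Annual probability of meeting or exceeding the loss level
    associated with a given return period (Poisson assumption)."""
    return 1.0 - math.exp(-1.0 / return_period_years)

def prob_in_window(return_period_years, window_years):
    """Probability of at least one exceedance of that loss level
    within a multi-year time window (Poisson assumption)."""
    return 1.0 - math.exp(-window_years / return_period_years)

# Even a long-return-period loss level has a meaningful chance of
# being reached within a typical 50-year planning horizon.
print(f"{prob_in_window(250, 50):.3f}")  # ~0.181
```

Shifting frequency from moderate events to rarer, larger ones leaves the short-window probabilities above nearly unchanged while raising the losses attached to the long return periods, which is why the UCERF3 change matters most at higher return periods.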
The US Earthquake Model from CoreLogic is one of the first to incorporate this important new consensus-based view of earthquake hazard for California, along with new features for liquefaction, landslide, and basin modeling, and both time-dependent and time-independent probabilities of occurrence. Using this new science allows for more accurate analysis of earthquake risk, which will help the insurance industry and communities worldwide assess and prepare for potential earthquake damage.