47 results for "Page, Morgan"
Search Results
2. Developing, Testing, and Communicating Earthquake Forecasts: Current Practices and Future Directions.
- Author
-
Mizrahi, Leila, Dallo, Irina, van der Elst, Nicholas J., Christophersen, Annemarie, Spassiani, Ilaria, Werner, Maximilian J., Iturrieta, Pablo, Bayona, José A., Iervolino, Iunio, Schneider, Max, Page, Morgan T., Zhuang, Jiancang, Herrmann, Marcus, Michael, Andrew J., Falcone, Giuseppe, Marzocchi, Warner, Rhoades, David, Gerstenberger, Matt, Gulia, Laura, and Schorlemmer, Danijel
- Subjects
EARTHQUAKE prediction, SEISMIC event location, TEST methods, PROBABILITY theory, FORECASTING
- Abstract
While deterministically predicting the time and location of earthquakes remains impossible, earthquake forecasting models can provide estimates of the probabilities of earthquakes occurring within some region over time. To enable informed decision‐making by civil protection, governmental agencies, or the public, Operational Earthquake Forecasting (OEF) systems aim to provide authoritative earthquake forecasts based on current earthquake activity in near‐real time. Establishing OEF systems involves several nontrivial choices. This review captures the current state of OEF worldwide and analyzes expert recommendations on the development, testing, and communication of earthquake forecasts. An introductory summary of OEF‐related research is followed by a description of OEF systems in Italy, New Zealand, and the United States. Combined, these two parts provide an informative and transparent snapshot of today's OEF landscape. In Section 4, we analyze the results of an expert elicitation that was conducted to seek guidance for the establishment of OEF systems. The elicitation identifies consensus and dissent on OEF issues among a non‐representative group of 20 international earthquake forecasting experts. While the experts agree that communication products should be developed in collaboration with the forecast user groups, they disagree on whether forecasting models and testing methods should be user‐dependent. No recommendations of strict model requirements could be elicited, but benchmark comparisons, prospective testing, reproducibility, and transparency are encouraged. Section 5 gives an outlook on the future of OEF. Besides covering recent research on earthquake forecasting model development and testing, upcoming OEF initiatives are described in the context of the expert elicitation findings. Plain Language Summary: The exact location, time, and magnitude of future earthquakes cannot be predicted. However, based on past earthquake sequences, it is possible to assess probabilities for future earthquakes. This is called earthquake forecasting. Operational Earthquake Forecasting (OEF) systems are designed to provide near‐real‐time authoritative earthquake forecasts, based on current earthquake activity, to aid the decision‐making of various societal stakeholders. Setting up these systems is complex, involving decisions about which model to use, how to best test the model, and how to turn earthquake probability estimates into practical information. This review captures the current state of OEF worldwide and analyzes expert recommendations on the development, testing, and communication of earthquake forecasts. Section 2 provides an overview of OEF‐related research and the background knowledge required to understand the other parts. Section 3 describes existing OEF systems of Italy, New Zealand, and the United States in detail. Section 4 discusses an elicitation of expert views on modeling, testing, and communicating earthquake forecasts (Mizrahi, Dallo, & Kuratle, 2023, https://doi.org/10.3929/ethz‐b‐000637239). Data from the elicitation allow us to identify consensus and dissent on OEF issues and provide guidance for future earthquake forecasting efforts. Finally, Section 5 gives an outlook on future OEF‐related research and planned OEF efforts at various institutions.
Key Points: We capture the state of earthquake forecasting systems in Italy, New Zealand, and the United States, and future plans in these and other countries. Experts encourage benchmark comparison, prospective testing, reproducibility, and transparency, but avoid endorsing specific models or tests. Experts stress the need to co‐design forecast communication products with end‐users to ensure their societal relevance and usefulness. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Aftershock Forecasting.
- Author
-
Hardebeck, Jeanne L., Llenos, Andrea L., Michael, Andrew J., Page, Morgan T., Schneider, Max, and van der Elst, Nicholas J.
- Subjects
EARTHQUAKE aftershocks, EARTHQUAKES, FORECASTING, STATISTICAL models, STATISTICAL matching, MACHINE learning, EARTHQUAKE prediction, BUILDING failures
- Abstract
Aftershocks can compound the impacts of a major earthquake, disrupting recovery efforts and potentially further damaging weakened buildings and infrastructure. Forecasts of the probability of aftershocks can therefore aid decision-making during earthquake response and recovery. Several countries issue authoritative aftershock forecasts. Most aftershock forecasts are based on simple statistical models that were first developed in the 1980s and remain the best available models. We review these statistical models and the wide-ranging research to advance aftershock forecasting through better statistical, physical, and machine-learning methods. Physics-based forecasts based on mainshock stress changes can sometimes match the statistical models in testing but do not yet outperform them. Physical models are also hampered by unsolved problems such as the mechanics of dynamic triggering and the influence of background conditions. Initial work on machine-learning forecasts shows promise, and new machine-learning earthquake catalogs provide an opportunity to advance all types of aftershock forecasts. Several countries issue real-time aftershock forecasts following significant earthquakes, providing information to aid response and recovery. Statistical models based on past aftershocks are used to compute aftershock probability as a function of space, time, and magnitude. Aftershock forecasting is advancing through better statistical models, constraints on physical triggering mechanisms, and machine learning. Large high-resolution earthquake catalogs provide an opportunity to advance physical, statistical, and machine-learning aftershock models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
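The "simple statistical models" from the 1980s referenced in this abstract are typified by Reasenberg and Jones (1989), which combines a Gutenberg-Richter magnitude distribution with Omori-Utsu temporal decay. As a minimal sketch of that rate model (the default parameter values below are the generic California values quoted in that paper, shown for illustration rather than as the USGS operational settings):

```python
import numpy as np
from scipy.integrate import quad

def reasenberg_jones_rate(t, m, mainshock_mag, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Rate (events/day) of aftershocks with magnitude >= m at time t (days)
    after a mainshock, following Reasenberg & Jones (1989)."""
    return 10.0 ** (a + b * (mainshock_mag - m)) * (t + c) ** (-p)

# Expected number of M>=5 aftershocks in the week after an M7 mainshock,
# and the Poisson probability of at least one such event.
n_expected, _ = quad(reasenberg_jones_rate, 0.0, 7.0, args=(5.0, 7.0))
p_at_least_one = 1.0 - np.exp(-n_expected)
```

Forecast probabilities of this form are what the statistical, physics-based, and machine-learning approaches reviewed in the paper all ultimately compete to improve.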
4. Queer Suicidality, Conflict, and Repair
- Author
-
Page, Morgan M. and Schulman, Sarah
- Published
- 2017
- Full Text
- View/download PDF
5. Developing guidance to communicate global aftershock forecasts
- Author
-
Schneider, Max, McBride, Sara K., van der Elst, Nicholas, Hardebeck, Jeanne, Michael, Andrew, and Page, Morgan
- Abstract
The U.S. Geological Survey (USGS) is responsible for public aftershock forecasts following US earthquakes. An automated system produces forecasts for most M5+ earthquakes. While this system is not operational for earthquakes outside the US, the USGS has received requests for forecasts following damaging earthquakes worldwide, particularly those with a high number of fatalities (orange or red level on the PAGER scale). However, aftershock forecasting globally has the inherent challenge of communication across different languages and cultures. Further, aftershock forecasts made from outside the affected region can be a challenge for local science communicators, who may be asked about a forecast they are not familiar with themselves. Effective communication of aftershock forecasts for earthquakes across the world requires developing products that can serve non-English speakers, and providing local science communicators with tools to help them respond to questions about the forecasts. To support the communication of aftershock forecasts globally, the USGS is developing additional public tools for local science communicators. A communication guide will accompany the forecast template and will be translated into multiple languages. To develop this communication guide, we are facilitating meetings with science communicators in different countries to solicit feedback on its components. Additionally, information regarding protective action will be updated. The USGS currently recommends “Drop, Cover, and Hold On”, which may not be appropriate in countries with poorly constructed buildings. By developing additional communication tools, aftershock forecasting will be more effective and accessible to reduce seismic risk worldwide.
The 28th IUGG General Assembly (IUGG2023) (Berlin 2023)
- Published
- 2023
6. The Limits of Earthquake Early Warning Accuracy and Best Alerting Strategy
- Author
-
Minson, Sarah E., Baltay, Annemarie S., Cochran, Elizabeth S., Hanks, Thomas C., Page, Morgan T., McBride, Sara K., Milner, Kevin R., and Meier, Men-Andrin
- Published
- 2019
- Full Text
- View/download PDF
7. a‐Positive: A Robust Estimator of the Earthquake Rate in Incomplete or Saturated Catalogs.
- Author
-
van der Elst, Nicholas J. and Page, Morgan T.
- Subjects
EARTHQUAKE aftershocks, EARTHQUAKES, EARTHQUAKE hazard analysis, CATALOGS, SEISMOGRAMS, NATURAL disaster warning systems, CATALOGING
- Abstract
Detection thresholds in earthquake catalogs frequently change in time due to station coverage improvements and network saturation effects during active periods such as mainshock‐aftershock cascades. This presents a challenge to seismicity‐rate estimation; there is a tradeoff between using as low a minimum magnitude as possible to maximize data while not undercounting the rate due to catalog incompleteness. Here we present a simple method, "a‐positive," which makes use of differential statistics to robustly estimate the seismicity rate in catalogs with time‐varying detection thresholds. We demonstrate the effectiveness of this method for a centuries‐long, hybrid earthquake catalog with both historical and instrumentally‐detected earthquakes in the Central and Eastern U.S., as well as for the 2019 Ridgecrest aftershock sequence in California, which has rapid changes in completeness due to network saturation. We find that the a‐positive method leads to more precise and less biased estimates of seismicity rate than traditional methods. In addition, with our improved estimate of earthquake rate early in the aftershock cascade, we find no evidence of rate‐saturation at short times from the mainshock; that is, the Omori c‐value is not distinguishable from zero. Plain Language Summary: The seismicity rate is a fundamental aspect of statistical seismology, and it is used to create long‐term models that inform building codes and seismic hazard assessments. However, earthquake catalogs are imperfect records of earthquakes, due to both changing network coverage and short‐term aftershock incompleteness, where large earthquakes obscure smaller ones in times of high activity. Here we present a new method, "a‐positive," to estimate the seismicity rate in catalogs with time‐varying detection thresholds. This method uses differential statistics to estimate the seismicity rate from interevent times, where interevent times are measured to the next larger earthquake, with no reference to a completeness magnitude. The "a‐positive" method provides more precise and less biased estimates of seismicity rate than traditional methods. The method has been tested on a centuries‐long, hybrid earthquake catalog with both historical and instrumentally‐detected earthquakes in the Central and Eastern U.S. and on the 2019 Ridgecrest aftershock sequence in California, which has rapid changes in completeness due to short‐term aftershock incompleteness. The results show that the "a‐positive" method is minimally sensitive to catalog incompleteness and provides an unbiased estimate of the earthquake rate even in the first moments of an aftershock sequence. Key Points: Earthquake catalogs are imperfect records of the earthquake rate, suffering from both network sparseness and short‐term rate saturation. "a‐positive" estimates seismicity rate based on the time to the next larger earthquake, with no reference to completeness magnitude. Unbiased measurement of the aftershock rate shows no evidence for rate saturation at early times. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
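The abstract above describes estimating rate from the waiting time between each event and the next larger one. The following loose sketch illustrates that idea under a Gutenberg-Richter scaling assumption; the function name, the censoring treatment, and the simple averaging are stand-ins of my own, not the published a-positive estimator:

```python
import numpy as np

def a_positive_rate_sketch(times, mags, m_ref=3.0, b=1.0):
    """Toy rate estimate from times-to-next-larger-event, which needs no
    completeness magnitude: an event of magnitude m is rarely hidden by the
    coda of a smaller event, so these waiting times stay usable even when
    the catalog is incomplete below m."""
    order = np.argsort(times)
    times, mags = np.asarray(times, float)[order], np.asarray(mags, float)[order]
    scaled_rates = []
    for i in range(len(times) - 1):
        larger = np.nonzero(mags[i + 1:] > mags[i])[0]
        if larger.size == 0:
            continue  # censored: no larger event follows (handled crudely here)
        dt = times[i + 1 + larger[0]] - times[i]
        # Gutenberg-Richter rescaling: rate(>=m_ref) = rate(>=m) * 10**(b*(m - m_ref))
        scaled_rates.append((1.0 / dt) * 10.0 ** (b * (mags[i] - m_ref)))
    return np.mean(scaled_rates) if scaled_rates else np.nan
```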
8. Doubleness
- Author
-
Page, Morgan M
- Published
- 2019
9. The New Madrid Seismic Zone: Not Dead Yet
- Author
-
Page, Morgan T. and Hough, Susan E.
- Published
- 2014
- Full Text
- View/download PDF
10. Apparent earthquake rupture predictability.
- Author
-
Meier, Men-Andrin, Ampuero, Jean-Paul, Cochran, Elizabeth, and Page, Morgan
- Subjects
EARTHQUAKES
- Abstract
To what extent can the future evolution of an ongoing earthquake rupture be predicted? This question of fundamental scientific and practical importance has recently been addressed by studies of teleseismic source time functions (STFs), which reached contrasting conclusions. One study concludes that the initial portion of STFs is the same regardless of magnitude. Another study concludes that the rate at which earthquakes grow increases systematically and strongly with final event magnitudes. Here, we show that the latter reported trend is caused by a selection bias towards events with unusually long durations and by estimates of STF growth made when the STF is already decaying. If these invalid estimates are left out, the trend is no longer present, except during the first few seconds of the smallest events in the data set, Mw 5–6.5, for which the reliability of the STF amplitudes is questionable. Simple synthetic tests show that the observations are consistent with statistically indistinguishable growth of smaller and larger earthquakes. A much weaker trend is apparent among events of comparable duration, but we argue that its significance is not resolvable by the current data. Finally, we propose a nomenclature to facilitate further discussions of earthquake rupture predictability and determinism. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
11. More Fault Connectivity Is Needed in Seismic Hazard Analysis.
- Author
-
Page, Morgan T.
- Abstract
Did the third Uniform California Earthquake Rupture Forecast (UCERF3) go overboard with multifault ruptures? Schwartz (2018) argues that there are too many long ruptures in the model. Here, I address his concern and show that the UCERF3 rupture-length distribution matches empirical data. I also present evidence that, if anything, the UCERF3 model could be improved by adding more connectivity to the fault system. Adding more connectivity would improve model misfits with data, particularly with paleoseismic data on the southern San Andreas fault; make the model less characteristic on the faults; potentially improve aftershock forecasts; and reduce model sensitivity to inadequacies and unknowns in the modeled fault system. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
12. Generalizing the Inversion-Based PSHA Source Model for an Interconnected Fault System.
- Author
-
Field, Edward H., Milner, Kevin R., and Page, Morgan T.
- Abstract
This article represents a step toward generalizing and simplifying the procedure for constructing an inversion-based seismic hazard source model for an interconnected fault system, including the specification of adjustable segmentation constraints. A very simple example is used to maximize understandability and to counter the notion that an inversion approach is only applicable when an abundance of data is available. Also exemplified is how to construct a range of models to adequately represent epistemic uncertainties (which should be a high priority in any hazard assessment). Opportunity is also taken to address common concerns and misunderstandings associated with the third Uniform California Earthquake Rupture Forecast, including the seemingly disproportionate number of large-magnitude events, and how well hazard is resolved given the overall problem is very underdetermined. However, the main aim of this article is to provide a general protocol for constructing such models. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
13. Revisiting California's Past Great Earthquakes and Long-Term Earthquake Rate.
- Author
-
Hough, Susan E., Page, Morgan, Salditch, Leah, Gallahue, Molly M., Lucas, Madeleine C., Neely, James S., and Stein, Seth
- Abstract
In this study, we revisit the three largest historical earthquakes in California--the 1857 Fort Tejon, 1872 Owens Valley, and 1906 San Francisco earthquakes--to review their published moment magnitudes, and compare their estimated shaking distributions with predictions using modern ground-motion models (GMMs) and ground-motion intensity conversion equations. Currently accepted moment magnitude estimates for the three earthquakes are 7.9, 7.6, and 7.8, respectively. We first consider the extent to which the intensity distributions of all three earthquakes are consistent with a moment magnitude toward the upper end of the estimated range. We then apply a GMM-based method to estimate the magnitudes of large historical earthquakes. The intensity distribution of the 1857 earthquake is too sparse to provide a strong constraint on magnitude. For the 1872 earthquake, consideration of all available constraints suggests that it was a high stress-drop event, with a magnitude on the higher end of the range implied by scaling relationships, that is, higher than moment magnitude 7.6. For the 1906 earthquake, based on our analysis of regional intensities and the detailed intensity distribution in San Francisco, along with other available constraints, we estimate a preferred moment magnitude of 7.9, consistent with the published estimate based on geodetic and instrumental seismic data. These results suggest that, although there can be a tendency for historical earthquake magnitudes to be overestimated, the accepted catalog magnitudes of California's largest historical earthquakes could be too low. Given the uncertainties of the magnitude estimates, the seismic moment release rate between 1850 and 2019 could have been either higher or lower than the average over millennial time scales. It is further not possible to reject the hypothesis that California seismicity is described by an untruncated Gutenberg-Richter distribution with a b-value of 1.0 for moment magnitudes up to 8.0. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
14. Ms. Adichie: There's no single story on trans women
- Author
-
Page, Morgan M.
- Subjects
Dear Ijeawele, or A Feminist Manifesto in Fifteen Suggestions (Nonfiction work) -- Authorship, Writers -- Interviews, General interest, News, opinion and commentary
- Abstract
Byline: MORGAN M. PAGE; Special to The Globe and Mail
Who has the power to decide what stories get told? This is one of the central questions acclaimed Nigerian writer [...]
- Published
- 2017
15. The Wondrous and the Wicked
- Author
-
Page Morgan
- Subjects
- Demonology--Juvenile fiction, Gargoyles--Juvenile fiction, Paranormal fiction, Sisters--Juvenile fiction
- Abstract
For fans of Lauren Kate's Fallen series comes the exciting conclusion to the trilogy that includes The Beautiful and the Cursed and The Lovely and the Lost. The Waverly sisters must save themselves before all is lost. Since the Waverlys arrived in Paris, the streets have grown more fearsome by the day. As Ingrid learns to master her lectrux gift, she must watch Axia's power grow strong enough to extend beyond her Underneath hive. By all indications, the fallen angel's Harvest is near--and the timing couldn't be worse. Targeted by vengeful gargoyles, Gabby has been exiled to London for her own protection. Meanwhile, the gargoyle castes are in disarray, divided between those who want Luc to lead them and those who resent him and his fondness for humans. The Alliance is crumbling from the inside as well, its members turning against one another, and possibly against the Waverlys, too. Axia has promised that the world will burn. And now, unable to trust the Alliance, separated from Luc, Gabby, and her twin, Grayson, Ingrid is left to face the demon uprising alone. Praise for The Wondrous and the Wicked: “A satisfying, exciting wrap-up to an immersive paranormal series.”—SLJ Praise for the Dispossessed Trilogy: “A deliciously satisfying mix of historical fiction, mystery, and supernatural romance.”—The Bulletin “Morgan combines fantasy with gothic romance in this well-crafted standout.”—Booklist “Forbidden romance and hot kissing abound.”—Kirkus Reviews “Morgan keeps the plot moving with constant action…dark adventure and romance.”—SLJ “Morgan's fluid descriptions, inventive otherworldly elements, and characters with convincing motivations result in an immersive first installment.”—Publishers Weekly
- Published
- 2015
16. The Lovely and the Lost
- Author
-
Page Morgan
- Subjects
- Children's stories
- Abstract
For readers of Lauren Kate's Fallen series comes the sequel to The Beautiful and the Cursed. The Lovely and the Lost finds the Waverly sisters in mortal danger and able to trust no one. Ingrid and Gabby Waverly moved to France expecting a quiet reprieve from London gossip, but the truth they face in their new home has a sharper--and deadlier--sting. Paris is plagued by an underworld of demons and gargoyles who all seem to want something from the Waverly girls. Saving Ingrid's twin, Grayson, from the fallen angel Axia nearly killed them. And they're still being hunted--only this time, demons aren't their only predators. Ingrid's blood is special: it bestows the power to command gargoyles. It's an ability no other human has, and in the wrong hands, it could be used to send her cursed guardian, Luc, and his fellow Dispossessed to extinction. There are those who will do anything to get Ingrid's blood--and they see no value in human life. The Alliance has vowed to protect the Waverlys, and a new gargoyle has been assigned to guard their abbey home alongside Luc. But no one can watch over Ingrid, Gabby, and Grayson all the time--which means the three must learn to fight for themselves. Because darkness follows the Waverlys. And sometimes darkness comes in the form you trust the most. Praise for the Dispossessed Trilogy: “A deliciously satisfying mix of historical fiction, mystery, and supernatural romance.”—The Bulletin “Morgan combines fantasy with gothic romance in this well-crafted standout.”—Booklist “Forbidden romance and hot kissing abound.”—Kirkus Reviews “Morgan keeps the plot moving with constant action…dark adventure and romance.”—School Library Journal “Morgan's fluid descriptions, inventive otherworldly elements, and characters with convincing motivations result in an immersive first installment.”—Publishers Weekly
- Published
- 2014
17. Peak Ground Displacement Saturates Exactly When Expected: Implications for Earthquake Early Warning.
- Author
-
Trugman, Daniel T., Page, Morgan T., Cochran, Elizabeth S., and Minson, Sarah E.
- Subjects
EARTHQUAKES, EARTHQUAKE hazard analysis, GEOPHYSICS, SEISMOLOGY
- Abstract
The scaling of rupture properties with magnitude is of critical importance to earthquake early warning systems that rely on source characterization using limited snapshots of waveform data. ShakeAlert, a prototype earthquake early warning system that is being developed for the western United States, provides real‐time estimates of earthquake magnitude based on P wave peak ground displacements measured at stations triggered by the event. The algorithms used in ShakeAlert assume that the displacement measurements at each station are statistically independent and that there exists a linear and time‐independent relation between log peak ground displacement and earthquake magnitude. Here we challenge this basic assumption using the largest data set assembled for this purpose to date: a comprehensive database of more than 140,000 vertical‐component waveforms from M4.5 to M9 earthquakes occurring near Japan from 1997 through 2018 and recorded by the K‐NET and KiK‐net strong‐motion networks. By analyzing the time evolution of P wave peak ground displacements for these earthquakes, we show that there is a break, or saturation, in the magnitude‐displacement scaling that depends on the length of the measurement time window. We demonstrate that the magnitude at which this saturation occurs is well‐explained by a simple and nondeterministic model of earthquake rupture growth. We then use the predictions of this saturation model to develop a Bayesian framework for estimating posterior uncertainties in real‐time magnitude estimates. Key Points: We analyze P wave peak displacements (Pd) of magnitude M4.5‐9 earthquakes in Japan from 1997 to 2018. Time‐dependent saturation in the linear scaling between log10 Pd and magnitude is consistent with nondeterministic rupture. We develop a Bayesian framework for rapid calculations of time‐dependent uncertainties in real‐time magnitude estimates. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
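The window-length-dependent saturation described above follows from rupture durations exceeding the measurement window. A toy illustration, in which every constant is a placeholder rather than a value calibrated in this study:

```python
import numpy as np

def magnitude_from_pd_sketch(log10_pd, window_s, slope=1.0, intercept=-6.0,
                             dur_slope=0.5, dur_intercept=-2.7):
    """Toy magnitude estimate from log10 P-wave peak displacement (Pd) that
    caps the estimate at a window-dependent saturation magnitude."""
    # Naive linear scaling: log10(Pd) = slope * M + intercept
    m_linear = (log10_pd - intercept) / slope
    # If rupture duration scales as log10(T) = dur_slope * M + dur_intercept,
    # a window of window_s seconds can only resolve magnitudes up to m_sat.
    m_sat = (np.log10(window_s) - dur_intercept) / dur_slope
    return min(m_linear, m_sat)
```

The paper's contribution is to show that the observed saturation magnitude tracks this kind of duration argument, and to fold the resulting uncertainty into a Bayesian magnitude estimate.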
18. The Beautiful and the Cursed: Marco's Story
- Author
-
Page Morgan
- Subjects
- Paranormal fiction, Gargoyles--Fiction, Sisters--Fiction
- Abstract
For readers of Lauren Kate's Fallen series comes a digital original short story set in the world of The Beautiful and the Cursed that follows Ingrid and Gabby Waverly and the terrifying forces determined to take their lives. Read it before the sequel, The Lovely and the Lost, is available in 2014 from Delacorte Press. Praise for The Beautiful and the Cursed: “A deliciously satisfying mix of historical fiction, mystery, and supernatural romance.”—The Bulletin “Morgan's fluid descriptions, inventive otherworldly elements, and characters with convincing motivations result in an immersive first installment.”—Publishers Weekly “Morgan combines fantasy with gothic romance in this well-crafted standout.”—Booklist “A sexy red dress…forbidden romance and hot kissing abound.”—Kirkus Reviews “Morgan keeps the plot moving with constant action…dark adventure and romance.”—School Library Journal
- Published
- 2013
19. The Beautiful and the Cursed
- Author
-
Page Morgan
- Subjects
- Supernatural--Fiction, Gargoyles--Fiction, Sisters--Fiction
- Abstract
Fans of Cassandra Clare's Mortal Instruments series and Lauren Kate's Fallen novels will devour The Beautiful and the Cursed, a wholly original interpretation of gargoyle lore. After a bizarre accident, Ingrid Waverly is forced to leave London with her mother and her younger sister, Gabby, trading a world full of fancy dresses and society events for the unfamiliar city of Paris. In Paris there are no grand balls or glittering parties for Ingrid, and, disturbingly, the house her twin brother, Grayson, was sent ahead to secure for the family isn't a house at all. It's an abandoned abbey, its roof lined with stone gargoyles that could almost be mistaken for living, breathing creatures. And Grayson is missing. Yet no one seems worried about his whereabouts save for Luc, a devastatingly handsome servant at their new home. Ingrid is sure her twin isn't dead--she can feel it deep in her soul--but she knows he's in grave danger, and that it's up to her and Gabby to find him before all hope is lost. The path to Grayson will be twisted, leading Ingrid to discover dark secrets and otherworldly truths that, once uncovered, can never again be buried. Praise for the Dispossessed Trilogy: “A deliciously satisfying mix of historical fiction, mystery, and supernatural romance.”—The Bulletin “Morgan combines fantasy with gothic romance in this well-crafted standout.”—Booklist “Forbidden romance and hot kissing abound.”—Kirkus Reviews “Morgan keeps the plot moving with constant action…dark adventure and romance.”—School Library Journal “Morgan's fluid descriptions, inventive otherworldly elements, and characters with convincing motivations result in an immersive first installment.”—Publishers Weekly
- Published
- 2013
20. 7 Steps to better EDM mixes
- Author
-
Page, Morgan
- Subjects
Business, General interest, Business, international, News, opinion and commentary
- Abstract
'Fight for You,' the lead single of my new album, is a great example of how you can stack keyboards to create a rich sound. Using only one synth is [...]
- Published
- 2009
21. Turing-Style Tests for UCERF3 Synthetic Catalogs.
- Author
-
Page, Morgan T. and van der Elst, Nicholas J.
- Abstract
Epidemic-type aftershock sequence (ETAS) catalogs generated from the third Uniform California Earthquake Rupture Forecast (UCERF3) model are unique in that they are the first to combine a complex, fault-based long-term forecast with short-term earthquake clustering statistics. We present Turing-style tests to examine whether these synthetic catalogs can successfully imitate observed earthquake behavior in California. We find that UCERF3-ETAS is more spatially diffuse than the observed historic catalog in California and that it lacks quiet periods that are present in the real catalog. Although mean aftershock productivity of the observed catalog is matched closely by UCERF3-ETAS, the real catalog has more intersequence productivity variability, and its small mainshocks have more foreshocks. In sum, we find that UCERF3-ETAS differs from the observed catalog in ways that are foreseeable from its modeling simplifications. The tests we present here can be used on any model that produces suites of synthetic catalogs; as such, in addition to providing avenues for future improvements to the model, they could be incorporated into testing platforms such as the Collaboratory for the Study of Earthquake Predictability (CSEP). [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
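A Turing-style test of the kind described above asks whether the observed catalog can be distinguished from the model's synthetic catalogs on a chosen summary statistic. A minimal sketch using interevent times and a two-sample Kolmogorov-Smirnov test; the paper's actual suite of statistics (spatial diffusivity, quiet periods, productivity variability, foreshock fractions) is broader:

```python
import numpy as np
from scipy.stats import ks_2samp

def turing_style_test(observed_times, synthetic_catalogs):
    """Compare the interevent-time distribution of an observed catalog against
    the pooled distribution from many synthetic catalogs. A small p-value
    means the observed catalog is distinguishable from the model's output."""
    obs_dt = np.diff(np.sort(np.asarray(observed_times, float)))
    syn_dt = np.concatenate([np.diff(np.sort(np.asarray(cat, float)))
                             for cat in synthetic_catalogs])
    statistic, p_value = ks_2samp(obs_dt, syn_dt)
    return statistic, p_value
```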
22. Distinguishing barriers and asperities in near-source ground motion
- Author
-
Page, Morgan T., Dunham, E. M., and Carlson, J. M.
- Subjects
barriers, ground motion, asperities, Physics::Geophysics
We investigate the ground motion produced by rupture propagation through circular barriers and asperities in an otherwise homogeneous earthquake rupture. Using a three-dimensional finite difference method, we analyze the effect of asperity radius, strength, and depth in a dynamic model with fixed rupture velocity. We gradually add complexity to the model, eventually approaching the behavior of a spontaneous dynamic rupture, to determine the origin of each feature in the ground motion. A barrier initially resists rupture, which induces rupture front curvature. These effects focus energy on and off the fault, leading to a concentrated pulse from the barrier region and higher velocities at the surface. Finally, we investigate the scaling laws in a spontaneous dynamic model. We find that dynamic stress drop determines fault-parallel static offset, while the time it takes the barrier to break is a measure of fracture energy. Thus, given sufficiently strong heterogeneity, the prestress and yield stress (relative to sliding friction) of the barrier can both be determined from ground motion measurements. In addition, we find that models with constraints on rupture velocity have less ground motion than constraint-free spontaneous dynamic models with equivalent stress drops. This suggests that kinematic models with such constraints overestimate the actual stress heterogeneity of earthquakes.
- Published
- 2005
23. A Spatiotemporal Clustering Model for the Third Uniform California Earthquake Rupture Forecast (UCERF3-ETAS): Toward an Operational Earthquake Forecast.
- Author
-
Field, Edward H., Milner, Kevin R., Hardebeck, Jeanne L., Page, Morgan T., van der Elst, Nicholas, Jordan, Thomas H., Michael, Andrew J., Shaw, Bruce E., and Werner, Maximilian J.
- Abstract
We, the ongoing Working Group on California Earthquake Probabilities, present a spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3), with the goal being to represent aftershocks, induced seismicity, and otherwise triggered events as a potential basis for operational earthquake forecasting (OEF). Specifically, we add an epidemic-type aftershock sequence (ETAS) component to the previously published time-independent and long-term time-dependent forecasts. This combined model, referred to as UCERF3-ETAS, collectively represents a relaxation of segmentation assumptions, the inclusion of multifault ruptures, an elastic-rebound model for fault-based ruptures, and a state-of-the-art spatiotemporal clustering component. It also represents an attempt to merge fault-based forecasts with statistical seismology models, such that information on fault proximity, activity rate, and time since last event are considered in OEF. We describe several unanticipated challenges that were encountered, including a need for elastic rebound and characteristic magnitude-frequency distributions (MFDs) on faults, both of which are required to get realistic triggering behavior. UCERF3-ETAS produces synthetic catalogs of M≥2.5 events, conditioned on any prior M≥2.5 events that are input to the model. We evaluate results with respect to both long-term (1000 year) simulations as well as for 10-year time periods following a variety of hypothetical scenario mainshocks. Although the results are very plausible, they are not always consistent with the simple notion that triggering probabilities should be greater if a mainshock is located near a fault. Important factors include whether the MFD near faults includes a significant characteristic earthquake component, as well as whether large triggered events can nucleate from within the rupture zone of the mainshock. Because UCERF3-ETAS has many sources of uncertainty, as will any subsequent version or competing model, potential usefulness needs to be considered in the context of actual applications. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
24. Potentially Induced Earthquakes during the Early Twentieth Century in the Los Angeles Basin.
- Author
-
Hough, Susan E. and Page, Morgan
- Abstract
Recent studies have presented evidence that early to mid-twentieth-century earthquakes in Oklahoma and Texas were likely induced by fossil fuel production and/or injection of wastewater (Hough and Page, 2015; Frohlich et al., 2016). Considering seismicity from 1935 onward, Hauksson et al. (2015) concluded that there is no evidence for significant induced activity in the greater Los Angeles region between 1935 and the present. To explore a possible association between earthquakes prior to 1935 and oil and gas production, we first revisit the historical catalog and then review contemporary oil industry activities. Although early industry activities did not induce large numbers of earthquakes, we present evidence for an association between the initial oil boom in the greater Los Angeles area and earthquakes between 1915 and 1932, including the damaging 22 June 1920 Inglewood and 8 July 1929 Whittier earthquakes. We further consider whether the 1933 Mw 6.4 Long Beach earthquake might have been induced, and show some evidence that points to a causative relationship between the earthquake and activities in the Huntington Beach oil field. The hypothesis that the Long Beach earthquake was either induced or triggered by a foreshock cannot be ruled out. Our results suggest that significant earthquakes in southern California during the early twentieth century might have been associated with industry practices that are no longer employed (i.e., production without water reinjection), and do not necessarily imply a high likelihood of induced earthquakes at the present time. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
25. PLETHODON ALBAGULA.
- Author
-
Page, Morgan and Murray, Alexander
- Abstract
The article reports the first observation of a spotless Plethodon albagula, a medium-sized lungless salamander typically found with light-colored spotting across its dorsal region, made in the Barton Creek Habitat Preserve in Austin, Texas, U.S.
- Published
- 2022
26. Three Ingredients for Improved Global Aftershock Forecasts: Tectonic Region, Time-Dependent Catalog Incompleteness, and Intersequence Variability.
- Author
-
Page, Morgan T., van der Elst, Nicholas, Hardebeck, Jeanne, Felzer, Karen, and Michael, Andrew J.
- Abstract
Following a large earthquake, seismic hazard can be orders of magnitude higher than the long-term average as a result of aftershock triggering. Because of this heightened hazard, emergency managers and the public demand rapid, authoritative, and reliable aftershock forecasts. In the past, U.S. Geological Survey (USGS) aftershock forecasts following large global earthquakes have been released on an ad hoc basis with inconsistent methods, and in some cases aftershock parameters adapted from California. To remedy this, the USGS is currently developing an automated aftershock product based on the Reasenberg and Jones (1989) method that will generate more accurate forecasts. To better capture spatial variations in aftershock productivity and decay, we estimate regional aftershock parameters for sequences within the García et al. (2012) tectonic regions. We find that regional variations for mean aftershock productivity reach almost a factor of 10. We also develop a method to account for the time-dependent magnitude of completeness following large events in the catalog. In addition to estimating average sequence parameters within regions, we develop an inverse method to estimate the intersequence parameter variability. This allows for a more complete quantification of the forecast uncertainties and Bayesian updating of the forecast as sequence-specific information becomes available. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
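One of the paper's "three ingredients" is the time-dependent magnitude of completeness after a large event. A commonly used empirical stand-in is the form of Helmstetter et al. (2006), sketched below; the paper develops its own treatment, so take this only as an illustration of the behavior being corrected for:

```python
import numpy as np

def completeness_magnitude(t_days, mainshock_mag, mc_baseline=2.5):
    """Time-dependent completeness after a mainshock, per Helmstetter et al.
    (2006): Mc(t) = M - 4.5 - 0.75 * log10(t in days), floored at the
    network's baseline completeness (mc_baseline is an assumed value)."""
    mc_t = mainshock_mag - 4.5 - 0.75 * np.log10(np.asarray(t_days, float))
    return np.maximum(mc_t, mc_baseline)
```

Fitting Omori-law parameters only to events above Mc(t) avoids the downward bias in aftershock productivity that early catalog incompleteness would otherwise introduce.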
27. Induced earthquake magnitudes are as large as (statistically) expected.
- Author
-
van der Elst, Nicholas J., Page, Morgan T., Weiser, Deborah A., Goebel, Thomas H. W., and Hosseini, S. Mehran
- Published
- 2016
- Full Text
- View/download PDF
28. A Century of Induced Earthquakes in Oklahoma?
- Author
-
Hough, Susan E. and Page, Morgan
- Subjects
INDUCED seismicity, EARTH movements, SEISMOLOGY measurements, STATISTICAL methods of seismometry, MATHEMATICAL seismology, STRUCTURAL geology
- Abstract
Seismicity rates have increased sharply since 2009 in the central and eastern United States, with especially high rates of activity in the state of Oklahoma. Growing evidence indicates that many of these events are induced, primarily by injection of wastewater in deep disposal wells. The upsurge in activity has raised two questions: What is the background rate of tectonic earthquakes in Oklahoma? How much has the rate varied throughout historical and early instrumental times? In this article, we show that (1) seismicity rates since 2009 surpass previously observed rates throughout the twentieth century; (2) several lines of evidence suggest that most of the significant earthquakes in Oklahoma during the twentieth century were likely induced by oil production activities, as they exhibit statistically significant temporal and spatial correspondence with disposal wells, and intensity measurements for the 1952 El Reno earthquake and possibly the 1956 Tulsa County earthquake follow the pattern observed in other induced earthquakes; and (3) there is evidence for a low level of tectonic seismicity in southeastern Oklahoma associated with the Ouachita structural belt. The 22 October 1882 Choctaw Nation earthquake, for which we estimate Mw 4.8, occurred in this zone. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
29. Southern San Andreas Fault Seismicity is Consistent with the Gutenberg-Richter Magnitude-Frequency Distribution.
- Author
-
Page, Morgan and Felzer, Karen
- Subjects
EARTHQUAKE magnitude, DISTRIBUTION (Probability theory), PALEOSEISMOLOGY, GEOLOGIC faults, EXTRAPOLATION
- Abstract
The magnitudes of any collection of earthquakes nucleating in a region are generally observed to follow the Gutenberg-Richter (GR) distribution. On some major faults, however, paleoseismic rates are higher than a GR extrapolation from the modern rate of small earthquakes would predict. This, along with other observations, led to the formulation of the characteristic earthquake hypothesis, which holds that the rate of small-to-moderate earthquakes is permanently low on large faults relative to the large-earthquake rate (Wesnousky et al., 1983; Schwartz and Coppersmith, 1984). We examine the rate difference between recent small-to-moderate earthquakes on the southern San Andreas fault (SSAF) and the paleoseismic record, hypothesizing that the discrepancy can be explained as a rate change in time rather than a deviation from GR statistics. We find that with reasonable assumptions, the rate changes necessary to bring the small and large earthquake rates into alignment agree with the size of rate changes seen in epidemic-type aftershock sequence modeling, where aftershock triggering of large earthquakes drives strong fluctuations in the seismicity rates for earthquakes of all magnitudes. The necessary rate changes are also comparable to rate changes observed for other faults worldwide. These results are consistent with paleoseismic observations of temporally clustered bursts of large earthquakes on the SSAF and the absence of M ≥7 earthquakes on the SSAF since 1857. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
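The comparison above rests on extrapolating the Gutenberg-Richter relation from the modern rate of small earthquakes up to large magnitudes. A sketch with invented numbers (not the paper's values) shows the shape of that calculation:

```python
def gr_annual_rate(m, a, b=1.0):
    """Annual rate of events with magnitude >= m under Gutenberg-Richter:
    log10 N(>=m) = a - b*m."""
    return 10.0 ** (a - b * m)

# Hypothetical: if small events near a fault imply a = 4.0, GR extrapolation
# gives one M>=7.5 event per ~3,200 years. A paleoseismic rate of, say, one
# per ~200 years then requires either non-GR (characteristic) behavior or a
# seismicity rate that varies strongly in time -- the alternative this
# paper argues for.
rate_m75 = gr_annual_rate(7.5, a=4.0)  # ~3.2e-4 events per year
```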
30. Long-Term Time-Dependent Probabilities for the Third Uniform California Earthquake Rupture Forecast (UCERF3).
- Author
-
Field, Edward H., Biasi, Glenn P., Bird, Peter, Dawson, Timothy E., Felzer, Karen R., Jackson, David D., Johnson, Kaj M., Jordan, Thomas H., Madden, Christopher, Michael, Andrew J., Milner, Kevin R., Page, Morgan T., Parsons, Tom, Powers, Peter M., Shaw, Bruce E., Thatcher, Wayne R., Weldon II, Ray J., and Zeng, Yuehua
- Subjects
SURFACE fault ruptures, EARTHQUAKE prediction, PROBABILITY theory, ELASTIC waves, SENSITIVITY analysis
- Abstract
The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for unsegmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30 yr M ≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault-slip rates), with relaxation of segmentation and inclusion of multifault ruptures being particularly influential. In fact, some UCERF2 faults were simply too short to produce M 6.7 size events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region and depend on the evaluation metric of interest. For example, M ≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the approximate nature of the model and known limitations, means the applicability of UCERF3 should be evaluated on a case-by-case basis. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
31. Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3) --The Time-Independent Model.
- Author
-
Field, Edward H., Arrowsmith, Ramon J., Biasi, Glenn P., Bird, Peter, Dawson, Timothy E., Felzer, Karen R., Jackson, David D., Johnson, Kaj M., Jordan, Thomas H., Madden, Christopher, Michael, Andrew J., Milner, Kevin R., Page, Morgan T., Parsons, Tom, Powers, Peter M., Shaw, Bruce E., Thatcher, Wayne R., Weldon II, Ray J., and Zeng, Yuehua
- Subjects
EARTHQUAKE prediction, SURFACE fault ruptures, SIMULATED annealing, SEISMIC event location, GEOLOGIC faults, EARTHQUAKE magnitude
- Abstract
The 2014 Working Group on California Earthquake Probabilities (WGCEP14) presents the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation and include multifault ruptures, both limitations of UCERF2. The rates of all earthquakes are solved for simultaneously and from a broader range of data, using a system-level inversion that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (e.g., magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1440 alternative logic-tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of Mw >5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded due to lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (e.g., constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of M 6.5-7 earthquake rates and also includes types of multifault ruptures seen in nature. Although UCERF3 fits the data better than UCERF2 overall, there may be areas that warrant further site-specific investigation. Supporting products may be of general interest, and we list key assumptions and avenues for future model improvements. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
32. The UCERF3 Grand Inversion: Solving for the Long-Term Rate of Ruptures in a Fault System.
- Author
-
Page, Morgan T., Field, Edward H., Milner, Kevin R., and Powers, Peter M.
- Subjects
GEOLOGIC faults, SURFACE fault ruptures, EARTHQUAKE hazard analysis, INVERSIONS (Geology), PALEOSEISMOLOGY, EARTHQUAKE magnitude, SIMULATED annealing
- Abstract
We present implementation details, testing, and results from a new inversion-based methodology, known colloquially as the "grand inversion," developed for the Uniform California Earthquake Rupture Forecast (UCERF3). We employ a parallel simulated annealing algorithm to solve for the long-term rate of all ruptures that extend through the seismogenic thickness on major mapped faults in California while simultaneously satisfying available slip-rate, paleoseismic event-rate, and magnitude-distribution constraints. The inversion methodology enables the relaxation of fault segmentation and allows for the incorporation of multifault ruptures, which are needed to remove magnitude-distribution misfits that were present in the previous model, UCERF2. The grand inversion is more objective than past methodologies, as it eliminates the need to prescriptively assign rupture rates. It also provides a means to easily update the model as new data become available. In addition to UCERF3 model results, we present verification of the grand inversion, including sensitivity tests, tuning of equation set weights, convergence metrics, and a synthetic test. These tests demonstrate that while individual rupture rates are poorly resolved by the data, integrated quantities such as magnitude-frequency distributions and, most importantly, hazard metrics, are much more robust. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
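The grand inversion solves for nonnegative long-term rupture rates that simultaneously satisfy slip-rate, paleoseismic, and magnitude-distribution constraints. UCERF3 does this with parallel simulated annealing on a very large, regularized system; the toy below substitutes nonnegative least squares on a three-section fault purely to show the structure of the problem, with all numbers invented:

```python
import numpy as np
from scipy.optimize import nnls

# Rows = fault sections, columns = candidate ruptures. Entry (i, j) is the
# average slip (m) that rupture j imposes on section i, so G @ rates should
# reproduce each section's long-term slip rate (m/yr).
G = np.array([[1.0, 0.0, 1.2, 1.5],
              [0.0, 1.0, 1.2, 1.5],
              [0.0, 0.0, 0.0, 1.5]])  # rupture 4 breaks all three sections
slip_rates = np.array([0.005, 0.004, 0.002])  # m/yr per section

# Long-term rupture rates (1/yr), constrained nonnegative.
rates, misfit = nnls(G, slip_rates)
```

As the abstract notes, individual rupture rates from such an underdetermined system are poorly resolved; it is integrated quantities (magnitude-frequency distributions, hazard metrics) that are robust.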
33. Possible Earthquake Rupture Connections on Mapped California Faults Ranked by Calculated Coulomb Linking Stresses.
- Author
-
Parsons, Tom, Field, Edward H., Page, Morgan T., and Milner, Kevin
- Subjects
SURFACE fault ruptures, EARTHQUAKE hazard analysis, STRAINS & stresses (Mechanics), DYNAMIC simulation, EARTHQUAKE magnitude
- Abstract
Probabilistic seismic hazard assessment requires an increasingly broad compilation of earthquake sources. Fault systems are often divided into characteristic ruptures based on geometric features such as bends or steps, though events such as the 2002 M 7.9 Denali and 2011 M 9.0 Tohoku-Oki earthquakes raise the possibility that earthquakes can involve subsidiary faults and/or rupture through identified geometric barriers. Here we introduce a method to discriminate among a wide range of possible earthquakes within a large fault system and to quantify the probability of a rupture passing through a bend or step. We note that many of the conditions favoring earthquake rupture propagation can be simulated using a static Coulomb stress change approximation. Such an approach lacks inertial effects inherent in a full dynamic simulation but does capture many of the empirical observations drawn from examining past ruptures, such as continuity of rake and strike, as well as distance across gaps or stepovers. We make calculations for a test region in northern California and find that the method provides a quantitative basis for ranking possible ruptures within localized fault systems. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
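The static Coulomb stress approximation named above is a one-line calculation on a receiver fault; the effective friction coefficient below is a conventional choice, not necessarily the value used in this study:

```python
def coulomb_stress_change(delta_shear, delta_normal, mu_eff=0.4):
    """Static Coulomb failure stress change: dCFS = d(tau) + mu' * d(sigma_n),
    with the shear stress change resolved in the receiver fault's slip
    direction and the normal stress change positive for unclamping. Candidate
    rupture jumps can then be ranked by this linking stress. Note that sign
    conventions vary between authors."""
    return delta_shear + mu_eff * delta_normal
```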
34. The magnitude distribution of earthquakes near Southern California faults.
- Author
-
Page, Morgan T., Alderson, David, and Doyle, John
- Published
- 2011
- Full Text
- View/download PDF
35. Toward a consistent model for strain accrual and release for the New Madrid Seismic Zone, central United States.
- Author
-
Hough, Susan E. and Page, Morgan
- Published
- 2011
- Full Text
- View/download PDF
36. Estimating Earthquake-Rupture Rates on a Fault or Fault System.
- Author
-
Field, Edward H. and Page, Morgan T.
- Subjects
EARTHQUAKES, INVERSION (Geophysics), PALEOSEISMOLOGY, METHODOLOGY, GEOLOGIC faults
- Abstract
Previous approaches used to determine the rates of different earthquakes on a fault have made assumptions regarding segmentation, have been difficult to document and reproduce, and have lacked the ability to satisfy all available data constraints. We present a relatively objective and reproducible inverse methodology for determining the rate of different ruptures on a fault or fault system. The data used in the inversion include slip rate, event rate, and other constraints such as an optional a priori magnitude-frequency distribution. We demonstrate our methodology by solving for the long-term rate of ruptures on the southern San Andreas fault. Our results imply that a Gutenberg-Richter distribution is consistent with the data available for this fault; however, more work is needed to test the robustness of this assertion. More importantly, the methodology is extensible to an entire fault system (thereby including multifault ruptures) and can be used to quantify the relative benefits of collecting additional paleoseismic data at different sites. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
37. Constraining earthquake source inversions with GPS data: 2. A two-step approach to combine seismic and geodetic data sets.
- Author
-
Custódio, Susana, Page, Morgan T., and Archuleta, Ralph J.
- Published
- 2009
- Full Text
- View/download PDF
38. Constraining earthquake source inversions with GPS data: 1. Resolution-based removal of artifacts.
- Author
-
Page, Morgan T., Custódio, Susana, Archuleta, Ralph J., and Carlson, J. M.
- Published
- 2009
- Full Text
- View/download PDF
39. Effects of Large-Scale Surface Topography on Ground Motions, as Demonstrated by a Study of the San Gabriel Mountains, Los Angeles, California.
- Author
-
Ma, Shuo, Archuleta, Ralph J., and Page, Morgan T.
- Subjects
EARTH movements, EARTHQUAKES, GEOLOGIC faults, SURFACE waves (Fluids), SEISMIC wave velocity
- Abstract
We investigate the effects of large-scale surface topography on ground motions generated by nearby faulting. We show a specific example studying the effect of the San Gabriel Mountains, which are bounded by the Mojave segment of the San Andreas fault on the north and by the Los Angeles Basin on the south. By simulating an Mw 7.5 earthquake on the Mojave segment of the San Andreas fault, we show that the San Gabriel Mountains act as a natural seismic insulator for metropolitan Los Angeles. The topography of the mountains scatters the surface waves generated by the rupture on the San Andreas fault, leading to less-efficient excitation of basin-edge generated waves and natural resonances within the Los Angeles Basin. The effect of the mountains reduces the peak amplitude of ground velocity for some regions in the basin by as much as 50% in the frequency band up to 0.5 Hz. These results suggest that, depending on the relative location of faulting and the nearby large-scale topography, the topography can shield some areas from ground shaking. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
40. Methodologies for Earthquake Hazard Assessment: Model Uncertainty and the WGCEP-2002 Forecast.
- Author
-
Page, Morgan T. and Carlson, J. M.
- Subjects
EARTHQUAKE hazard analysis, EARTHQUAKE engineering, CIVIL engineering, ENGINEERING geology
- Abstract
Model uncertainty is prevalent in probabilistic seismic hazard analysis (PSHA) because the underlying statistical signatures for hazard are unknown. Although methods for incorporating parameter uncertainty of a particular model in PSHA are well understood, methods for incorporating model uncertainty are more difficult to implement because of the high degree of dependence between different earthquake-recurrence models. We show that the method used by the 2002 Working Group on California Earthquake Probabilities (WGCEP-2002) to combine the probability distributions given by multiple earthquake-recurrence models has several adverse effects on their results. In particular, WGCEP-2002 uses a linear combination of the models that ignores model dependence and leads to large uncertainty in the final hazard estimate. Furthermore, model weights were chosen based on data, which has the potential to systematically bias the final probability distribution. The weighting scheme used in the Working Group report also produces results that depend on an arbitrary ordering of models. In addition to analyzing current statistical problems, we present alternative methods for rigorously incorporating model uncertainty into PSHA. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
41. Distinguishing barriers and asperities in near-source ground motion.
- Author
-
Page, Morgan T., Dunham, Eric M., and Carlson, J. M.
- Published
- 2005
- Full Text
- View/download PDF
42. HEADSCAN.
- Author
-
Page, Morgan, Donna, Donna M., and Sinopoli, Dayna
- Published
- 2019
43. Artificial seismic acceleration.
- Author
-
Felzer, Karen R., Page, Morgan T., and Michael, Andrew J.
- Subjects
EARTHQUAKES, SEISMOLOGY
- Abstract
A letter to the editor is presented in response to a 2013 article regarding a significant acceleration of seismicity before magnitude 6.5 mainshock earthquakes that occur in interplate regions.
- Published
- 2015
- Full Text
- View/download PDF
44. A multifault earthquake threat for the Seattle metropolitan region revealed by mass tree mortality.
- Author
-
Black, Bryan A., Pearl, Jessie K., Pearson, Charlotte L., Pringle, Patrick T., Frank, David C., Page, Morgan T., Buckley, Brendan M., Cook, Edward R., Harley, Grant L., King, Karen J., Hughes, Jonathan F., Reynolds, David J., and Sherrod, Brian L.
- Subjects
TREE mortality, EARTHQUAKES
- Abstract
The article highlights the challenges in identifying compound earthquakes resulting from simultaneous ruptures of multiple fault zones, given the lack of historical benchmarks and dating uncertainties in geological records. It discusses a study in the Puget Sound region of western Washington, USA, where dendrochronological analysis of earthquake-killed trees helped precisely determine the timing of an earthquake cluster approximately 1100 years ago, shedding light on the region's fault dynamics and seismic risks.
- Published
- 2023
- Full Text
- View/download PDF
45. Testing Earthquake Source Inversion Methodologies.
- Author
-
Page, Morgan, Mai, P. Martin, and Schorlemmer, Danijel
- Published
- 2011
- Full Text
- View/download PDF
46. Breaking Badly : Forecasting California Earthquakes
- Author
-
Page, Morgan
- Abstract
In this video, Morgan Page, Research Geophysicist at the USGS, discusses how scientists cannot currently predict the precise time, location and size of future damaging earthquakes.
- Published
- 2015
47. The earthquake-source inversion Validation (SIV) project
- Author
-
Mai, P. Martin, Schorlemmer, Danijel, Page, Morgan T., Ampuero, Jean-Paul, Asano, Kimiyuki, Causse, Mathieu, Custódio, Susana, Fan, Wenyuan, Festa, Gaetano, Galis, Martin, Gallovič, František, Imperatori, Walter, Käser, Martin, Malytskyy, Dmytro, Okuwaki, Ryo, Pollitz, Fred F., Passone, Luca, Razafindrakoto, Hoby N. T., Sekiguchi, Haruko, Song, Seok Goo, Somala, Surendra Nadh, Thingbaijam, Kiran K. S., Twardzik, Cedric, van Driel, Martin, Vyas, Jagdish C., Wang, Rongjiang, Yagi, Yuji, and Zielke, Olaf
- Subjects
Source inversion, Inversion methods, Imaging problem, Slip-rate function, Source-receiver geometries, Input and inverted rupture models, Quantitative waveform comparisons, Collaborative software
Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward-modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source-model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake-source imaging problem.
- Published
- 2016
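The SIV benchmarks hinge on quantitative waveform comparison between data predicted by inverted source models and the true (benchmark) solution. A minimal sketch of two common metrics; the SIV platform's actual metric suite is broader and operates on full seismogram sets:

```python
import numpy as np

def waveform_metrics(obs, syn):
    """Normalized L2 misfit (0 is perfect) and zero-lag correlation
    coefficient (1 is perfect) between two equally sampled waveforms."""
    obs, syn = np.asarray(obs, float), np.asarray(syn, float)
    l2 = np.sum((obs - syn) ** 2) / np.sum(obs ** 2)
    cc = np.dot(obs, syn) / (np.linalg.norm(obs) * np.linalg.norm(syn))
    return l2, cc
```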