57 results for "Steven N. Ward"
Search Results
2. Tsunami Squares modelling of the 2015 June 24 Hongyanzi landslide generated river tsunami in Three Gorges Reservoir, China
- Author
-
Steven N. Ward, Lili Xiao, and Jiajia Wang
- Subjects
Hydrology, Geophysics, Geochemistry and Petrology, Landslide, China, Geology, Three Gorges - Published
- 2018
- Full Text
- View/download PDF
3. Correction: A Neolithic Mega-Tsunami Event in the Eastern Mediterranean: Prehistoric Settlement Vulnerability Along the Carmel Coast, Israel
- Author
-
Michael Lazar, Ehud Arkin-Shalev, Thomas E. Levy, Katrina Cantu, Steven N. Ward, Tammy M. Rittenour, Richard D. Norris, Assaf Yasur-Landau, Gilad Shtienberg, Omri Gadol, Anthony Tamberino, and Peter F. Biehl
- Subjects
Topography, Geologic Sediments, History, Luminescence, Optically Stimulated Luminescence, Stone Age, Social Sciences, Marine and Aquatic Sciences, Wetland, Israel, Holocene, Ancient History, Seismology, Luminescence Dating, Sedimentary Geology, Quaternary Period, Multidisciplinary, Geology, Geophysics, Archaeology, Neolithic Period, Tsunamis, Physical Sciences, Research Article, Freshwater Environments, General Science & Technology, Biophysics, Prehistory, Dosimetry, Humans, Petrology, Landforms, Holocene Epoch, Continental Shelf, Ecology and Environmental Sciences, Correction, Aquatic Environments, Biology and Life Sciences, Geologic Time, Geomorphology, Debris, Archaeological Dating, Wetlands, Luminescent Measurements, Earth Sciences, Cenozoic Era, Sediment, Chronology - Abstract
Tsunami events in antiquity had a profound influence on coastal societies. Six thousand years of historical records and geological data show that tsunamis are a common phenomenon affecting the eastern Mediterranean coastline. However, the possible impact of older tsunamis on prehistoric societies has not been investigated. Here we report, based on optically stimulated luminescence chronology, the earliest documented Holocene tsunami event, between 9.91 and 9.29 ka (kilo-annum), from the eastern Mediterranean at Dor, Israel. Tsunami debris from the early Neolithic is composed of marine sand embedded within fresh-brackish wetland deposits. Global and local sea-level curves for the period, 9.91–9.29 ka, as well as surface elevation reconstructions, show that the tsunami had a run-up of at least ~16 m and traveled between 1.5 and 3.5 km inland from the palaeo-coastline. Submerged slump scars on the continental slope, 16 km west of Dor, point to the nearby “Dor-complex” as a likely cause. The near absence of Pre-Pottery Neolithic A-B archaeological sites (11.70–9.80 cal. ka) suggests these sites were removed by the tsunami, whereas younger, late Pre-Pottery Neolithic B-C (9.25–8.35 cal. ka) and later Pottery-Neolithic sites (8.25–7.80 cal. ka) indicate resettlement following the event. The large run-up of this event highlights the disruptive impact of tsunamis on past societies along the Levantine coast.
- Published
- 2020
4. Numerical modelling of rapid, flow-like landslides across 3-D terrains: a Tsunami Squares approach to El Picacho landslide, El Salvador, September 19, 1982
- Author
-
Jiajia Wang, Lili Xiao, and Steven N. Ward
- Subjects
Geophysics, Flow, Geomechanics, Geochemistry and Petrology, Landslide, Terrain, Seismology, Geology - Published
- 2015
- Full Text
- View/download PDF
5. Numerical simulation of the December 4, 2007 landslide-generated tsunami in Chehalis Lake, Canada
- Author
-
Lili Xiao, Jiajia Wang, and Steven N. Ward
- Subjects
Geophysics, Computer simulation, Geochemistry and Petrology, Landslide, Geomorphology, Geology, Seismology - Abstract
On December 4, 2007, a three-million-cubic-metre landslide impacted Chehalis Lake, 80 km east of Vancouver, Canada. The failed mass rushed into the lake and parented a tsunami that ran up 38 m on the opposite shore, destroying trees, roads and campsite facilities. Armed with field surveys and multiple high-tech observations from SONAR, LiDAR and orthophotographs, we apply the newly developed ‘Tsunami Squares’ method to simulate the Chehalis Lake landslide and its generated tsunami. The landslide simulation shows a progressive failure, flow speeds up to ∼60 m s⁻¹, and a slide mass stoppage with uniform repose angle on the lakebed. Tsunami products suggest that landslide velocity and spatial scale influence the initial wave size, while wave energy decay and inundation heights are affected by a combination of distance to the landslide, bathymetry and shoreline orientation relative to the wave direction.
- Published
- 2015
- Full Text
- View/download PDF
6. Tsunami Squares Approach to Landslide-Generated Waves: Application to Gongjiafang Landslide, Three Gorges Reservoir, China
- Author
-
Lili Xiao, Jiajia Wang, and Steven N. Ward
- Subjects
Geophysics, Geochemistry and Petrology, Direct observation, Numerical modeling, Landslide, Geology, Seismology, Three Gorges - Abstract
We have developed a new method, named “Tsunami Squares”, for modeling landslides and landslide-generated waves. The approach retains the advantages of the previous “Tsunami Ball” method (for example, no separate, special treatment for dry and wet cells is needed) but obviates the use of millions of individual particles. Simulations can now be expanded to spatial scales not previously possible. The new method accelerates and transports “squares” of material that are fractured into new squares in such a way as to conserve volume and linear momentum. The simulation first generates landslide motion as constrained by direct observation. It then computes induced water waves, given assumptions about energy and momentum transfer. We demonstrate and validate the Tsunami Squares method by modeling the 2008 Three Gorges Reservoir Gongjiafang landslide and river tsunami. The landslide’s progressive failure, the wave generated, and its subsequent propagation and run-up are well reproduced. On a laptop computer, Tsunami Squares simulations flexibly handle a wide variety of waves and flows, making them an excellent technique for risk estimation.
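The volume- and momentum-conserving fracture step described above can be illustrated with a one-dimensional toy (names and numbers are my own illustration, not the authors' code): a transported square that straddles several grid cells is split into child squares, each inheriting the parent velocity and a volume proportional to its overlap with a cell.

```python
# Toy 1-D "Tsunami Squares" fracture step (illustrative sketch only):
# a moving square of material is split across the grid cells it overlaps
# so that total volume and total linear momentum are both conserved.

def fracture(x_left, width, volume, velocity, dx):
    """Split a square [x_left, x_left + width) into per-cell child squares.

    Each child gets volume proportional to its overlap with a grid cell
    of size dx and inherits the parent velocity, so the summed volume and
    the summed momentum (volume * velocity) are unchanged.
    """
    children = []
    x, end = x_left, x_left + width
    while x < end:
        cell_edge = (int(x // dx) + 1) * dx      # right edge of current cell
        seg = min(cell_edge, end) - x            # overlap with this cell
        children.append({"volume": volume * seg / width,
                         "velocity": velocity})
        x = min(cell_edge, end)
    return children

kids = fracture(x_left=0.7, width=1.0, volume=12.0, velocity=3.0, dx=1.0)
total_volume = sum(k["volume"] for k in kids)
total_momentum = sum(k["volume"] * k["velocity"] for k in kids)
# the parent's 12.0 units of volume and 36.0 of momentum are preserved
```

Here the parent square straddles two cells, so it fractures into two children of volume 3.6 and 8.4; conservation holds by construction because each child keeps the parent velocity.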
- Published
- 2015
- Full Text
- View/download PDF
7. Reverberations on the Watery Element: A Significant, Tsunamigenic Historical Earthquake Offshore the Carolina Coast
- Author
-
Jeffrey W. Munsey, Susan E. Hough, and Steven N. Ward
- Subjects
Travel time, South Carolina, Geophysics, Oceanography, Tsunami wave, Intraplate earthquake, Magnitude, Tsunami earthquake, Bay, Geology, Seismology - Abstract
We investigate an early nineteenth‐century earthquake that has been previously cataloged but not previously investigated in detail or recognized as a significant event. The earthquake struck at approximately 4:30 a.m. LT on 8 January 1817 and was widely felt throughout the southeastern and mid‐Atlantic United States. Around 11:00 a.m. the same day, an eyewitness described a 12‐inch tide that rose abruptly and agitated boats on the Delaware River near Philadelphia. We show that the timing of this tide is consistent with the predicted travel time for a tsunami generated by an offshore earthquake 6–7 hours earlier. By combining constraints provided by the shaking intensity distribution and the tsunami observation, we conclude that the 1817 earthquake had a magnitude of low‐ to mid‐ M 7 and a location 800–1000 km offshore of South Carolina. Our results suggest that poorly understood offshore source zones might represent a previously unrecognized hazard to the southern and mid‐Atlantic coast. Both observational and modeling results indicate that potential tsunami hazard within Delaware Bay merits consideration: the simple geometry of the bay appears to catch and focus tsunami waves. Our preferred location for the 1817 earthquake is along a diffuse northeast‐trending zone defined by instrumentally recorded and historical earthquakes. The seismotectonic framework for this region remains enigmatic.
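The travel-time argument can be checked on the back of an envelope with the shallow-water wave speed c = sqrt(g * h). The path segmentation and depths below are illustrative round numbers of my own, not values from the paper:

```python
import math

# Back-of-envelope check that a tsunami from a source 800-1000 km offshore
# of South Carolina could reach the Delaware River roughly 6-7 hours after
# the shaking.  Long waves travel at c = sqrt(g * h); the segments and
# depths below are assumed round numbers, not the paper's values.

g = 9.81  # m/s^2
# (segment length in km, representative depth in m)
path = [(900.0, 4000.0),   # deep ocean, source region to shelf edge
        (300.0, 1000.0),   # continental slope and outer shelf
        (250.0, 30.0)]     # shallow shelf and the run up Delaware Bay

hours = sum(L * 1e3 / math.sqrt(g * h) for L, h in path) / 3600.0
# comes out near 6 hours, consistent with the reported 6-7 hour delay
```

The shallow final leg dominates the total: the wave slows from ~200 m/s in the deep ocean to ~17 m/s over a 30 m deep shelf, which is why a bay approach adds hours.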
- Published
- 2013
- Full Text
- View/download PDF
8. A Comparison among Observations and Earthquake Simulator Results for the allcal2 California Fault Model
- Author
-
M. K. Sachs, E. M. Heien, Fred F. Pollitz, M. Burak Yikilmaz, Terry E. Tullis, Donald L. Turcotte, Steven N. Ward, Louise H. Kellogg, John B. Rundle, Keith Richards-Dinger, Edward H. Field, Michael Barall, and James H. Dieterich
- Subjects
Engineering, Geophysics, Earthquake simulation, Earthquake hazard, Probability distribution, Covariance, Fault model, Scaling, Seismology, Simulation - Abstract
Online Material: Supplemental figures of space‐time and frequency‐magnitude relations, scaling plots, mean and covariance plots of interevent times, probability distribution functions of recurrence intervals, and earthquake density plots. In order to understand earthquake hazards we would ideally have a statistical description of earthquakes for tens of thousands of years. Unfortunately, the ∼100‐year instrumental, several-hundred-year historical, and few-thousand-year paleoseismological records are woefully inadequate to provide a statistically significant record. Physics‐based earthquake simulators can generate arbitrarily long histories of earthquakes; thus they can provide a statistically meaningful history of simulated earthquakes. The question is, how realistic are these simulated histories? The purpose of this paper is to begin to answer that question. We compare the results between different simulators and with information that is known from the limited instrumental, historic, and paleoseismological data. As expected, the results from all the simulators show that the observational record is too short to properly represent the system behavior; therefore, although tests of the simulators against the limited observations are necessary, they are not a sufficient test of the simulators’ realism. The simulators appear to pass this necessary test. In addition, the physics‐based simulators show similar behavior even though there are large differences in the methodology. This suggests that they represent realistic behavior. Different assumptions concerning the constitutive properties of the faults do result in enhanced capabilities of some simulators. However, it appears that the similar behavior of the different simulators may result from the fault‐system geometry, slip rates, and assumed strength drops, along with the shared physics of stress transfer. This paper describes the results of running four earthquake simulators that are described elsewhere in …
- Published
- 2012
- Full Text
- View/download PDF
9. THE 1958 LITUYA BAY LANDSLIDE AND TSUNAMI — A TSUNAMI BALL APPROACH
- Author
-
Steven N. Ward and Simon Day
- Subjects
Splash, Advection, Glacier, Landslide, Rockslide, Geotechnical Engineering and Engineering Geology, Oceanography, Inlet, Bathymetric chart, Geophysics, Seismology, Geology - Abstract
Many analyses of tsunami generation and inundation solve equations of continuity and momentum on fixed finite difference/finite element meshes. We develop a new approach that uses a momentum equation to accelerate bits or balls of water over variable depth topography. The thickness of the water column at any point equals the volume density of balls there. The new approach has several advantages over traditional methods: (1) by tracking water balls of fixed volume, the continuity equation is satisfied automatically and the advection term in the momentum equation becomes unnecessary; (2) the procedure is meshless in the finite difference/finite element sense; (3) tsunami balls care little if they find themselves in the ocean or inundating land. We demonstrate and validate the tsunami ball method by simulating the 1958 Lituya Bay landslide and tsunami. We find that a rockslide of dimension and volume (3–6 × 10⁷ m³) generally consistent with observations can indeed tumble from 200–900 m height on the east slope of Gilbert Inlet, splash water up to ~500 m on the western slope, and make an impressive tsunami running down the length of the fiord. A closer examination of eyewitness accounts and trimline maps, however, finds a "rockslide only" tsunami somewhat lacking in size outside of Gilbert Inlet. This discrepancy, coupled with the fact that ~3 × 10⁸ m³ of sediment infilled the deepest parts of Lituya Bay between 1926 and 1959, suggests that the source of the 1958 tsunami was not one landslide, but two. The initial rockslide generated the famous big splash and cratered the floor in front of Lituya Glacier. We propose that the impact of the rockslide destabilized the foundation of the glacier and triggered a second, larger but slower-moving subglacial slide. The subglacial slide induced the fresh normal faults on the collapsed glacier above, helped to bulk up the rockslide tsunami outside of Gilbert Inlet, and supplied most of the infill evident in post-1958 bathymetric charts.
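The bookkeeping behind the tsunami ball idea can be sketched in one dimension (my own toy, not the authors' code): the surface height in each cell is the volume density of balls there, and the only dynamics is a momentum update from the local surface slope. A periodic domain and a Gaussian initial hump are assumed for simplicity.

```python
import random

# Minimal 1-D "tsunami ball" step (illustrative sketch): each ball carries
# a fixed volume, so continuity is automatic; the water surface eta is the
# volume density of balls, and each ball accelerates by a = -g * d(eta)/dx.

g, dx, nx = 9.81, 100.0, 50          # gravity, cell size (m), cell count
ball_vol = 50.0                      # volume per ball (m^2 in 1-D)
random.seed(1)
# pile the balls into a Gaussian hump centred mid-domain
balls = [{"x": (2500.0 + random.gauss(0.0, 400.0)) % (nx * dx), "v": 0.0}
         for _ in range(2000)]

def surface(balls):
    """Water thickness per cell = (balls in cell) * ball_vol / dx."""
    eta = [0.0] * nx
    for b in balls:
        eta[int(b["x"] // dx) % nx] += ball_vol / dx
    return eta

def step(balls, dt=1.0):
    eta = surface(balls)
    for b in balls:
        i = int(b["x"] // dx) % nx
        slope = (eta[(i + 1) % nx] - eta[i - 1]) / (2.0 * dx)  # periodic
        b["v"] += -g * slope * dt                # momentum equation
        b["x"] = (b["x"] + b["v"] * dt) % (nx * dx)

before = surface(balls)
for _ in range(20):
    step(balls)
after = surface(balls)
# total water volume is conserved exactly; the hump begins to spread
```

There is no continuity equation to solve and no wet/dry bookkeeping: cells with no balls simply have zero thickness.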
- Published
- 2010
- Full Text
- View/download PDF
10. Tsunami Hazard Evaluation of the Eastern Mediterranean: Historical Analysis and Selected Modeling
- Author
-
Steven N. Ward, Thomas K. Rockwell, Emanuela Guidoboni, Amos Salamon, and Alberto Comastri
- Subjects
Eastern Mediterranean, Dead Sea, Geophysics, Geochemistry and Petrology, Tsunami hazard, Active fault, Seismology, Geology - Abstract
Seismic sea waves in the eastern Mediterranean have been reported since written history first emerged several thousand years ago. We collected and investigated these ancient and modern reports to understand and model the typical tsunamigenic sources, with the ultimate purpose of characterizing tsunami hazard along the Levant coasts. Surprisingly, only 35% of the tsunami reports could be traced back to primary sources, with the balance remaining questionable. The tsunamis varied in size, from barely noticeable to greatly damaging, and their effects ranged from local to regional. Overall, we list 21 reliably reported tsunamis that occurred since the mid-second century B.C. along the Levant coast, along with 57 significant historical earthquakes that originated from the “local” continental Dead Sea Transform (DST) system. An in-depth evaluation shows that 10 tsunamis are clearly associated with on-land DST earthquakes, and therefore, as formerly suggested, they probably originated from offshore, seismogenically induced slumps. Eight tsunamis arrived from the “remote” Hellenic and Cypriot Arcs, one from Italy, and two are left with as yet unrecognized sources. A major conclusion from this work is that onshore earthquakes commonly produce tsunamis along the Levant coastline, and that analogous situations are present elsewhere in the Mediterranean, as well as along the California coast and in other regions with active faults near the coast. We modeled three typical scenarios, and in light of the Sumatra experience, we examined the more likely severe magnitudes. This of course leads us toward the upper range of expected run-ups. The models show that less than five minutes after a strong earthquake produces an offshore slump, which occurs after close to a third of the large DST earthquakes, a 4- to 6-m run-up may flood part of the Syrian, Lebanese, and Israeli coasts. Tsunamis from remote earthquakes, however, arrive later and produce only 1- to 3-m run-ups, but are more regional in extent. Online material: Tsunami modeling and reports.
- Published
- 2007
- Full Text
- View/download PDF
11. Methods for Evaluating Earthquake Potential and Likelihood in and around California
- Author
-
Steven N. Ward
- Subjects
Seismic gap, Earthquake scenario, Geophysics, Seismic microzonation, Seismic hazard, Earthquake simulation, Earthquake prediction, Urban seismic risk, Induced seismicity, Geodesy, Geology, Seismology - Abstract
From the outset, the vision of the Regional Earthquake Likelihood Models (RELM) project recognized that the best way to come to grips with the full impact and uncertainty in earthquake hazard estimates is to compare a wide range of independent, well-documented, and physically defensible hazard models that produce identically formatted output. Ideally, these models should be rooted in the complete spectrum of geophysical input. Toward this end, I offer testable earthquake potential maps based on geodesy, geology, historical seismicity, and computer simulations of earthquakes. Motivation. Until recently, earthquake rate estimation was entirely the domain of geologists and seismologists. With well-defined faults and sufficiently frequent earthquakes, geology, historical seismicity, and paleoseismology can furnish fairly reliable earthquake statistics. More commonly, questionable fault geometries, fault slip rates, fault rupture modes, and scattered seismicity characterize the situation, and earthquake statistics do not reveal themselves readily. For much of the world, historical seismicity and paleoseismology cannot constrain earthquake statistics to the degree necessary for an acceptable rate assessment. Today, information from space geodesy patches some of these voids. Space geodetic monitoring quantifies potential earthquake activity within a network even if that activity occurs on faults that are unknown, too slowly slipping, or too deep to study by conventional geological or seismological techniques. Geodesy's most valuable contributions in this arena spring from its ability to: Technical description. Geodetic earthquake potential maps require few inputs. This feature is both the beauty and the value of the approach. The steps in computing the maps include: 1. Compile a GPS velocity map for …
- Published
- 2007
- Full Text
- View/download PDF
12. Particulate kinematic simulations of debris avalanches: interpretation of deposits and landslide seismic signals of Mount Saint Helens, 1980 May 18
- Author
-
Steven N. Ward and Simon Day
- Subjects
Bedrock, Flow, Landslide, Geophysics, Kinematics, Debris, Acceleration, Geochemistry and Petrology, Dynamical friction, Net force, Geology, Seismology - Abstract
SUMMARY We construct a new class of granular landslide models in which avalanches are simulated with large numbers of independent particles moving under the influence of topographically derived gravitational and centripetal acceleration. Concurrently, the particles suffer deceleration due to basal and dynamic friction. The novel aspect of the calculation is that complex particle-to-particle interactions, fluctuating basal contacts, and unresolved topographic roughness within and below the deforming flow are mimicked by random perturbations in along-track and cross-slope acceleration. We apply the method to the 1980 May 18 Mount Saint Helens debris avalanche by constraining the initial geometry and structure of the slide mass from geological data, and the initial failure sequence from eyewitness accounts. After tuning coefficients of mechanical friction and random accelerations, the landslide simulation generates a final deposit whose extent, thickness, morphological structure and lithological variation closely replicate those observed. Moreover, the model avalanche is consistent kinematically with mapped patterns of bedrock scouring, deposit superelevation, and net force history implied from seismic records. To be successful, the slide mass must be divided into upper, high-friction and lower, low-friction members. This division corresponds to fresh, water-unsaturated and hydrothermally altered, water-saturated rock units and points to a mechanical explanation of the kinematics of the debris avalanche. Success in reproducing many features of the Mount Saint Helens avalanche indicates that debris-deposit data may be used to determine the kinematic histories of less well-observed landslides.
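The random-acceleration idea can be sketched with a one-dimensional toy (all parameters are assumed illustrative values, not the paper's calibration): each particle gains speed from gravity resolved along the slope, loses it to basal friction, and receives Gaussian along-track kicks, so identical particles stop at scattered positions and build a spread-out deposit.

```python
import math, random

# Sketch of a particulate kinematic avalanche (illustrative parameters):
# particles run down a 30-degree slope onto a gentle runout plain, with
# random acceleration kicks mimicking particle interactions and
# unresolved roughness.

random.seed(0)
g = 9.81
mu = 0.45        # basal friction coefficient (assumed)
sigma = 0.8      # std. dev. of random along-track kicks, m/s^2 (assumed)

def slope_deg(s):
    return 30.0 if s < 2000.0 else 2.0   # slope break at 2 km along-track

def run_particle(dt=0.5):
    s, v = 0.0, 0.0                      # along-track distance and speed
    while True:
        th = math.radians(slope_deg(s))
        a = g * (math.sin(th) - mu * math.cos(th)) + random.gauss(0.0, sigma)
        v = max(v + a * dt, 0.0)         # friction cannot drive flow uphill
        if v == 0.0 and s > 2000.0:      # stopped on the runout plain
            return s
        s += v * dt

runouts = [run_particle() for _ in range(200)]
mean_runout = sum(runouts) / len(runouts)
# particles halt at scattered points past the slope break, so the model
# deposit has a finite thickness distribution rather than a single toe
```

Without the random kicks every particle would stop at exactly the same point; the noise alone is what turns a deterministic trajectory into a deposit with measurable extent.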
- Published
- 2006
- Full Text
- View/download PDF
13. Earthquake Simulation by Restricted Random Walks
- Author
-
Steven N. Ward
- Subjects
Scaling law, Slip, Earthquake magnitude, Random walk, Power law, Stress drop, Geophysics, Earthquake simulation, Geochemistry and Petrology, Statistical inference, Statistical physics, Seismology, Mathematics - Abstract
This article simulates earthquake slip distributions as restricted random walks. Random walks offer several unifying insights into earthquake behaviors that physically based simulations do not. With properly tailored variables, random walks generate observed power law rates of earthquake number versus earthquake magnitude (the Gutenberg–Richter relation). Curiously, b-value, the slope of this distribution, not only fixes the ratio of small to large events but also dictates diverse earthquake scaling laws such as mean slip versus fault length and moment versus mean slip. Moreover, b-value determines the overall shape and roughness of earthquake ruptures. For example, mean random walk quakes with b = −½ have elliptical slip distributions characteristic of a uniform stress drop on a crack. Random walk earthquake simulators, tuned by comparison with field data, provide improved bases for statistical inference of earthquake behavior and hazard.
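A minimal illustration of the idea (mine, not Ward's tuned simulator): treat slip as a ±1 random walk that starts at zero and ends when it first returns to zero. The walk length plays the role of rupture size, and the heavy-tailed distribution of first-return times mimics the Gutenberg–Richter-style excess of small events over large ones.

```python
import random

# Toy restricted-random-walk "earthquakes" (illustrative sketch only):
# slip takes +/-1 steps from zero and a rupture ends when slip first
# returns to zero.  First-return times follow a power law, so small
# ruptures vastly outnumber large ones.

random.seed(42)

def rupture_length(max_len=10**4):
    slip, n = 0, 0
    while True:
        slip += random.choice((-1, 1))
        n += 1
        if slip == 0:
            return n
        if n >= max_len:
            return max_len       # truncate the rare very long walks

lengths = [rupture_length() for _ in range(5000)]
n_small = sum(L >= 10 for L in lengths)      # ruptures of >= 10 steps
n_large = sum(L >= 1000 for L in lengths)    # ruptures of >= 1000 steps
# the small-to-large count ratio is roughly a factor of ten: a
# Gutenberg-Richter-like power-law tail emerges from the walk alone
```

The restriction (non-negative slip, forced return to zero) is what shapes the rupture; tuning the step statistics is the toy analogue of tuning b-value in the article.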
- Published
- 2004
- Full Text
- View/download PDF
14. Ritter Island Volcano: lateral collapse and the tsunami of 1888
- Author
-
Steven N. Ward and Simon Day
- Subjects
Shore, Landslide, Debris, Geophysics, Volcano, Geochemistry and Petrology, Seismology, Geology, Collapse - Abstract
SUMMARY In the early morning of 1888 March 13, roughly 5 km³ of Ritter Island Volcano fell violently into the sea northeast of New Guinea. This event, the largest lateral collapse of an island volcano to be recorded in historical time, flung devastating tsunami tens of metres high on to adjacent shores. Several hundred kilometres away, observers on New Guinea chronicled 3-min-period waves up to 8 m high, that lasted for as long as 3 h. These accounts represent the best available first-hand information on tsunami generated by a major volcano lateral collapse. In this article, we simulate the Ritter Island landslide as constrained by a 1985 sonar survey of its debris field and compare predicted tsunami with historical observations. The best agreement occurs for landslides travelling at 40 m s⁻¹, but velocities up to 80 m s⁻¹ cannot be excluded. The Ritter Island debris dropped little more than 800 m vertically and moved slowly compared with landslides that descend into deeper water. Basal friction block models predict that slides with shorter falls should attain lower peak velocities and that 40+ m s⁻¹ is perfectly compatible with the geometry and runout extent of the Ritter Island landslide. The consensus between theory and observation for the Ritter Island waves increases our confidence in the existence of mega-tsunami produced by oceanic volcano collapses two to three orders of magnitude larger in scale.
- Published
- 2003
- Full Text
- View/download PDF
15. Asteroid impact tsunami of 2880 March 16
- Author
-
Steven N. Ward and Erik Asphaug
- Subjects
Geophysics, Tsunami wave, Oceanography, Geochemistry and Petrology, Asteroid, Sedimentary rock, Water velocity, Deep sea, Seismology, Geology, Landfall - Abstract
SUMMARY NASA scientists have given a 1.1-km diameter asteroid (1950 DA) a 0.0 to 0.3 per cent probability of colliding with the Earth in the year 2880. This article examines a scenario where 1950 DA strikes the sea 600 km east of the United States coast. Travelling at 17.8 km s−1, the asteroid would blow a cavity 19 km in diameter and as deep as the ocean (5 km) at the impact site. Tsunami waves hundreds of metres high would follow as the transient impact cavity collapses. The tsunami disperses quickly; but because the waves are so large initially, destructive energy carries basin-wide. Within two hours of the scenario impact, 100-m waves make landfall from Cape Cod to Cape Hatteras. Within 12 hours, 20-m waves arrive in Europe and Africa. Water velocity at the deep ocean floor exceeds 1 m s−1 to 800-km distance, strong enough to leave a widespread tsunami signature in the sedimentary record.
- Published
- 2003
- Full Text
- View/download PDF
16. Cumbre Vieja Volcano: potential collapse and tsunami at La Palma, Canary Islands
- Author
-
Simon Day and Steven N. Ward
- Subjects
Tsunami wave, Landslide, Geophysics, Cumbre Vieja, Volcano, Natural hazard, General Earth and Planetary Sciences, Geology, Seismology - Abstract
Geological evidence suggests that during a future eruption, Cumbre Vieja Volcano on the Island of La Palma may experience a catastrophic failure of its west flank, dropping 150 to 500 km³ of rock into the sea. Using a geologically reasonable estimate of landslide motion, we model tsunami waves produced by such a collapse. Waves generated by the run-out of a 500 km³ (150 km³) slide block at 100 m/s could transit the entire Atlantic Basin and arrive on the coasts of the Americas with 10–25 m (3–8 m) height.
- Published
- 2001
- Full Text
- View/download PDF
17. Landslide tsunami
- Author
-
Steven N. Ward
- Subjects
Atmospheric Science, Geophysics, Ecology, Space and Planetary Science, Geochemistry and Petrology, Earth and Planetary Sciences (miscellaneous), Paleontology, Soil Science, Forestry, Aquatic Science, Oceanography, Earth-Surface Processes, Water Science and Technology - Published
- 2001
- Full Text
- View/download PDF
18. San Francisco Bay Area Earthquake Simulations: A Step Toward a Standard Physical Earthquake Model
- Author
-
Steven N. Ward
- Subjects
Earthquake scenario, Geophysics, Earthquake casualty estimation, Seismic hazard, Seismic microzonation, Earthquake simulation, Geochemistry and Petrology, Earthquake prediction, Types of earthquake, Earthquake warning system, Geology, Seismology - Abstract
Earthquakes in California's San Francisco Bay Area are likely to be more strongly affected by stress interaction than earthquakes in any other place in the world because of the region's closely spaced, subparallel distribution of faults. I believe, therefore, that meaningful quantification of earthquake probability and hazard in the Bay Area can be made only with the guidance provided by physically based and regionwide earthquake models that account for this interaction. This article represents a first step in developing a standard physical earthquake model for the San Francisco Bay Area through realistic, 3000-year simulations of earthquakes on all of the area's major faults. These simulations demonstrate that a standard physical earthquake model is entirely feasible, they illustrate its application, and they blueprint its construction. A standard physical earthquake model provides the mechanism to integrate fully the diverse disciplines within the earthquake research community. As a platform for data utilization and verification, a physical earthquake model can employ directly any earthquake property that is measurable in the field or in the laboratory to tune and test its seismicity products. As a platform for probability forecasts, a physical earthquake model can supply rational estimates of every imaginable earthquake statistic while simultaneously satisfying all slip and earthquake rate constraints. As a platform for hazard analysis, a physical earthquake model can compute earthquake shaking intensity from first principles by convolving a full suite of rupture scenarios with site-specific dislocation Green's functions. Physical earthquake models have advanced greatly in the last decade. Simulations of earthquake generation and recurrence are now sufficiently credible that such calculations can begin to take substantial roles in scientific studies of earthquake probability and hazard. Manuscript received 9 March 1999.
- Published
- 2000
- Full Text
- View/download PDF
19. On the consistency of earthquake moment release and space geodetic strain rates: Europe
- Author
-
Steven N. Ward
- Subjects
Geodetic network, Geodetic datum, Deformation, Geodesy, Moment, Geophysics, Geochemistry and Petrology, Very-long-baseline interferometry, Global Positioning System, Seismology, Geology - Abstract
In this paper, approximately 100 VLBI/SLR/GPS velocities map European strain rates of 9.0 × 10⁻⁸ yr⁻¹ with regional uncertainties of 20 to 40 per cent. Kostrov’s formula translates these strain-rate values into regional geodetic moment rates, Ṁ_geodetic. Two other moment rates, Ṁ_seismic, extracted from a 100-year historical catalogue, and Ṁ_plate, taken from plate-tectonic models, contrast the geodetic rates. In Mediterranean Europe, the ratios of Ṁ_seismic to Ṁ_geodetic are between 0.50 and 0.71. In Turkey the ratio falls to 0.22. Although aseismic deformation may contribute to the earthquake deficit (Ṁ_seismic values less than Ṁ_geodetic), the evidence is not compelling because the magnitudes of the observed shortfalls coincide with the random variations expected in a 100-year catalogue. If the lack of aseismic deformation inferred from the 100-year catalogue holds true for longer periods, then much of Europe’s strain budget would have to be accommodated by more frequent or larger earthquakes than have been experienced this century to raise the ratios of Ṁ_seismic to Ṁ_geodetic to unity. Improved geological fault databases, longer historical earthquake catalogues, and densification of the continent’s space geodetic network will clarify the roles of aseismic deformation versus statistical quiescence.
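The Kostrov-style conversion from strain rate to moment rate can be sketched with assumed round numbers (the shear modulus, seismogenic thickness and area below are my placeholders, not the paper's data), using the common form Ṁ = 2 μ H A ε̇:

```python
# Kostrov-style conversion of a geodetic strain rate into a regional
# moment rate, M_dot = 2 * mu * H * A * strain_rate.  All inputs are
# assumed round numbers for illustration, not the paper's values.

mu = 3.0e10          # shear modulus, Pa (assumed)
H = 15.0e3           # seismogenic thickness, m (assumed)
A = 1.0e12           # region area, m^2 (1000 km x 1000 km, assumed)
eps_dot = 9.0e-8     # strain rate per year (the European figure quoted above)

M_dot_per_year = 2.0 * mu * H * A * eps_dot   # N m per year
# = 8.1e19 N m/yr, roughly the moment of one M_w ~7.2 earthquake per year
```

The point of the exercise is the comparison: a 100-year catalogue whose summed moment falls well short of 100 × Ṁ_geodetic signals either aseismic deformation or a catalogue that is too short.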
- Published
- 1998
- Full Text
- View/download PDF
20. On the consistency of earthquake moment rates, geological fault data, and space geodetic strain: the United States
- Author
-
Steven N. Ward
- Subjects
Peak ground acceleration, Geodetic datum, Induced seismicity, Geodesy, Geophysics, Geochemistry and Petrology, Very-long-baseline interferometry, Global Positioning System, Seismic moment, Basin and Range, Geology, Seismology, Statistics - Abstract
New and dense space geodetic data can now map strain rates over continental-wide areas with a useful degree of precision. Stable strain indicators open the door for space geodesy to join with geology and seismology in formulating improved estimates of global earthquake recurrence. In this paper, 174 GPS/VLBI velocities map United States’ strain rates of 30.0 × 10−8 yr−1 with regional uncertainties of 5 to 50 per cent. Kostrov’s formula translates these strain values into regional geodetic moment rates. Two other moment rates M¯˙seismic and M¯˙geologic , extracted from historical earthquake and geological fault catalogues, contrast the geodetic rate. Because M¯˙geologic , M¯˙seismic and M¯˙geodetic derive from different views of the earthquake engine, each illuminates different features. In California, the ratio of M¯˙geodetic to M¯˙geologic is 1.20. The near-unit ratio points to the completeness of the region’s geological fault data and to the reliability of geodetic measurements there. In the Basin and Range, northwest and central United States, both M¯˙geodetic and M¯˙seismic greatly exceed M¯˙geologic. Of possible causes, high incidences of understated and unrecognized faults probably drive the inconsistency. The ratio of M¯˙seismic to M¯˙geodetic is everywhere less than one. The ratio runs systematically from 70–80 per cent in the fastest straining regions to 2 per cent in the slowest. Although aseismic deformation may contribute to this shortfall, I argue that the existing seismic catalogues fail to reflect the long-term situation. Impelled by the systematic variation of seismic to geodetic moment rates and by the uniform strain drop observed in all earthquakes regardless of magnitude, I propose that the completeness of any seismic catalogue hinges on the product of observation duration and regional strain rate. Slowly straining regions require a proportionally longer period of observation. 
Characterized by this product, gamma distributions model the statistical properties of catalogue completeness as proxied by the ratio of observed seismic moment to geodetic moment. I find that adequate levels of completeness should exist in median catalogues of 200 to 300 years' duration in regions straining at 10−7 yr−1 (comparable to southern California). Similar levels of completeness will take more than 20 000 years of earthquake data in regions straining at 10−9 yr−1 (comparable to the southeastern United States). Predictions from this completeness statistic closely mimic the observed Ṁ_seismic to Ṁ_geodetic ratios and allow quantitative responses to previously unanswerable questions such as: 'What is the likelihood that the seismic moment extracted from an earthquake catalogue of X years falls within Y per cent of the true long-term rate?' The combination of historical seismicity, fault geology and space geodesy offers a powerful tripartite attack on earthquake hazard. Few obstacles block similar analyses in any region of the world.
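The abstract's chain from strain rate to moment rate can be sketched numerically. A minimal illustration of Kostrov's formula, assuming representative values (shear modulus 3 × 10^10 Pa, 15 km seismogenic thickness, an invented 500 km × 500 km region) that the abstract does not specify:

```python
import math

def kostrov_moment_rate(strain_rate_per_yr, area_m2, mu=3.0e10, thickness_m=15e3):
    """Geodetic moment rate, M_dot = 2 * mu * H * A * eps_dot (N m / yr).

    The factor of 2 follows one common scalar form of Kostrov's formula;
    the exact coefficient depends on the full strain-rate tensor."""
    return 2.0 * mu * thickness_m * area_m2 * strain_rate_per_yr

# Illustrative region only: 500 km x 500 km straining at 1e-7 / yr
rate = kostrov_moment_rate(1e-7, 500e3 * 500e3)

# Magnitude whose moment equals one year's budget (Hanks-Kanamori relation)
mw_equiv = (math.log10(rate) - 9.05) / 1.5
```

The print of such a budget as a single equivalent magnitude per year is only a unit check, but it makes the strain-to-moment bookkeeping concrete.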
- Published
- 1998
- Full Text
- View/download PDF
21. Dogtails versus rainbows: Synthetic earthquake rupture models as an aid in interpreting geological data
- Author
-
Steven N. Ward
- Subjects
Shore, Seismic gap, Geography, Trough (geology), Rainbow, Slip (materials science), Stress drop, Geophysics, Geochemistry and Petrology, Earthquake rupture, Fault model, Geology, Seismology - Abstract
Geologists have been collecting, for decades, information from historical and paleoearthquakes that could contribute to the formulation of a “big picture” of the earthquake engine. Observations of large earthquake ruptures, unfortunately, are always going to be spotty in space and time, so the extent to which geological information succeeds in contributing to a grander view of earthquakes will depend not only on the quantity and quality of the data collected but also on the means by which they are interpreted. This article tries to understand geological data more fully through carefully tailored computer simulations of fault ruptures. Dogtails and rainbows are two types of fault rupture terminations that can be recognized in the field and interpreted through these models. Rainbows are concave-down ruptures that indicate complete stress drop and characteristic slip. Rainbow terminations usually coincide with fault ends or strong segment boundaries. Dogtails are concave-up ruptures that indicate incomplete stress drop and noncharacteristic slip. Dogtail terminations can happen anywhere along a fault or fault segment. The surface slip pattern of the magnitude 6.6, 1979 Imperial Valley, California, earthquake shows both dogtail and rainbow terminations. The rainbow confirms the presence of a strong fault segment boundary 6 km north of the international border that had been suggested by Sieh (1996). The dogtail implies that the displacement observed in 1979 is not characteristic. By combining paleoseismic information with the surface slip patterns from this event and the magnitude 7.1, 1940 Imperial Valley earthquake, I develop a quantitative Imperial fault model with northern, central, and southern segments possessing 50, 110, and 50 bar strength, respectively. Both the 1940 and 1979 events caused 1-m-amplitude dogtailed ruptures of the northern segment; however, characteristic slip of the segment is more likely to be about 3 m.
To illustrate the full spectrum of potential rupture modes, models were run forward in time to generate a 2000 year rupture “encyclopedia.” Even with well-constrained segmentation and strengths, modest changes in two friction law parameters produce several plausible histories. Further discrimination awaits analysis of the extensive paleoseismic record that geologists believe exists in the shore deposits of the intermittent lakes of the Salton Trough.
- Published
- 1997
- Full Text
- View/download PDF
22. More on Mmax
- Author
-
Steven N. Ward
- Subjects
Stress drop, Observational evidence, Geophysics, Geochemistry and Petrology, Earthquake hazard, Earthquake magnitude, Maximum magnitude, Slip (materials science), Hazard analysis, Geology, Seismology - Abstract
Mmax, the maximum magnitude earthquake that a fault is likely to suffer, plays an important role in earthquake hazard estimation. Although observational evidence summarized in plots of characteristic earthquake magnitude (Mchar) versus fault length indicates that smaller faults produce lower magnitude events, an argument has been made that any fault, regardless of its length, should have Mmax near magnitude 8. The rationale for this argument charges that the contrary observational evidence stems from historical catalogs of limited extent and that it largely excludes nonconventional earthquakes in which several short and apparently disconnected fault segments fail simultaneously. This article addresses Mmax using computer models of rupture on faults of various strengths and configurations. Computer models have advantages in that (a) Mmax earthquakes can always be generated by forcing complete stress drop on fully stressed faults, thus avoiding the limitations of short historical catalogs, and (b) the circumstances necessary for the failure of several segments to contribute to a large Mmax can be investigated quantitatively. I find that for a strike-slip California environment, it is physically unlikely for an M 8 event to break less than 300 to 400 km of fault. Were this M 8 rupture to occur on as few as five independent segments, the shear strength of the participating faults would have to be raised to implausible levels. If the fault segments are not independent and their coseismic stress fields interact, then amplifications in slip are possible without a drastic increase in strength. The range of fault geometries where strong interactions and amplifications of stress occur, however, is very restricted, and discontinuous faults separated by even 5% of their length act more or less independently.
Mmax earthquakes breaking realistic-looking distributions of discontinuous faults rarely are more than 0.1 magnitude unit bigger than would be predicted from a moment summation based on the characteristic magnitude Mchar of each of the individual faults. A prudent course in hazard analysis differentiates Mmax from Mchar, allowing Mmax to be 0.2 to 0.3 units larger than Mchar but not automatically equal to 8.
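The closing sentences rest on a moment summation: magnitudes combine through their moments, not linearly. A sketch using the Hanks-Kanamori relation, with hypothetical segment magnitudes chosen for illustration:

```python
import math

def moment_from_mw(mw):
    return 10 ** (1.5 * mw + 9.05)        # seismic moment in N m

def mw_from_moment(m0):
    return (math.log10(m0) - 9.05) / 1.5

# Hypothetical example: five segments, each with characteristic Mw 7.0
segments = [7.0] * 5
joint_mw = mw_from_moment(sum(moment_from_mw(m) for m in segments))
# Five equal moments raise Mw by only (2/3) * log10(5), about 0.47 units,
# leaving the joint rupture near Mw 7.5 -- well short of magnitude 8.
```

This is why simultaneous failure of a handful of short segments cannot by itself manufacture an M 8 event.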
- Published
- 1997
- Full Text
- View/download PDF
23. A synthetic seismicity model for southern California: Cycles, probabilities, and hazard
- Author
-
Steven N. Ward
- Subjects
Atmospheric Science, Ecology, Earthquake prediction, Paleontology, Soil Science, Forestry, Slip (materials science), Aquatic Science, Induced seismicity, Oceanography, Power law, Standard deviation, Geophysics, Seismic hazard, Space and Planetary Science, Geochemistry and Petrology, Earthquake hazard, Earth and Planetary Sciences (miscellaneous), Spatial variability, Seismology, Geology, Earth-Surface Processes, Water Science and Technology - Abstract
The absence of a long historical catalog of observed seismicity with which to constrain earthquake recurrence behaviors is a fundamental stumbling block to earthquake prediction in California. Conceding that this limitation is not likely to relax in the foreseeable future, alternative approaches must be sought to extend the catalog artificially. In this article, I evaluate the long-term behaviors of earthquakes on a map-like set of southern California faults through computer simulations that incorporate the physics of earthquake stress transfer and are constrained by excellent, but restricted, bodies of geological and seismological data. I find that model seismicity fluctuates on both short (decades) and long (centuries) timescales but that it possesses a well-defined mean and standard deviation. Seismicity fluctuations correlate across different magnitudes, and the long-term cycles of smaller events seem to lead cycles of larger events. Short-period seismicity fluctuations do not exhibit this tendency, and short-term changes in low-magnitude (M 5+) seismicity are not likely to be an effective predictor of future large events, at least for the region as a whole. As do real faults, the model faults produce characteristic and power-law quakes in variable ratios with diverse periodic and nonperiodic behaviors. Generally, larger events tend to occur quasi-periodically, and smaller ones tend to cluster; however, only for a few earthquake classes and certain locations is recurrence notably non-Poissonian. An important use of synthetic seismicity is in the construction of earthquake hazard maps because it firmly grounds previously ad hoc assumptions regarding frequency-magnitude distributions, multiple-segment failure statistics, and rupture extents, while satisfying a spectrum of geological constraints such as fault slip rate, segment recurrence interval, and slip per event.
With its depth of temporal and spatial coverage, synthetic seismicity also provides a means to investigate the time dependence of seismic hazard. Because hazard likelihood is a concatenation of the recurrence statistics from many seismic sources, in only about 40–50% of the regions near the major faults do sequences of 0.1 g or 0.2 g exceedances differ from Poissonian.
- Published
- 1996
- Full Text
- View/download PDF
24. Progressive growth of San Clemente Island, California, by blind thrust faulting: implications for fault slip partitioning in the California Continental Borderland
- Author
-
Gianluca Valensise and Steven N. Ward
- Subjects
Geophysics, Pleistocene, Geochemistry and Petrology, GPS data, Thrust fault, Slip (materials science), Fault slip, Strike-slip tectonics, Marine terrace, Geology, Seismology - Abstract
We find that the genesis of San Clemente Island and its surrounding submarine platform is consistent with progressive slip on two southeast-striking, southwest-dipping, blind thrust fault segments. Since their inception 2 to 5 Ma, 3 km of compression normal to the N150°E fault strike has been accommodated, with 1700 m of domal uplift of the San Clemente Anticlinorium. The existence of an extensive suite of Pleistocene marine terraces provides evidence that slip and uplift are continuing today. Based on direct terrace fossil age determinations and correlations of terrace heights with global sea-level curves, we estimate that San Clemente Island is currently uplifting at between 0.2 and 0.5 mm yr−1. This translates into 0.6–1.5 mm yr−1 of thrusting on the causative blind thrusts beneath the island. Unlike the situation at nearby Palos Verdes, where a simple twist in a regional strike-slip fault accommodated both fault-parallel and fault-normal motions, the shallow dips of the thrusts suggest that, if regional strike-slip motion on the San Clemente Fault exists, it must be partitioned onto through-going surfaces distinct from the thrusts. Current GPS data are sparse and equivocal, but they indicate that 1–4 mm yr−1 of compression and 4–7 mm yr−1 of strike slip are absorbed in the California Continental Borderland. With the Palos Verdes Fault taking some 3 mm yr−1 from the strike-slip budget, 1–4 mm yr−1 of motion could be present on a through-going San Clemente Fault. When translated into an annual moment release rate using Kostrov's formula, GPS strains predict that between 2.5 and 4.9 × 10^17 N m yr−1 of earthquake potential is available offshore from San Diego to the Santa Barbara Channel. Distribution of this moment budget among various earthquake magnitudes is arguable, but we predict that M > 6 quakes in the Borderland could recur every 30 to 80 years, and M > 7 quakes might recur every 310 to 580 years.
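The final step, converting a geodetic moment budget into recurrence intervals, can be caricatured by devoting the whole budget to one magnitude. This naive sketch deliberately ignores the distribution of moment across magnitudes that the authors actually assume, so it yields intervals shorter than the 310 to 580 years quoted in the abstract:

```python
def moment_from_mw(mw):
    return 10 ** (1.5 * mw + 9.05)            # N m (Hanks-Kanamori)

def naive_recurrence_yr(mw, moment_rate_nm_per_yr):
    """Years needed to accumulate one event's moment if the ENTIRE
    budget were spent on earthquakes of this single magnitude."""
    return moment_from_mw(mw) / moment_rate_nm_per_yr

# Borderland budget from the abstract: 2.5e17 to 4.9e17 N m / yr
lo, hi = 2.5e17, 4.9e17
t_m7 = (naive_recurrence_yr(7.0, hi), naive_recurrence_yr(7.0, lo))
# roughly 70 to 140 yr; sharing the budget with smaller quakes
# stretches the M > 7 interval toward the published 310-580 yr
```

The gap between the naive and published intervals is itself instructive: most of a region's moment budget is consumed by events below the largest magnitude.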
- Published
- 1996
- Full Text
- View/download PDF
25. Area-based tests of long-term seismic hazard predictions
- Author
-
Steven N. Ward
- Subjects
Hazard, Peak ground acceleration, Geophysics, Seismic hazard, Geochemistry and Petrology, Earthquake prediction, Statistics, Hazard ratio, Environmental science, Cutoff, Hazard map, Term (time) - Abstract
This article develops several area-based tests of long-term seismic hazard predictions. The tests stem from the hypothesis that the observed fractional area of hazard exceedance should follow in proportion to the region's predicted likelihood of exceedance. For example, a prediction is successful if roughly 30% of the area mapped as having a 30% likelihood of exceeding some hazard threshold in a certain time interval actually does suffer this level of shaking. Although tests of earthquake predictions are always equivocal, the success or failure of a forecast hazard map can be argued strongly from a statistical assessment of the hundreds of individual point predictions comprising the map. The specific forecasts to be examined are the 30-yr probabilities of peak ground acceleration exceedance that were presented by Ward (1994) for southern California. In lieu of a lengthy historical set of recorded peak accelerations, observational data for the tests were derived from the 150-yr earthquake catalog and standard attenuation relationships. In all cases, the observed hazard ratios were highly correlated with the predictions. Within 95% confidence bounds, the observed levels of hazard coincided with the predicted hazard from models that included a low-magnitude cutoff of M = 5.5 to 6, approximately the same limit as enlisted in the earthquake catalog. I propose a pass/fail criterion for acceleration hazard maps, and I suggest that all seismic hazard predictions submit to area-based tests.
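The test's core bookkeeping (bin map cells by forecast probability, then compare each bin's observed exceedance fraction with its mean forecast) can be sketched with synthetic data. This is my illustration of the general idea, not Ward's actual procedure:

```python
import numpy as np

def area_based_test(predicted_prob, exceeded, nbins=10):
    """Per probability bin, return (mean forecast, observed exceedance
    fraction); a well-calibrated hazard map makes the two agree."""
    edges = np.linspace(0.0, 1.0, nbins + 1)
    idx = np.clip(np.digitize(predicted_prob, edges) - 1, 0, nbins - 1)
    table = {}
    for b in np.unique(idx):
        mask = idx == b
        table[float(edges[b])] = (predicted_prob[mask].mean(),
                                  exceeded[mask].mean())
    return table

# Synthetic check: a perfectly calibrated forecast over 100 000 map cells
rng = np.random.default_rng(0)
p = rng.uniform(0.0, 1.0, 100_000)
obs = rng.uniform(0.0, 1.0, p.size) < p   # exceedance occurs with prob p
table = area_based_test(p, obs)
# in every bin the observed fraction should track the mean forecast
```

A real application would replace the synthetic `p` and `obs` with the mapped 30-yr exceedance probabilities and the catalog-derived shaking record.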
- Published
- 1995
- Full Text
- View/download PDF
26. A multidisciplinary approach to seismic hazard in southern California
- Author
-
Steven N. Ward
- Subjects
Earthquake scenario, Geophysics, Seismic hazard, Geochemistry and Petrology, Urban seismic risk, Seismic moment, Paleoseismology, Induced seismicity, Seismic risk, Seismology, Geology - Abstract
A serious obstacle facing seismic hazard assessment in southern California has been the characterization of earthquake potential in areas far from known major faults where historical seismicity and paleoseismic data are sparse. This article attempts to fill the voids in earthquake statistics by generating “master model” maps of seismic hazard that blend information from geology, paleoseismology, space geodesy, observational seismology, and synthetic seismicity. The current model suggests that about 40% of the seismic moment release in southern California could occur in widely scattered areas away from the principal faults. As a result, over a 30-yr period, nearly all of the region from the Pacific Ocean to 50 km east of the San Andreas Fault has a greater than 50/50 chance of experiencing moderate shaking of 0.1 g or greater, and about a 1 in 20 chance of suffering levels exceeding 0.3 g. For most of the residents of southern California, the lion's share of hazard from moderate earthquake shaking over a 30-yr period derives from smaller, closer, more frequent earthquakes in the magnitude range 5 ≤ M ≤ 7 rather than from large San Andreas ruptures, whatever their likelihood.
- Published
- 1994
- Full Text
- View/download PDF
27. Constraints On the Seismotectonics of the Central Mediterranean From Very Long Baseline Interferometry
- Author
-
Steven N. Ward
- Subjects
Mediterranean, Geography, Seismotectonics, Block, Geodesy, African Plate, Geophysics, Geochemistry and Petrology, Peninsula, Very-long-baseline interferometry, Compression (geology), Seismology, Historical record, Geology - Abstract
SUMMARY Very Long Baseline Interferometry (VLBI) determined site velocities from seven stations in western Europe reveal a stable continental platform north of the Alps. Deformations between Sweden, Germany and Spain cannot exceed 2 mm yr−1. South of the Alps, significant motions are occurring with respect to stable Europe. Two sites east of the Apennine mountains on peninsular Italy have northeast-trending velocities which increase from 2 mm yr−1 in the north at Medicina to 6 mm yr−1 in the south at Matera. In contrast, the VLBI site in the south-eastern corner of Sicily is moving 7 mm yr−1 to the north-north-west. These velocities are largely explained if southern Sicily is attached to a north-westerly moving African Plate and the eastern portion of the Italian peninsula forms part of a hypothesized Adriatic Sea crustal block which is rotating counter-clockwise with respect to Europe about a pole near the Alps. Such an explanation is consistent with the styles of the larger historical earthquakes of the region, which show NE-SW extension across the Apennines, north-south convergence across the Alps, and NE-SW compression in coastal Yugoslavia. The Adria plate model generates 1.5–2.2 × 10^18 N m yr−1 of potential earthquake moment along the northern and central Apennines. Historical records suggest that 30–60 per cent of this moment is released seismically. Based on a direct strain-rate measurement, recurrence intervals for Italian earthquakes south of Medicina are estimated to be 12–46 yr for M 6.5+ quakes and 35–143 yr for M 7.0+ quakes.
- Published
- 1994
- Full Text
- View/download PDF
28. The Palos Verdes terraces, California: Bathtub rings from a buried reverse fault
- Author
-
Steven N. Ward and Gianluca Valensise
- Subjects
Atmospheric Science, Geography, Ecology, Paleontology, Soil Science, Forestry, Slip (materials science), Aquatic Science, Induced seismicity, Geodynamics, Oceanography, Tectonics, Geophysics, Sinistral and dextral, Space and Planetary Science, Geochemistry and Petrology, Peninsula, Earth and Planetary Sciences (miscellaneous), Thrust fault, Fault model, Geology, Seismology, Earth-Surface Processes, Water Science and Technology - Abstract
Uplift of the Palos Verdes peninsula has long been associated with a northwest-trending, southwest-dipping reverse fault. Unfortunately, the Palos Verdes Hills fault has no obvious surface displacement and little background seismicity to substantiate its dimension, orientation, or earthquake potential. In this paper we investigate the tectonic style and slip rate of the Palos Verdes Hills fault and the uplift history of the Palos Verdes Hills by analyzing the geometry of 13 marine terraces that encircle the peninsula in a bathtub-ring configuration. Elevations of 211 terrace remnants constrain a fault model with 3.0 to 3.7 mm yr−1 of oblique, dextral/reverse slip on a fault dipping 67° at 6 to 12 km depth beneath the peninsula. If the rate was constant through time, fault inception would have occurred 2.4–3.0 Ma. We propose that the largest credible earthquakes on the fault have magnitude ≈6¾ and could recur about every 2000 years.
- Published
- 1994
- Full Text
- View/download PDF
29. How regularly do earthquakes recur? A synthetic seismicity model for the San Andreas Fault
- Author
-
Saskia Goes and Steven N. Ward
- Subjects
Seismic gap, Geography, San Andreas Fault, Magnitude, Induced seismicity, Fault (geology), Geophysics, Aperiodicity, General Earth and Planetary Sciences, Geology, Seismology, Slip rate - Abstract
We have been attempting to improve estimates of long-term earthquake recurrence probabilities along the San Andreas Fault by means of synthetic seismicity calculations. The calculations are based on the concept of fault segmentation and incorporate the physics of static dislocation theory. Forecasts constructed from synthetic seismicity are robust in that they embody regional seismicity information over several units of magnitude; they tie together, in a physical manner, a spectrum of fault segment features such as length, strength, characteristic magnitude, mean repeat time, and slip rate; they can reasonably account for fault segment interactions; and they are formulated from a catalog which can be extended as long as needed to be statistically significant. We contend that earthquake recurrence is more aperiodic than previously thought and, as a result, probabilities of major San Andreas earthquakes should be revised downward.
- Published
- 1993
- Full Text
- View/download PDF
30. Backarc thrust faulting and tectonic uplift along the Caribbean Sea Coast during the April 22, 1991 Costa Rica earthquake
- Author
-
Steven N. Ward and George Plafker
- Subjects
Geography, Intertidal zone, Slip (materials science), Coral reef, Driftwood, Geophysics, Tectonic uplift, Geochemistry and Petrology, Beach ridge, Thrust fault, Seismology, Geology - Abstract
Surface deformation and a tsunami accompanied the destructive April 22, 1991, Costa Rica-Panama earthquake (Ms = 7.5). Along a 135 km stretch of Caribbean coast, coseismic uplift was measured from the lower and upper limits of sessile intertidal organisms stranded on coral reefs, from preearthquake and postearthquake high-tide levels located from driftwood lines on beaches, and from preearthquake and postearthquake tide levels as pointed out by local residents. The nature and distribution of offshore vertical displacements were further constrained from analysis of measured run-up heights and reported arrival times of the tsunami. Uplift detected along the coast jumped, within 4 km, from zero to 157 cm near Limon and generally decreased over a distance of 70 km southward to the border with Panama. These data map an axis of uplift that intersects the coastal beach ridge just north of the port of Moin and runs offshore to the east and south roughly parallel to the coast. No surface faulting was found. The earthquake and tsunami were generated by backarc thrusting along faults that bound the north Panama deformed belt and dip from the Caribbean Sea beneath Costa Rica and northern Panama. Combined geodetic and seismological data indicate that the main rupture dips landward at an angle of about 30° and is approximately 40 km wide and 80 km long. Dislocation models suggest 2.2 m of slip on a causative thrust fault striking between 105° and 120°. We estimate that the repeat time for this type of earthquake is 200 to 1100 years. The historical record and new isotopic data favor the middle of the range.
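As a rough consistency check, the quoted rupture dimensions and slip imply a moment magnitude near the reported Ms = 7.5. A sketch assuming a typical crustal shear modulus of 3 × 10^10 Pa (a value the abstract does not state):

```python
import math

mu = 3.0e10                    # Pa, assumed shear modulus
length, width = 80e3, 40e3     # m, rupture dimensions from the abstract
slip = 2.2                     # m, modeled thrust slip

m0 = mu * length * width * slip          # seismic moment, M0 = mu * A * s
mw = (math.log10(m0) - 9.05) / 1.5       # Hanks-Kanamori moment magnitude
# mw comes out near 7.5, consistent with the reported surface-wave magnitude
```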
- Published
- 1992
- Full Text
- View/download PDF
31. A Tsunami Ball Approach to Storm Surge and Inundation: Application to Hurricane Katrina, 2005
- Author
-
Steven N. Ward
- Subjects
Meteorology, Advection, Storm surge, Volume density, Momentum, Geophysics, Hurricane Katrina, Continuity equation, Geology, Water Science and Technology - Abstract
Most analyses of storm surge and inundation solve equations of continuity and momentum on fixed finite-difference/finite-element meshes. I develop a completely new approach that uses a momentum equation to accelerate bits or balls of water over variable depth topography. The thickness of the water column at any point equals the volume density of balls there. In addition to being more intuitive than traditional methods, the tsunami ball approach has several advantages. (a) By tracking water balls of fixed volume, the continuity equation is satisfied automatically and the advection term in the momentum equation becomes unnecessary. (b) The procedure is meshless in the finite-difference/finite-element sense. (c) Tsunami balls care little if they find themselves in the ocean or inundating land. (d) Tsunami ball calculations of storm surge can be done on a laptop computer. I demonstrate and calibrate the method by simulating storm surge and inundation around New Orleans, Louisiana caused by Hurricane Katrina in 2005 and by comparing model predictions with field observations. To illustrate the flexibility of the tsunami ball technique, I run two “What If” hurricane scenarios—Katrina over Savannah, Georgia and Katrina over Cape Cod, Massachusetts.
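The parcel-tracking idea can be caricatured in one dimension: balls of fixed volume accelerate down the gradient of the water surface, and column thickness is recovered by counting balls per cell, so continuity holds by construction. All numbers below (domain, step sizes, ball count) are an invented toy, not Ward's implementation:

```python
import numpy as np

g, dx, dt, L = 9.81, 100.0, 1.0, 20_000.0    # toy domain: 20 km, 100 m cells
ncell = int(L / dx)
bottom = np.zeros(ncell)                     # flat seabed (assumption)
vol = 1.0                                    # fixed volume carried by each ball

rng = np.random.default_rng(1)
balls_x = rng.uniform(0.0, L, 50_000)        # ball positions
balls_v = np.zeros_like(balls_x)             # ball velocities

def thickness(positions):
    """Water-column thickness = volume density of balls in each cell."""
    counts, _ = np.histogram(positions, bins=ncell, range=(0.0, L))
    return counts * vol / dx

total_volume0 = thickness(balls_x).sum() * dx
for _ in range(100):
    surface = thickness(balls_x) + bottom    # free-surface height per cell
    grad = np.gradient(surface, dx)
    cell = np.clip((balls_x // dx).astype(int), 0, ncell - 1)
    balls_v += -g * grad[cell] * dt          # momentum equation per ball
    balls_x = np.clip(balls_x + balls_v * dt, 0.0, L - 1e-6)
# no advection term and no explicit continuity equation are needed:
# volume is conserved simply because no ball is created or destroyed
```

Adding real topography in `bottom` and letting balls climb onto dry cells is what turns this caricature toward the inundation calculations described above.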
- Published
- 2009
- Full Text
- View/download PDF
32. A synthetic seismicity model for the Middle America Trench
- Author
-
Steven N. Ward
- Subjects
Atmospheric Science, Chaotic, Soil Science, Magnitude, Aquatic Science, Fault (geology), Induced seismicity, Oceanography, Geochemistry and Petrology, Earth and Planetary Sciences (miscellaneous), Aftershock, Earth-Surface Processes, Water Science and Technology, Geography, Ecology, Paleontology, Forestry, Foreshock, Geophysics, Space and Planetary Science, Trench, Probability distribution, Seismology, Geology - Abstract
A novel iterative technique is presented for building models of seismicity and fault interaction that are physically acceptable and geometrically and kinematically correct; it is based on the concept of fault segmentation and computed using 2-D static dislocation theory. The technique is applied in two steps to seismicity observed at the Middle America Trench. The first constructs generic models which randomly draw segment strengths and lengths from a 2-D probability distribution. The second constructs predictive models in which segment lengths and strengths are adjusted to mimic the actual geography and timing of large historical earthquakes. Both types of models reproduce the statistics of seismicity over five units of magnitude and duplicate other aspects including foreshock and aftershock sequences, migration of foci, and the capacity to produce both characteristic and noncharacteristic earthquakes. Over a period of about 150 yr, the complex interaction of fault segments and the nonlinear failure conditions conspire to transform an apparently deterministic model into a chaotic one.
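The segment-interaction mechanism described here can be caricatured without dislocation theory: segments load steadily, fail at a strength threshold, and pass a fraction of their stress drop to neighbours, which can cascade into multi-segment events. All parameters below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20                                        # number of fault segments
strength = rng.uniform(50.0, 110.0, n)        # failure thresholds (bars)
stress = rng.uniform(0.0, 1.0, n) * strength  # initial loading
load_rate, transfer = 1.0, 0.3                # bars/yr; fraction passed on

catalog = []                                  # (year, segments broken)
for year in range(5000):
    stress += load_rate                       # steady tectonic loading
    failing = set(np.flatnonzero(stress >= strength))
    event = set()
    while failing:                            # cascade by stress transfer
        s = failing.pop()
        event.add(s)
        drop, stress[s] = stress[s], 0.0      # complete stress drop
        for nb in (s - 1, s + 1):             # half the transfer each side
            if 0 <= nb < n:
                stress[nb] += transfer * drop / 2.0
                if stress[nb] >= strength[nb] and nb not in event:
                    failing.add(nb)
    if event:
        catalog.append((year, len(event)))

sizes = [size for _, size in catalog]
# small events are typically more common than large multi-segment
# cascades, the toy analogue of "characteristic" ruptures
```

Even this stripped-down version shows how deterministic rules plus nonlinear failure thresholds yield irregular, hard-to-predict event histories.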
- Published
- 1991
- Full Text
- View/download PDF
33. The 1960 Chile earthquake: inversion for slip distribution from surface deformation
- Author
-
Steven N. Ward and Sergio Barrientos
- Subjects
Geophysics, Subduction, Geochemistry and Petrology, Pacific Plate, Lithosphere, Seismic moment, Tide gauge, Slip (materials science), Episodic tremor and slip, Seismic risk, Geodesy, Seismology, Geology - Abstract
SUMMARY A total of 166 observations of sea-level change, 130 measurements of elevation difference, and 16 determinations of horizontal strain provide an excellent view of the (quasi-)static source process of the great 1960 Chilean earthquake. These surface deformation data were employed in classical uniform slip fault models as well as more recently developed models that allow spatial variability of slip. The best uniform slip planar (USP) model is 850 km long, 130 km wide, and dips 20°. Seventeen metres of fault displacement contributed to a USP moment of 9.4 × 10^22 N m. The variable slip planar (VSP) model concentrates slip on a 900 km long, 150 km wide band parallel to the coast. Several peaks of slip with dimensions of 50–100 km appear in this band and are thought to represent major subduction zone asperities. Important fractures of the oceanic lithosphere bound the 1960 rupture and are offered as a potential source of fault segmentation within the Chilean subduction zone. The VSP moment for the 1960 earthquake totals 9.5 × 10^22 N m, about one fifth of the value estimated for the foreshock-mainshock sequence from seismic methods. Except for areas out to sea, geodetic resolution on the fault is fairly uniform. Thus, it is unlikely that slip missed by the network could increase the VSP moment much beyond 1.8 × 10^23 N m. Several patches of moment, isolated from the main body at 80–110 km depth, are found down dip in the VSP model and are presumably indicative of aseismic slip. One patch at the northern end of the rupture is probably associated with the initiation phase of the mainshock, although the time sequence of the relationship is unknown. Tide gauge records suggest that another patch between 40° and 43°S, responsible for the observed strain and uplifts inland at those latitudes, is not of coseismic origin, but derives from in-place, post-seismic creep over several years.
Apparently, great 1960-type events are not typical members of the ∼ 128 yr earthquake cycle in south-central Chile. The Nazca-South America boundary here is characterized by a variable rupture mode in which major asperities are completely broken by great earthquakes only once in four or five earthquake cycles. The more frequent large earthquakes, that geographically overlap the great events, fill in between the locked zones.
- Published
- 1990
- Full Text
- View/download PDF
34. Source parameters of the great Sumatran megathrust earthquakes of 1797 and 1833 inferred from coral microatolls
- Author
-
Steven N. Ward, Mohamed Chlieh, Kerry Sieh, Danny H. Natawidjaja, Hai Cheng, R. Lawrence Edwards, Bambang W. Suwargadi, Jean Philippe Avouac, and John Galetzka
- Subjects
Atmospheric Science, Coral, Soil Science, Intertidal zone, Magnitude, Paleoseismology, Aquatic Science, Oceanography, Megathrust earthquake, Latitude, Geochemistry and Petrology, Earth and Planetary Sciences (miscellaneous), Sea level, Earth-Surface Processes, Water Science and Technology, Ecology, Paleontology, Forestry, Geophysics, Space and Planetary Science, Island arc, Geology, Seismology - Abstract
Large uplifts and tilts occurred on the Sumatran outer arc islands between 0.5° and 3.3°S during great historical earthquakes in 1797 and 1833, as judged from relative sea level changes recorded by annually banded coral heads. Coral data for these two earthquakes are most complete along a 160-km length of the Mentawai islands between 3.2° and 2°S. Uplift there was as great as 0.8 m in 1797 and 2.8 m in 1833. Uplift in 1797 extended 370 km, between 3.2° and 0.5°S. The pattern and magnitude of uplift imply megathrust ruptures corresponding to moment magnitudes (M_w) in the range 8.5 to 8.7. The region of uplift in 1833 ranges from 2° to at least 3.2°S and, judging from historical reports of shaking and tsunamis, perhaps as far as 5°S. The patterns and magnitude of uplift and tilt in 1833 are similar to those experienced farther north, between 0.5° and 3°N, during the giant Nias-Simeulue megathrust earthquake of 2005; the outer arc islands rose as much as 3 m and tilted toward the mainland. Elastic dislocation forward modeling of the coral data yields megathrust ruptures with moment magnitudes ranging from 8.6 to 8.9. Sparse accounts at Padang, along the mainland west coast at latitude 1°S, imply tsunami runups of at least 5 m in 1797 and 3–4 m in 1833. Tsunamis simulated from the pattern of coral uplift are roughly consistent with these reports. The tsunami modeling further indicates that the Indian Ocean tsunamis of both 1797 and 1833, unlike that of 2004, were directed mainly south of the Indian subcontinent. Between about 0.7° and 2.1°S, the lack of vintage 1797 and 1833 coral heads in the intertidal zone demonstrates that interseismic submergence has now nearly equaled the coseismic emergence that accompanied those earthquakes. The interseismic strains accumulated along this reach of the megathrust have thus approached or exceeded the levels relieved in 1797 and 1833.
- Published
- 2006
- Full Text
- View/download PDF
35. Paleogeodetic records of seismic and aseismic subduction from central Sumatran microatolls, Indonesia
- Author
-
Hai Cheng, R. Lawrence Edwards, Kerry Sieh, Danny H. Natawidjaja, John Galetzka, Bambang W. Suwargadi, and Steven N. Ward
- Subjects
Atmospheric Science, Ecology, Subduction, Submersion (coastal management), Paleontology, Soil Science, Forestry, Paleoseismology, Aquatic Science, Oceanography, Geophysics, Space and Planetary Science, Geochemistry and Petrology, Trench, Earth and Planetary Sciences (miscellaneous), Tide gauge, Aseismic slip, Far East, Seismology, Sea level, Geology, Earth-Surface Processes, Water Science and Technology - Abstract
We utilize coral microatolls in western Sumatra to document vertical deformation associated with subduction. Microatolls are very sensitive to fluctuations in sea level and thus act as natural tide gauges. They record not only the magnitude of vertical deformation associated with earthquakes (paleoseismic data), but also continuously track the long-term aseismic deformation that occurs during the intervals between earthquakes (paleogeodetic data). This paper focuses on the twentieth-century paleogeodetic history of the equatorial region. Our coral paleogeodetic record of the 1935 event reveals a classical example of deformations produced by seismic rupture of a shallow subduction interface. The site closest to the trench rose 90 cm, whereas sites farther east sank by as much as 35 cm. Our model reproduces these paleogeodetic data with a 2.3 m slip event on the interface 88 to 125 km from the trench axis. Our coral paleogeodetic data reveal slow submergence during the decades before and after the event in the areas of coseismic emergence. Likewise, interseismic emergence occurred before and after the 1935 event in areas of coseismic submergence. Among the interesting phenomena we have discovered in the coral record is evidence of a large aseismic slip or “silent event” in 1962, 27 years after the 1935 event. Paleogeodetic deformation rates in the decades before, after, and between the 1935 and 1962 events have varied both temporally and spatially. During the 25 years following the 1935 event, submergence rates were dramatically greater than in prior decades. During the past four decades, however, rates have been lower than in the preceding decades, but are still higher than they were prior to 1935. These paleogeodetic records enable us to model the kinematics of the subduction interface throughout the twentieth century.
- Published
- 2004
- Full Text
- View/download PDF
36. Planetary cratering: A probabilistic approach
- Author
-
Steven N. Ward
- Subjects
Atmospheric Science ,Population ,Soil Science ,Aquatic Science ,Oceanography ,Astrobiology ,Impact crater ,Geochemistry and Petrology ,Bolide ,Planet ,Earth and Planetary Sciences (miscellaneous) ,education ,Ejecta ,Earth-Surface Processes ,Water Science and Technology ,education.field_of_study ,Ecology ,Paleontology ,Forestry ,Geophysics ,Early Earth ,Regolith ,Space and Planetary Science ,Asteroid ,Geology - Abstract
I develop several statistical indices of cratering on planetary surfaces based on Poissonian probability. These cratering formulas, being both analytic and probabilistic, have advantages over numerical or nonprobabilistic approaches in ease of calculation, clarity of interpretation, and evident flexibility. Specific indices developed include the fraction of a planet's surface expected to be cratered on N occasions over a given time interval, expected uncertainty in crater coverage, the depth and distribution of developed megaregolith, and the evolution in crater population. For instance, under current conditions, 15% of the Earth's surface should have been cratered one or more times in the past 3 billion years. In the median case, ejecta from these impacts would have blanketed the planet to a depth of 313 m. These indices, of course, depend upon asteroid flux and the minimum asteroid size imposed by atmospheric filtering. Analytical formulas make it simple to account for such variations by feeding in their history from current conditions to those present on the early Earth. If, as is thought, Earth's bolide flux rate has decreased by a factor of 10,000 from its formation until now, then 99.99% of the Earth's surface should have been cratered one or more times in the past 3 billion years. In the first 100 Ma of the early Earth, 90% of its surface would have suffered more than 50 impacts, even considering the protection of the atmosphere. Ramifications of these bombardment statistics pertain to the survivability of crustal fragments and early life forms.
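The Poissonian index described in the abstract (the chance that a point on the surface is cratered at least N times) can be sketched as follows. This is an illustrative reconstruction, not the paper's code; the mean-coverage value is simply back-computed from the quoted 15% figure.

```python
import math

def prob_cratered_at_least(n_hits, mean_coverage):
    # Poisson probability that a surface point is hit at least
    # n_hits times, given the expected number of hits per point.
    return 1.0 - sum(
        math.exp(-mean_coverage) * mean_coverage ** k / math.factorial(k)
        for k in range(n_hits)
    )

# Mean hits per point implied by "15% cratered one or more times
# in 3 Gyr" under current flux (back-computed, not from the paper).
lam_now = -math.log(1.0 - 0.15)
print(round(prob_cratered_at_least(1, lam_now), 2))  # 0.15
```

Because the indices are analytic, a time-varying bolide flux can be handled by integrating the expected coverage over the interval of interest before applying the same formula.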
- Published
- 2002
- Full Text
- View/download PDF
37. Crustal deformation at the Sumatran subduction zone revealed by coral rings
- Author
-
Kerry Sieh, Steven N. Ward, Danny H. Natawidjaja, and Bambang W. Suwargadi
- Subjects
geography ,geography.geographical_feature_category ,Subduction ,Coral ,Microatoll ,Atoll ,Coral reef ,Paleontology ,Geophysics ,Oceanic crust ,Trench ,General Earth and Planetary Sciences ,Science::Geology [DRNTU] ,Reef ,Geology ,Seismology - Abstract
Analyses of coral rings grown in the interval 1970–1997 reveal a geographically distinct pattern of interseismic uplift off Sumatra's western coast. At distances less than 110 km from the Sumatran trench, coral reefs are submerging as fast as 5 mm/y. At 130 and 180 km distance from the trench, they are emerging at similar rates. We suggest that a locked, or partially locked patch, located above 30 km depth on the upper surface of the subducting oceanic plate, generates this pattern.
- Published
- 1999
38. Correction [to 'Backarc thrust faulting and tectonic uplift along the Caribbean Sea Coast during the April 22, 1991 Costa Rica earthquake' by G. Plafker and S. N. Ward]
- Author
-
Steven N. Ward and George Plafker
- Subjects
Sea coast ,Geophysics ,Tectonic uplift ,Geochemistry and Petrology ,Thrust fault ,Geology ,Seismology - Published
- 1992
- Full Text
- View/download PDF
39. An application of synthetic seismicity in earthquake statistics: The Middle America Trench
- Author
-
Steven N. Ward
- Subjects
Atmospheric Science ,geography ,geography.geographical_feature_category ,Ecology ,Paleontology ,Soil Science ,Forestry ,Slip (materials science) ,Aquatic Science ,Fault (geology) ,Induced seismicity ,Oceanography ,Tectonics ,Geophysics ,Space and Planetary Science ,Geochemistry and Petrology ,Aperiodic graph ,Trench ,Earth and Planetary Sciences (miscellaneous) ,Predictability ,Seismology ,Geology ,Earth-Surface Processes ,Water Science and Technology ,Weibull distribution - Abstract
This paper shows how seismicity calculations based on the concept of fault segmentation, by incorporating the physics of faulting through static dislocation theory, can improve earthquake recurrence statistics and hone the probabilities of hazard. For the Middle America Trench, the spread parameters of the best-fitting lognormal or Weibull distributions (about 0.75) are much larger than the 0.21 intrinsic spread proposed in the Nishenko and Buland (1987) hypothesis. Stress interaction between fault segments disrupts time or slip predictability and causes earthquake recurrence to be far more aperiodic than has been suggested.
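The contrast between the fitted ~0.75 spread and the 0.21 intrinsic spread can be made concrete through the coefficient of variation (aperiodicity) of a Weibull recurrence-time distribution. The shape parameters below are illustrative choices that reproduce those two spreads, not values from the paper.

```python
import math

def weibull_cv(shape):
    # Coefficient of variation (aperiodicity) of a Weibull
    # recurrence-time distribution with the given shape parameter.
    g1 = math.gamma(1.0 + 1.0 / shape)
    g2 = math.gamma(1.0 + 2.0 / shape)
    return math.sqrt(g2 / g1 ** 2 - 1.0)

# A spread near 0.75 (strongly aperiodic, as fitted for the
# Middle America Trench) versus one near 0.21 (quasi-periodic).
print(round(weibull_cv(1.35), 2))  # ~0.75
print(round(weibull_cv(5.4), 2))   # ~0.21
```

A larger shape parameter concentrates recurrence times around the mean, so the 0.21 hypothesis corresponds to far more clock-like earthquake cycles than the trench data support.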
- Published
- 1992
- Full Text
- View/download PDF
40. The Loma Prieta Earthquake of October 17, 1989: Introduction to the special issue
- Author
-
Karen C. McNally and Steven N. Ward
- Subjects
Seismic gap ,Quake (natural phenomenon) ,geography ,geography.geographical_feature_category ,San andreas fault ,Slip (materials science) ,Fault (geology) ,Strike-slip tectonics ,Geophysics ,General Earth and Planetary Sciences ,2008 California earthquake study ,Hazard estimation ,Geology ,Seismology - Abstract
The southern Santa Cruz Mountains segment of the San Andreas fault that ruptured in the large (M_s = 7.1, M_w = 6.9) earthquake of October 17, 1989 (October 18, universal time) last slipped 84 years ago, when it formed the southern terminus of the great 1906 break which extended from Cape Mendocino to San Juan Bautista. Being the most significant reactivation of a major fault segment in California during the modern instrumental era, the Loma Prieta earthquake has justifiably attracted considerable attention. It is, therefore, appropriate to devote a Special Section of Geophysical Research Letters (GRL) to the preliminary scientific findings about the quake. The Loma Prieta event was especially challenging. Unlike the textbook illustration of a right lateral fault, slip in this earthquake was neither shallow nor uniform; indeed, rupture did not even reach the surface. More amazing still, a significant component of displacement on this supposed strike slip fault was vertical. As many pre-conceptions went up in smoke, many new questions materialized: How and where is strain energy stored along the fault? Is the San Andreas beneath the Santa Cruz Mountains complicated, with several intersecting strands which break differently at different times? How are lateral and vertical motion accommodated versus depth and time? To what extent can a broad spectrum of proposed fault zone heterogeneities and structures be imaged? Is the Loma Prieta failure characteristic for this section of the San Andreas, and should it be grouped with the 1865 and 1906 events for hazard estimation? Finally, just how predictable was the 1989 quake?
- Published
- 1990
- Full Text
- View/download PDF
41. Pacific-North America Plate motions: New results from very long baseline interferometry
- Author
-
Steven N. Ward
- Subjects
Atmospheric Science ,Ecology ,San andreas fault ,Pacific Plate ,Relative motion ,Paleontology ,Soil Science ,Forestry ,Aquatic Science ,Geodynamics ,Oceanography ,Geodesy ,Plate tectonics ,Geophysics ,Space and Planetary Science ,Geochemistry and Petrology ,Lithosphere ,Earthquake hazard ,Very-long-baseline interferometry ,Earth and Planetary Sciences (miscellaneous) ,Geology ,Seismology ,Earth-Surface Processes ,Water Science and Technology - Abstract
The state of Pacific-North America plate interaction is updated using the newest VLBI measurements and newly developed rigid plate tectonic models. Particular attention is given to examining the extent of relative motion between the Pacific plate and the North America plate as measured from their stable interiors, the evidence of Pacific plate deformation off the central California coast, and the distribution of path-integrated deformation east of the San Andreas fault. The information obtained on these questions is discussed in the framework of implications for lithospheric rheology and earthquake hazard.
- Published
- 1990
- Full Text
- View/download PDF
42. Relationships of tsunami generation and an earthquake source
- Author
-
Steven N. Ward
- Subjects
Point source ,Normal mode ,General Earth and Planetary Sciences ,Submarine ,Crust ,Slip (materials science) ,Geophysics ,Strike-slip tectonics ,Tsunami earthquake ,Geology ,Seabed ,Seismology ,Physics::Geophysics - Abstract
This paper presents a theory of tsunami generation and propagation on a spherically symmetric, self-gravitating, elastic Earth in terms of normal modes. We predict the character of newborn tsunamis at regional far field distances from a simply parameterized moment tensor point source with step function time history. Tsunami eigenfunctions are shown to penetrate the Earth from tens of kilometers at 2,000-second period to tens of meters at 20-second period. This behavior explains why the longest tsunami periods are the only ones influenced by crust and mantle structure and why they are preferentially excited by submarine earthquakes. We find that large tsunamis need sizable parent earthquakes because over 96% of their energy is concentrated in the ocean. This makes the entire solid Earth virtually a node for tsunami generation. The excitation of tsunami modes is strongly dependent upon the moment, mechanism and depth of faulting. Calculated tsunami energy, E_T, can vary by a factor of 100 for sources of equal moment within 30 kilometers of the sea floor. Maximum E_T for dip slip and strike slip faulting with moment M_0 = 10^20 N·m is 1.5×10^13 and 1.1×10^12 joules, respectively. With mechanism and depth fixed, this source model predicts that E_T is proportional to M_0^2. The ratio of tsunami to radiated seismic energy is less than 1% for all but the largest events. We believe that this theory coupled with a seismic source recovery technique could be a realistic basis for the forecasting of potentially dangerous tsunamis in real time.
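The E_T ∝ M_0^2 scaling stated in the abstract (mechanism and depth held fixed) can be sketched as a one-line scaling law. The reference values are the quoted dip-slip maximum; the function name is illustrative.

```python
def tsunami_energy(moment, e_ref=1.5e13, m_ref=1e20):
    # E_T scales as the square of seismic moment when mechanism and
    # depth are fixed, anchored here to the quoted maximum dip-slip
    # value of 1.5e13 J at M_0 = 1e20 N*m.
    return e_ref * (moment / m_ref) ** 2

print(tsunami_energy(1e20))  # 1.5e13 J at the reference moment
print(tsunami_energy(3e20))  # 9x the energy for 3x the moment
```

The quadratic dependence is why, per the abstract, large tsunamis require sizable parent earthquakes: halving the moment cuts the tsunami energy by a factor of four.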
- Published
- 1980
- Full Text
- View/download PDF
43. Long-period reflected and converted upper-mantle phases
- Author
-
Steven N. Ward
- Subjects
Geophysics ,Amplitude ,Geochemistry and Petrology ,Attenuation ,Transition zone ,Equations of motion ,Classification of discontinuities ,Seismogram ,Geology ,Longitudinal wave ,Seismology ,Coda - Abstract
Record sections of long-period seismograms from shallow-focus events at distances between 80° and 120° commonly display significant amounts of spatially coherent energy between the arrivals of P and PP. While individual quality varies, the majority of observations share two characteristics: coherent arrivals have apparent ray parameters and amplitude attenuation rates more comparable to P than PP. The ray-kinematic restraints imposed by these statements suggest wave interactions in the upper mantle near the source or receiver. The dynamics of these phases are investigated by construction of synthetic seismograms for radially symmetric media by contour integration in the complex ray-parameter plane. The generated records containing contributions from waves reflected or converted at upper mantle discontinuities are found to be compatible with observations. The largest of these coda phases on vertical records are near-source-shear converted compressional waves of the types sdP and sdpP. Conjugate near-receiver phases are of less importance than near-source phases on vertical records but they can be expected to dominate on the horizontal component. The effect of transition zone thickness on these phases is determined by direct integration of the equations of motion. For a 10-km-thick transition, reflected amplitudes become negligibly small for waves with periods less than 3 sec. Transmitted converted waves, on the other hand, have significant amplitudes to periods as small as 1 sec. In the long-period passband, peaked near 15 sec, the transition behaves as a sharp interface. These signals are found to be highly sensitive to localized upper mantle structure. Further observations of these phases will provide a direct means of assessing the physical and regional extent of the discontinuities throughout the Earth.
- Published
- 1978
- Full Text
- View/download PDF
44. On elastic wave calculations in a sphere using moment tensor sources
- Author
-
Steven N. Ward
- Subjects
Geophysics ,Classical mechanics ,Geochemistry and Petrology ,Simple (abstract algebra) ,Normal mode ,Displacement field ,Equations of motion ,Boundary value problem ,Expression (computer science) ,Representation (mathematics) ,Excitation ,Mathematics - Abstract
Summary. Propagator matrix solutions to the elastic equations of motion in spherically symmetric, inhomogeneous media with moment tensor sources are recast into a simple and intuitively satisfying form which is applicable to both exact and approximate calculations. The transformed expression benefits from the analogous equations of normal mode excitation, while clearly distinguishing the finer partitions of the displacement field and the more flexible boundary conditions that body wave formulations provide. I believe that this new representation, because of its many advantages, should be favoured as the foundation for elastic wave calculations in a sphere.
- Published
- 1981
- Full Text
- View/download PDF
45. Quasi-static propagator matrices: Creep on strike-slip faults
- Author
-
Steven N. Ward
- Subjects
geography ,geography.geographical_feature_category ,Extrapolation ,Magnitude (mathematics) ,Geometry ,Strain rate ,Fault (geology) ,Strike-slip tectonics ,Viscoelasticity ,Geophysics ,Fault model ,Seismology ,Geology ,Aftershock ,Earth-Surface Processes - Abstract
This paper presents a method for computing viscoelastic flow in a layered Earth by means of quasi-static propagator matrices. The method has advantages over approximate or purely numerical attacks in that exact, semi-analytical solutions are obtained. The procedure enables a more rapid calculation than is possible with finite elements, yet it does not sacrifice exactness as do analytical approximations. To illustrate the technique, I constructed a plausible model of the San Andreas fault and investigated the time and space behavior of displacement, displacement rate, shear-stress and shear-strain rate at depth as well as at the surface. Viscoelastic relaxation speeds the restressing of the fault. For events of magnitude 6.2 and 6.9, viscoelasticity reduces recharge time relative to the base strain rate by 15% and 50% respectively. For events of magnitude 6.2 and less, viscoelasticity has only a small influence and a linear extrapolation of stress accumulation will predict the time of recharge reasonably well. Relative plate velocities measured within 400 km of the fault are highly variable in space and time. Direct plate velocity measurements made as far as 100 km from a major fault could differ by a factor of two from the average rate. Features of the fault model at depth include: stresses and strain rates which exceed surface values by a factor of three; sign reversals in strain rates; and positive coseismic stress drops induced for limited periods in narrow thin zones. The latter feature could initiate and terminate aftershock sequences. Stress recharge does not occur simultaneously at all depths on the fault for all magnitude events. Recurrence times of earthquakes estimated from surface observations may thus be biased.
- Published
- 1985
- Full Text
- View/download PDF
46. On tsunami nucleation
- Author
-
Steven N. Ward
- Subjects
Strike and dip ,geography ,Focal mechanism ,geography.geographical_feature_category ,Physics and Astronomy (miscellaneous) ,Astronomy and Astrophysics ,Near and far field ,Fault (geology) ,Geodesy ,Line source ,Physics::Geophysics ,Geophysics ,Amplitude ,Space and Planetary Science ,Moment (physics) ,Wavenumber ,Geology ,Seismology - Abstract
Tsunami generation on a spherically-symmetric, self-gravitating and elastic Earth is investigated using normal mode summations and moment tensor line sources. Wave cancellation in a line source of length L substantially eliminates tsunamis whose horizontal wavenumber component along the fault exceeds 3π/L. This results in a preferential beaming of tsunami radiation perpendicular to the fault strike. Beaming from a source of fixed length occurs in the same fashion irrespective of slip mechanism. Tsunami radiation patterns from finite sources are thus more uniform than those from point sources. The largest far field sea waves from any long shallow fault are likely to be found within 20° azimuth of the perpendicular bisecting great circle. Although line sources subordinate distinctions in radiation patterns, focal mechanism is still a critical element in determining the extent of tsunami excitation. The ratio of maximum tsunami amplitude from strike and dip slips of equal moment are 1:10 for faults 400 km in length. Finite faulting weakens strike-slip tsunamis relative to dip-slip tsunamis because the beaming and natural radiation directions of strike slips are severely misaligned. Using an empirical relation linking source process time, fault length and moment, I predict tsunami energies in the range 10^10–10^17 J for earthquake moments spanning 2 × 10^18
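The wavenumber cutoff quoted in the abstract implies a minimum along-strike tsunami wavelength of 2L/3; a minimal sketch (function name illustrative):

```python
import math

def min_along_strike_wavelength(fault_length):
    # Cancellation removes tsunami components whose along-strike
    # wavenumber exceeds 3*pi/L, i.e. wavelengths shorter than 2L/3.
    k_max = 3.0 * math.pi / fault_length
    return 2.0 * math.pi / k_max  # simplifies to 2*L/3

# For the 400 km faults discussed in the abstract:
print(min_along_strike_wavelength(400e3) / 1e3)  # ~266.7 km
```

Only long-wavelength components survive along strike, which is why finite faults beam their tsunami energy perpendicular to the fault rather than parallel to it.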
- Published
- 1982
- Full Text
- View/download PDF
47. Earthquake mechanisms and tsunami generation: The Kurile Islands event of 13 October 1963
- Author
-
Steven N. Ward
- Subjects
geography ,geography.geographical_feature_category ,Fault (geology) ,Physics::Geophysics ,Moment (mathematics) ,Geophysics ,Amplitude ,Geochemistry and Petrology ,Normal mode ,Surface wave ,Tide gauge ,Tsunami earthquake ,Seismology ,Geology ,Line (formation) - Abstract
Investigations of tsunamigenesis on a spherically symmetric and elastic Earth using normal mode summations and moment tensor line sources are extended to include general double-couple mechanisms so that computed tsunami waveforms can be compared with specific observations. Amplitudes of 19 tide gauge records of the Kurile Islands sea wave of 13 October 1963 are reproduced within a factor of 2 by either a single line source 250 km in length or by a triplet of parallel line sources separated by 42 km. Both models predict a strong beaming of tsunami strength with wave heights toward the northwest and southeast three to eight times larger than those in the northeast and southwest. For faults over 150 km in length, 2 of the 6 moment tensor components dominate tsunami production. This fact motivated the coining of a new parameter, tsunami moment M_t. By cataloging tsunami fields produced by suites of fault mechanisms, I find that M_t is approximately proportional to the tsunamigenic potential of earthquakes. Despite the critical role of fault mechanism in tsunami generation, many sources have nearly equal tsunami moment and nearly equal ability for sea wave excitation. Estimates of shallow source strength from measurements of elastic surface waves have questionable integrity at both long (>100 sec) and short (<30 sec) periods. Amplitudes of high-frequency waves suffer from saturation due to source size and duration, while low-frequency amplitudes are unstable with respect to small perturbations in fault mechanism because of the influence of the Earth's surface. Many tsunami earthquakes may be artifacts of underestimated source strength resulting from the inefficiency of shallow, nearly vertical dip-slip faults to produce long-period surface waves.
- Published
- 1982
- Full Text
- View/download PDF
48. Fault parameters and slip distribution of the 1915 Avezzano, Italy, earthquake derived from geodetic observations
- Author
-
Gianluca Valensise and Steven N. Ward
- Subjects
Focal mechanism ,geography ,geography.geographical_feature_category ,Geodetic datum ,Slip (materials science) ,Fault (geology) ,Fault scarp ,Geodesy ,Tectonics ,Geophysics ,Geochemistry and Petrology ,Quaternary ,Geology ,Seismology ,Holocene - Abstract
This paper analyzes static surface displacements associated with the Avezzano, Italy, earthquake (M_s = 6.9) of 13 January 1915. The Avezzano event locates on a shallow normal fault centered in the Apennine mountains near a Quaternary tectonic depression named “Conca del Fucino.” The 1915 earthquake is the only sizable event to have occurred in the area for at least a millennium, although many Holocene and Quaternary faults can be recognized in the field. Awareness of the seismic potential of the Fucino region has heightened in recent years with the outward expansion of metropolitan Rome (80 km southwest). Because of a fortuitous pre-earthquake leveling survey near the fault in the mid-19th century, good quality geodetic data exist that illuminate details of the 1915 rupture. We modeled the faulting using both uniform (USP) and variable (VSP) slip planar dislocations. The best fitting focal mechanism includes pure dip slip on a plane striking 135° and plunging 63° to the southwest. This fault geometry is consistent with surface scarps and the broad scale tectonics of the region and is common to several large earthquakes of the south-central Apennines. The USP analysis estimates fault length, width, slip and moment to be 24 km, 15 km, 83 cm, and 9.7 × 10^18 N·m, respectively. Numerical simulations indicate that residuals left in the uniform slip model are not entirely random and represent systematically unmodeled features of fault slip. VSP models of the earthquake significantly reduce the USP variance and reveal a broad two-lobed slip pattern, separating a central region of low moment release. New formulations detailing the relationships between VSP and minimum model norm solutions to underdetermined inverse problems are presented as well as concise statements dealing with the intrinsic and combined resolving power of a geodetic network.
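As a rough consistency check on the quoted USP parameters, M_0 = μ·A·u with an assumed crustal rigidity of 3×10^10 Pa (a typical value, not taken from the paper) reproduces the quoted moment to within about 10%:

```python
def seismic_moment(length_m, width_m, slip_m, rigidity=3.0e10):
    # M_0 = rigidity * fault area * average slip. The rigidity is
    # an assumed typical crustal value, not from the paper.
    return rigidity * length_m * width_m * slip_m

# USP estimates from the abstract: 24 km x 15 km fault, 83 cm slip.
m0 = seismic_moment(24e3, 15e3, 0.83)
print(f"{m0:.2e}")  # ~9.0e18 N*m, vs. the quoted 9.7e18 N*m
```

The small remaining gap would close with a slightly stiffer rigidity (~3.2×10^10 Pa), well within the usual range for upper-crustal rock.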
- Published
- 1989
- Full Text
- View/download PDF
49. A note on lithospheric bending calculations
- Author
-
Steven N. Ward
- Subjects
Body force ,Gravitation ,Physics ,Geophysics ,Classical mechanics ,Geochemistry and Petrology ,Deflection (engineering) ,Equations of motion ,Mechanics ,Boundary value problem ,Bending of plates ,Neutral plane ,Plane stress - Abstract
Summary. This paper derives exact solutions to the equations of static plane strain by means of propagator matrices for homogeneous, gravitating and non-gravitating elastic media. These solutions are immediately verifiable and are flexible under a variety of boundary conditions. Propagator matrices are eminently suitable for computer encoding and, through their multiplication, are applicable to depth-dependent structures. Attention is focused upon the bending of floating plates which are loaded to simulate the deflection of oceanic lithosphere in the vicinity of trenches. By comparing responses computed with and without body forces, I find that gravity does not meaningfully change deflection profiles; however, it can influence important aspects of the internal stress state. Gravitational stresses are proportional to the gradient of the vertical deformation and amount to about 10 per cent of the bending stresses in these models. Propagators which include gravity are used to investigate the effect of regional horizontal stresses upon bending plates. I conclude that applied compressive forces can transport the neutral surface through 10-20 km of depth without significantly deforming the plate profile or increasing the maximum internal stress by more than 30 per cent. These calculations support the contention that variable compressive stresses resulting from interplate coupling could account for observed regional differences in neutral surface height. For elastic-plastic material, the fundamental equations of motion become non-linear; however, there appears to be no a priori objection to their linearization. I speculate that the propagator formalism, when applied in an iterative approach, could be a powerful method for computing the deformation of such media.
- Published
- 1984
- Full Text
- View/download PDF
50. A technique for the recovery of the seismic moment tensor applied to the Oaxaca, Mexico earthquake of November 1978
- Author
-
Steven N. Ward
- Subjects
Azimuth ,Total variation ,Geophysics ,Geochemistry and Petrology ,Seismic moment tensor ,Inversion (meteorology) ,Time domain ,Seismogram ,Geology ,Seismology ,Eigenvalues and eigenvectors ,Statistical hypothesis testing - Abstract
The potential for using seismograms in a linear inversion process to recover the seismic moment tensor has been theorized for some time. Unfortunately, practical application of the theory is difficult. Lateral variations of the Earth and imprecise knowledge of the source often severely contaminate real seismic data relative to synthetic models which must be supplied. This paper outlines an experiment which minimizes the contamination sufficiently to permit a successful inversion yet retains much of the data's sensitivity to source parameters. Routineness of technique was maintained in data selection and preparation by an unambiguous on-line windowing of P phases from the SRO network. Duplicate preparation of the theoretical P phases assured their synchronization with the observations. A time domain weighted least-squares inversion was implemented including double couple and nondouble couple constraints. Estimates of the total variance in addition to variances of the moment tensor eigenvalues and eigenvectors were returned, forming the basis for statistical tests. Seismic moments of 1.4 ± 0.4, 2.8 ± 0.3, and 2.7 ± 0.2 ×10^27 dyne·cm were found for the Oaxaca, Mexico earthquake of November 1978, using 10 P phases in the distance ranges 20 to 120°, 25 to 120°, and 40 to 120°, respectively. Corresponding azimuths and dips of the probable slip vector are 21.5°, 11.8°; 30.3°, 9.3°; and 32.1°, 9.0°. No evidence was found to support the hypothesis that unconstrained source models fit the data significantly better than a double couple. We believe an automated procedure such as this is a viable means for the routine recovery and cataloging of earthquake sources.
- Published
- 1980
- Full Text
- View/download PDF