211 results for "Kagan, Yan Y."
Search Results
2. Earthquake Number Forecasts Testing
- Author
-
Kagan, Yan Y.
- Subjects
Statistics - Applications - Abstract
We study the distributions of earthquake numbers in two global catalogs: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. These distributions are required to develop the number test for forecasts of future seismic activity rate. A common assumption is that the numbers are described by the Poisson distribution. In contrast to the one-parameter Poisson distribution, the negative-binomial distribution (NBD) has two parameters; the second parameter characterizes the clustering or over-dispersion of a process. We investigate the dependence of parameters for both distributions on the catalog magnitude threshold and on temporal subdivision of catalog duration. We find that for most cases of interest the Poisson distribution can be rejected statistically at a high significance level in favor of the NBD. We therefore investigate whether these distributions fit the observed distributions of seismicity. For this purpose we study the upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values of the skewness and the kurtosis increase for smaller magnitude thresholds, and increase even more strongly for finer temporal subdivisions of the catalogs. A calculation of the NBD skewness and kurtosis shows a rapid increase of these upper moments; however, the observed catalog values rise even faster. This means that for small time intervals the earthquake number distribution is even more heavy-tailed than the NBD predicts. Therefore, for small time intervals, we propose using empirical number distributions, appropriately smoothed, for testing forecasted earthquake numbers., Comment: 26 pages, 7 figures
- Published
- 2016
- Full Text
- View/download PDF
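A minimal sketch of the over-dispersion comparison described in the abstract above, assuming synthetic annual counts in place of the GCMT and PDE catalogs; the method-of-moments NBD fit, the likelihood-ratio comparison, and the moment checks are standard constructions, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for per-interval earthquake counts (over-dispersed on purpose).
counts = rng.negative_binomial(n=10, p=0.25, size=100)

m, v = counts.mean(), counts.var(ddof=1)
print(f"mean={m:.1f}, variance={v:.1f}")  # variance >> mean signals over-dispersion

# Method-of-moments NBD fit: mean = n(1-p)/p, variance = n(1-p)/p^2.
p_hat = m / v
n_hat = m * p_hat / (1.0 - p_hat)

ll_pois = stats.poisson.logpmf(counts, m).sum()
ll_nbd = stats.nbinom.logpmf(counts, n_hat, p_hat).sum()
# Likelihood-ratio statistic; the chi^2(1) reference is only approximate
# because the Poisson lies on the boundary of the NBD family.
lr = 2.0 * (ll_nbd - ll_pois)
print(f"LR = {lr:.1f}, approx p-value = {stats.chi2.sf(lr, df=1):.2e}")

# Theoretical skewness / excess kurtosis of both fitted laws vs the sample values.
for name, dist in [("Poisson", stats.poisson(m)), ("NBD", stats.nbinom(n_hat, p_hat))]:
    _, _, s, k = dist.stats(moments="mvsk")
    print(name, "skew", float(s), "excess kurtosis", float(k))
print("sample skew", stats.skew(counts), "sample excess kurtosis", stats.kurtosis(counts))
```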
3. Statistical earthquake focal mechanism forecasts
- Author
-
Kagan, Yan Y. and Jackson, David D.
- Subjects
Physics - Geophysics - Abstract
Forecasts of the focal mechanisms of future earthquakes are important for seismic hazard estimates and Coulomb stress and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude, and focal mechanism. In previous publications we reported forecasts of 0.5 degree spatial resolution, covering the latitude range from -75 to +75 degrees, based on the Global Centroid Moment Tensor earthquake catalog. In the new forecasts we have improved the spatial resolution to 0.1 degree and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each grid point. Simultaneously we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms, using the method of Kagan & Jackson proposed in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel, and where earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75 degrees, the azimuths of points 1000 km away may vary by about 35 degrees. We solved this problem by calculating focal mechanisms on a plane tangent to the earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30 and +30 degrees latitude, but outside that band uncorrected rotations can be significantly off., Comment: 17 figures
- Published
- 2013
- Full Text
- View/download PDF
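The distance-weighted combination described above can be sketched for a scalar attribute; real focal mechanisms must be averaged as rotations, so this only illustrates the weighting geometry. The 1/(d + 100 km) taper and the synthetic catalog are assumptions, not the kernel of the published forecast.

```python
import numpy as np

R_EARTH_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between points given in degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlat = p2 - p1
    dlon = np.radians(lon2 - lon1)
    a = np.sin(dlat / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlon / 2) ** 2
    return 2 * R_EARTH_KM * np.arcsin(np.sqrt(a))

rng = np.random.default_rng(1)
# Hypothetical catalog: epicenters plus one scalar attribute per event.
lats, lons = rng.uniform(30, 50, 200), rng.uniform(130, 160, 200)
attr = rng.uniform(0, 90, 200)   # stand-in for a mechanism-derived quantity

grid_lat, grid_lon = 40.0, 145.0
d = haversine_km(grid_lat, grid_lon, lats, lons)
near = d < 1000.0                      # the 1000 km radius from the abstract
w = 1.0 / (d[near] + 100.0)            # placeholder taper; 100 km scale assumed
w /= w.sum()
print(f"{near.sum()} events used, weighted attribute = {np.dot(w, attr[near]):.1f}")
```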
4. Double-Couple Earthquake Source: Symmetry and Rotation
- Author
-
Kagan, Yan Y.
- Subjects
Physics - Geophysics ,Mathematical Physics ,Statistics - Applications - Abstract
We consider the statistical analysis of double-couple (DC) earthquake focal mechanism orientation. The symmetry of a DC changes with its geometrical properties, and the number of 3-D rotations by which one DC source can be transformed into another depends on its symmetry. Four rotations exist in the general case of a DC with the nodal-plane ambiguity, two transformations if the fault plane is known, and one rotation if the sides of the fault plane are also known. The symmetry of rotated objects is extensively analyzed in statistical studies of material texture, and we apply their results to analyzing DC orientation. We consider theoretical probability distributions which can be used to approximate observational patterns of focal mechanisms. Uniform random rotation distributions for various DC sources are discussed, as well as two non-uniform distributions: the rotational Cauchy and the von Mises-Fisher. We discuss how the parameters of these rotations can be estimated by a statistical analysis of earthquake source properties in global seismicity. We also show how earthquake focal mechanism orientations can be displayed in the Rodrigues vector space., Comment: 40 pages, 14 figures, 1 table
- Published
- 2012
- Full Text
- View/download PDF
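A small sketch of the Rodrigues-space display mentioned above, assuming the standard construction of uniformly random rotations from normalized 4-D Gaussian quaternions (a textbook device, not taken from the paper itself).

```python
import numpy as np

rng = np.random.default_rng(2)

def random_unit_quaternions(n):
    """Uniform random rotations: normalized 4-D Gaussian vectors are
    uniform on the unit 3-sphere, hence uniform in rotation space."""
    q = rng.normal(size=(n, 4))
    q /= np.linalg.norm(q, axis=1, keepdims=True)
    # Fix the sign so the scalar part is positive (q and -q are the same rotation).
    q[q[:, 0] < 0] *= -1.0
    return q

q = random_unit_quaternions(1000)
# Rodrigues vector: (vector part)/(scalar part); its length equals tan(angle/2).
rodrigues = q[:, 1:] / q[:, :1]
angles = 2.0 * np.degrees(np.arctan(np.linalg.norm(rodrigues, axis=1)))
print("mean rotation angle (deg):", angles.mean())  # ~126.5 deg for uniform rotations
```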
5. Characteristic earthquake model, 1884 -- 2011, R.I.P
- Author
-
Kagan, Yan Y., Jackson, David D., and Geller, Robert J.
- Subjects
Physics - Geophysics ,Statistics - Applications - Abstract
Unfortunately, working scientists sometimes reflexively continue to use "buzz phrases" grounded in once prevalent paradigms that have been subsequently refuted. This can impede both earthquake research and hazard mitigation. Well-worn seismological buzz phrases include "earthquake cycle," "seismic cycle," "seismic gap," and "characteristic earthquake." They all assume that there are sequences of earthquakes that are nearly identical except for the times of their occurrence. If so, the complex process of earthquake occurrence could be reduced to a description of one "characteristic" earthquake plus the times of the others in the sequence. A common additional assumption is that characteristic earthquakes dominate the displacement on fault or plate boundary "segments." The "seismic gap" (or the effectively equivalent "seismic cycle") model depends entirely on the "characteristic" assumption, with the added assumption that characteristic earthquakes are quasi-periodic. However, since the 1990s numerous statistical tests have failed to support characteristic earthquake and seismic gap models, and the 2004 Sumatra earthquake and 2011 Tohoku earthquake both ripped through several supposed segment boundaries. Earthquake scientists should scrap ideas that have been rejected by objective testing or are too vague to be testable., Comment: 7 pages, 1 figure
- Published
- 2012
6. Long- and Short-Term Earthquake Forecasts during the Tohoku Sequence
- Author
-
Kagan, Yan Y. and Jackson, David D.
- Subjects
Physics - Geophysics - Abstract
We consider two issues related to the 2011 Tohoku mega-earthquake: (1) what is the repeat time for the largest earthquakes in this area, and (2) what are the possibilities of numerical short-term forecasts during the 2011 earthquake sequence in the Tohoku area. Starting in 1999 we have carried out long- and short-term forecasts for Japan and the surrounding areas using the GCMT catalog. The forecasts predict the earthquake rate per area, time, magnitude unit and earthquake focal mechanisms. Long-term forecasts indicate that the repeat time for the m9 earthquake in the Tohoku area is of the order of 350 years. We have archived several forecasts made before and after the Tohoku earthquake. The long-term rate estimates indicate that, as expected, the forecasted rate changed only by a few percent after the Tohoku earthquake, whereas due to the foreshocks, the short-term rate increased by a factor of more than 100 before the mainshock event as compared to the long-term rate. After the Tohoku mega-earthquake the rate increased by a factor of more than 1000. These results suggest that an operational earthquake forecasting strategy needs to be developed to take the increase of the short-term rates into account., Comment: 8 Figures, 2 Tables
- Published
- 2012
7. Tohoku earthquake: a surprise?
- Author
-
Kagan, Yan Y. and Jackson, David D.
- Subjects
Physics - Geophysics - Abstract
We consider three questions related to the 2011 Tohoku mega-earthquake: (1) Why was the event size so grossly under-estimated? (2) How should we evaluate the chances of giant earthquakes in subduction zones? and (3) What is the repeat time for magnitude 9 earthquakes off the Tohoku coast? The "maximum earthquake size" is often guessed from the available history of earthquakes, a method known for its significant downward bias. There are two quantitative methods for estimating the maximum magnitude in any region: a statistical analysis of the available earthquake record, and the moment conservation principle. However, for individual zones the statistical method is usually ineffective in estimating the maximum magnitude; only the lower limit can be evaluated. The moment conservation technique matches the tectonic deformation rate to that predicted by earthquakes. For subduction zones, the seismic or historical record is insufficient to constrain either the maximum or corner magnitude. However, the moment conservation principle yields consistent estimates: for all the subduction zones the maximum magnitude is of the order 9.0--9.7. Moreover, moment conservation indicates that variations in estimated maximum magnitude among subduction zones are not statistically significant. Another moment conservation method also suggests that magnitude 9 events are required to explain observed displacement rates. The global rate of magnitude 9 earthquakes in subduction zones, predicted from statistical analysis of seismicity as well as from moment conservation, is about five per century -- five actually happened., Comment: 44 pages, 9 figures, 1 table
- Published
- 2011
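The closing figure above (about five magnitude 9 events per century worldwide) implies simple Poisson arithmetic; the sketch below is our illustration under an assumed time-independent rate, not the paper's calculation, and the 10% single-zone share is a made-up example.

```python
import math

rate_per_year = 5.0 / 100.0           # ~5 magnitude-9 events per century, worldwide
for years in (10, 30, 100):
    p_at_least_one = 1.0 - math.exp(-rate_per_year * years)
    print(f"P(>=1 m9 event in {years} yr) = {p_at_least_one:.2f}")
# For a single zone carrying, say, 10% of the global rate (assumption),
# the century-scale probability drops accordingly:
print(f"single zone, 100 yr: {1.0 - math.exp(-0.1 * rate_per_year * 100):.2f}")
```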
8. Random stress and Omori's law
- Author
-
Kagan, Yan Y.
- Subjects
Statistics - Applications ,Physics - Geophysics - Abstract
We consider two statistical regularities that have been used to explain Omori's law of aftershock rate decay: the Levy and Inverse Gaussian (IGD) distributions. These distributions are thought to describe stress behavior influenced by various random factors: the post-earthquake stress time history is described by a Brownian motion. Both distributions decay to zero for time intervals close to zero, but this feature contradicts the high immediate aftershock level of Omori's law. We propose that these statistical distributions are influenced by the power-law stress distribution near the earthquake focal zone, and we derive new distributions as a mixture of the power-law stress distribution (with exponent psi) with the Levy and IGD distributions. These new distributions describe the resulting inter-earthquake time intervals and closely resemble Omori's law. The new Levy distribution has a pure power-law form with the exponent -(1+psi/2), and the mixed IGD has two exponents: the same as the Levy for small time intervals and -(1+psi) for longer times. For even longer time intervals this power-law behavior should be replaced by a uniform seismicity rate corresponding to the long-term tectonic deformation. We compute these background rates using our former analysis of the earthquake size distribution and its connection to plate tectonics. We analyze several earthquake catalogs to confirm and illustrate our theoretical results. Finally, we discuss how the parameters of random stress dynamics can be determined through a more detailed statistical analysis of earthquake occurrence or by new laboratory experiments.
- Published
- 2010
- Full Text
- View/download PDF
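The Levy branch of the result above (a pure power law with exponent -(1+psi/2)) can be checked by Monte Carlo: the first-passage time of a standard Brownian motion to a level s is distributed as (s/Z)^2 with Z standard normal, and s is drawn from a Pareto law with tail exponent psi. This construction and the value psi = 0.7 (any psi < 1 keeps the stress branch dominant) are our reading of the abstract, not the paper's derivation.

```python
import numpy as np

rng = np.random.default_rng(3)
psi = 0.7                       # assumed stress power-law tail exponent
n = 200_000

# Stress drawn from a Pareto law: P(S > s) = s^(-psi), s >= 1.
s = rng.uniform(size=n) ** (-1.0 / psi)
# Levy first-passage time of standard Brownian motion to level s: T = (s/Z)^2.
t = (s / rng.normal(size=n)) ** 2

# Hill estimator of the survival-tail exponent a in P(T > t) ~ t^(-a),
# using the top k order statistics; the density exponent is then -(1+a).
k = 2000
top = np.sort(t)[-k:]
a_hat = 1.0 / np.mean(np.log(top[1:] / top[0]))
print(f"tail exponent ~ {a_hat:.2f}, predicted psi/2 = {psi / 2:.2f}")
```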
9. Statistical Distributions of Earthquake Numbers: Consequence of Branching Process
- Author
-
Kagan, Yan Y.
- Subjects
Physics - Geophysics ,Physics - Data Analysis, Statistics and Probability - Abstract
We discuss various statistical distributions of earthquake numbers. Previously we derived several discrete distributions to describe earthquake numbers for the branching model of earthquake occurrence: these distributions are the Poisson, geometric, logarithmic, and the negative binomial (NBD). The theoretical model is the `birth and immigration' population process. The first three distributions above can be considered special cases of the NBD. In particular, a point branching process along the magnitude (or log seismic moment) axis with independent events (immigrants) explains the magnitude/moment-frequency relation and the NBD of earthquake counts in large time/space windows, as well as the dependence of the NBD parameters on the magnitude threshold (magnitude of an earthquake catalog completeness). We discuss applying these distributions, especially the NBD, to approximate event numbers in earthquake catalogs. There are many different representations of the NBD. Most can be traced either to the Pascal distribution or to the mixture of the Poisson distribution with the gamma law. We discuss advantages and drawbacks of both representations for statistical analysis of earthquake catalogs. We also consider applying the NBD to earthquake forecasts and describe the limits of the application for the given equations. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrence, the NBD has two parameters. The second parameter can be used to characterize clustering or over-dispersion of a process. We determine the parameter values and their uncertainties for several local and global catalogs, and their subdivisions in various time intervals, magnitude thresholds, spatial windows, and tectonic categories., Comment: 50 pages,15 Figs, 3 Tables
- Published
- 2009
- Full Text
- View/download PDF
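The two NBD representations mentioned above, the Pascal form and the Poisson-gamma mixture, can be shown numerically equivalent; the parameter values below are arbitrary, and scipy's nbinom parametrization is taken as the Pascal form.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n_par, p_par = 4, 0.3            # arbitrary NBD parameters
size = 100_000

# Poisson-gamma mixture: lambda ~ Gamma(shape=n, scale=(1-p)/p), N ~ Poisson(lambda).
lam = rng.gamma(shape=n_par, scale=(1 - p_par) / p_par, size=size)
mixed = rng.poisson(lam)

# Direct Pascal/NBD sampling with the same parameters.
direct = stats.nbinom.rvs(n_par, p_par, size=size, random_state=rng)

# Empirical frequencies of both samples should match the analytic pmf.
for k in range(5):
    print(k, (mixed == k).mean(), (direct == k).mean(),
          stats.nbinom.pmf(k, n_par, p_par))
```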
10. Earthquake Size Distribution: Power-Law with Exponent Beta = 1/2?
- Author
-
Kagan, Yan Y.
- Subjects
Physics - Geophysics ,Physics - Data Analysis, Statistics and Probability - Abstract
We propose that the widely observed and universal Gutenberg-Richter relation is a mathematical consequence of the critical branching nature of earthquake process in a brittle fracture environment. These arguments, though preliminary, are confirmed by recent investigations of the seismic moment distribution in global earthquake catalogs and by the results on the distribution in crystals of dislocation avalanche sizes. We consider possible systematic and random errors in determining earthquake size, especially its seismic moment. These effects increase the estimate of the parameter beta of the power-law distribution of earthquake sizes. In particular, we find that estimated beta-values may be inflated by 1-3% because relative moment uncertainties decrease with increasing earthquake size. Moreover, earthquake clustering greatly influences the beta-parameter. If clusters (aftershock sequences) are taken as the entity to be studied, then the exponent value for their size distribution would decrease by 5-10%. The complexity of any earthquake source also inflates the estimated beta-value by at least 3-7%. The centroid depth distribution also should influence the beta-value; an approximate calculation suggests that the exponent value may be increased by 2-6%. Taking all these effects into account, we propose that the recently obtained beta-value of 0.63 could be reduced to about 0.52--0.56: near the universal constant value (1/2) predicted by theoretical arguments. We also consider possible consequences of the universal beta-value and its relevance for theoretical and practical understanding of earthquake occurrence in various tectonic and Earth structure environments. Using comparative crystal deformation results may help us understand the generation of seismic tremors and slow earthquakes and illuminate the transition from brittle fracture to plastic flow., Comment: 53 pages, 2 tables, 12 figures
- Published
- 2009
- Full Text
- View/download PDF
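A sketch of the exponent estimation underlying the abstract above, assuming a pure Pareto moment distribution above a completeness threshold; the maximum-likelihood (Hill-type) estimator is standard and omits the bias corrections the paper is actually about.

```python
import numpy as np

rng = np.random.default_rng(5)
beta_true = 0.5                 # the conjectured universal value
m0 = 1e17                       # assumed completeness threshold, N m

# Pareto-distributed scalar seismic moments: P(M > m) = (m/m0)^(-beta).
moments = m0 * rng.uniform(size=5000) ** (-1.0 / beta_true)

# Maximum-likelihood estimate and its standard error beta/sqrt(n).
n = moments.size
beta_hat = n / np.log(moments / m0).sum()
print(f"beta_hat = {beta_hat:.3f} +/- {beta_hat / np.sqrt(n):.3f}")
```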
11. Characteristic Earthquakes and Seismic Gaps
- Author
-
Jackson, David D., primary and Kagan, Yan Y., additional
- Published
- 2021
- Full Text
- View/download PDF
12. Testing long-term earthquake forecasts: likelihood methods and error diagrams
- Author
-
Kagan, Yan Y.
- Subjects
Physics - Data Analysis, Statistics and Probability ,Physics - Geophysics - Abstract
We propose a new method to test the effectiveness of a spatial point process forecast based on a log-likelihood score for predicted point density and the information gain for events that actually occurred in the test period. The method largely avoids simulation use and allows us to calculate the information score for each event or set of events as well as the standard error of each forecast. As the number of predicted events increases, the score distribution approaches the Gaussian law. The degree of its similarity to the Gaussian distribution can be measured by the computed coefficients of skewness and kurtosis. To display the forecasted point density and the point events, we use an event concentration diagram or a variant of the Error Diagram (ED). We demonstrate the application of the method by using our long-term forecast of seismicity in two western Pacific regions. We compare the ED for these regions with simplified diagrams based on two-segment approximations. Since the earthquakes in these regions are concentrated in narrow subduction belts, using the forecast density as a template or baseline for the ED is a more convenient display technique. We also show, using simulated event occurrence, that some proposed criteria for measuring forecast effectiveness at EDs would be strongly biased for a small event number., Comment: 31 pages text, 3 tables, 10 figures
- Published
- 2008
- Full Text
- View/download PDF
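A minimal version of the per-event information score described above, on a synthetic forecast grid; scoring in bits against a uniform reference model is a standard construction and stands in for the paper's full procedure.

```python
import numpy as np

rng = np.random.default_rng(6)
ncells = 1000

# Normalized forecast: expected fraction of events per spatial cell.
forecast = rng.gamma(0.3, size=ncells)
forecast /= forecast.sum()

# Synthetic "observed" events, drawn here from the forecast itself.
events = rng.choice(ncells, size=50, p=forecast)

# Information gain per event, in bits, relative to a uniform density.
gains = np.log2(forecast[events] * ncells)
print(f"mean information score = {gains.mean():.2f} bits/event, "
      f"std error = {gains.std(ddof=1) / np.sqrt(events.size):.2f}")
```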
13. On geometric complexity of earthquake focal zone and fault system: A statistical study
- Author
-
Kagan, Yan Y.
- Subjects
Physics - Geophysics ,Physics - Data Analysis, Statistics and Probability - Abstract
We discuss various methods used to investigate the geometric complexity of earthquakes and earthquake faults, based both on a point-source representation and the study of interrelations between earthquake focal mechanisms. We briefly review the seismic moment tensor formalism and discuss in some detail the representation of double-couple (DC) earthquake sources by normalized quaternions. Non-DC earthquake sources like the CLVD focal mechanism are also considered. We obtain the characterization of the earthquake complex source caused by summation of disoriented DC sources. We show that commonly defined geometrical fault barriers correspond to the sources without any CLVD component. We analyze the CMT global earthquake catalog to examine whether the focal mechanism distribution suggests that the CLVD component is likely to be zero in tectonic earthquakes. Although some indications support this conjecture, we need more extensive and significantly more accurate data to answer this question fully., Comment: 53 pages text, 12 figures
- Published
- 2008
- Full Text
- View/download PDF
14. Earthquake size distribution: power-law with exponent beta=1/2 ?
- Author
-
Kagan, Yan Y
- Subjects
Gutenberg-Richter relation ,Corner moment ,Tapered Pareto distribution ,Scalar and tensor seismic moment ,Universality of earthquake size distribution ,3-D random rotation ,Earthquake depth distribution ,Seismic tremors ,Transition from brittle to plastic deformation - Abstract
We propose that the widely observed and universal Gutenberg-Richter relation is a mathematical consequence of the critical branching nature of earthquake process in a brittle fracture environment. These arguments, though preliminary, are confirmed by recent investigations of the seismic moment distribution in global earthquake catalogs and by the results on the distribution in crystals of dislocation avalanche sizes. We consider possible systematic and random errors in determining earthquake size, especially its seismic moment. These effects increase the estimate of the parameter beta of the power-law distribution of earthquake sizes. In particular, we find that estimated beta-values may be inflated by 1-3% because relative moment uncertainties decrease with increasing earthquake size. Moreover, earthquake clustering greatly influences the beta-parameter. If clusters (aftershock sequences) are taken as the entity to be studied, then the exponent value for their size distribution would decrease by 5-10%. The complexity of any earthquake source also inflates the estimated beta-value by at least 3-7%. The centroid depth distribution also should influence the beta-value; an approximate calculation suggests that the exponent value may be increased by 2-6%. Taking all these effects into account, we propose that the recently obtained beta-value of 0.63 could be reduced to about 0.52--0.56: near the universal constant value (1/2) predicted by theoretical arguments. We also consider possible consequences of the universal beta-value and its relevance for theoretical and practical understanding of earthquake occurrence in various tectonic and Earth structure environments. Using comparative crystal deformation results may help us understand the generation of seismic tremors and slow earthquakes and illuminate the transition from brittle fracture to plastic flow.
- Published
- 2010
15. Statistical distributions of earthquake numbers: consequence of branching process
- Author
-
Kagan, Yan Y
- Subjects
Probability distributions ,Statistical seismology ,Theoretical seismology ,Earthquake interaction ,forecasting ,and prediction ,North America ,Probabilistic forecasting - Abstract
We discuss various statistical distributions of earthquake numbers. Previously we derived several discrete distributions to describe earthquake numbers for the branching model of earthquake occurrence: these distributions are the Poisson, geometric, logarithmic, and the negative binomial (NBD). The theoretical model is the `birth and immigration' population process. The first three distributions above can be considered special cases of the NBD. In particular, a point branching process along the magnitude (or log seismic moment) axis with independent events (immigrants) explains the magnitude/moment-frequency relation and the NBD of earthquake counts in large time/space windows, as well as the dependence of the NBD parameters on the magnitude threshold (magnitude of an earthquake catalogue completeness). We discuss applying these distributions, especially the NBD, to approximate event numbers in earthquake catalogues. There are many different representations of the NBD. Most can be traced either to the Pascal distribution or to the mixture of the Poisson distribution with the gamma law. We discuss advantages and drawbacks of both representations for statistical analysis of earthquake catalogues. We also consider applying the NBD to earthquake forecasts and describe the limits of the application for the given equations. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrence, the NBD has two parameters. The second parameter can be used to characterize clustering or over-dispersion of a process. We determine the parameter values and their uncertainties for several local and global catalogues, and their subdivisions in various time intervals, magnitude thresholds, spatial windows, and tectonic categories. The theoretical model of how the clustering parameter depends on the corner (maximum) magnitude can be used to predict future earthquake number distribution in regions where very large earthquakes have not yet occurred.
- Published
- 2010
16. Testing long-term earthquake forecasts: likelihood methods and error diagrams
- Author
-
Kagan, Yan Y
- Subjects
Probabilistic forecasting ,Spatial analysis ,Fractals and multifractals ,Probability distributions ,Earthquake interaction ,forecasting ,and prediction ,Statistical seismology - Abstract
We propose a new method to test the performance of a spatial point process forecast based on a log-likelihood score for predicted point density and the information gain for events that actually occurred in the test period. The method largely avoids simulation use and allows us to calculate the information score for each event or set of events as well as the standard error of each forecast. As the number of predicted events increases, the score distribution approaches the Gaussian law. The degree of its similarity to the Gaussian distribution can be measured by the computed coefficients of skewness and kurtosis. To display the forecasted point density and the point events, we use an event concentration diagram or a variant of the Error Diagram (ED). We find a forward relation between the error diagram curve and the information score, as well as an inverse relation for one simple model of point spatial fields. We again show that the error diagram is more informative than the likelihood ratio. We demonstrate the application of the method by using our long-term forecast of seismicity in two western Pacific regions. We compare the ED for these regions with simplified diagrams based on two-segment approximations. Since the earthquakes in these regions are concentrated in narrow subduction belts, using the forecast density as a template or baseline for the ED is a more convenient display technique. We also show, using simulated event occurrence, that some proposed criteria for measuring forecast effectiveness at EDs would be strongly biased for a small event number.
- Published
- 2009
17. On the geometric complexity of earthquake focal zone and fault systems: A statistical study
- Author
-
Kagan, Yan Y
- Subjects
Earthquake focal mechanism ,double couple ,CLVD ,quaternion ,geometric barriers ,statistical analysis - Abstract
We discuss various methods used to investigate the geometric complexity of earthquakes and earthquake faults, based both on a point-source representation and the study of interrelations between earthquake focal mechanisms. We briefly review the seismic moment tensor formalism and discuss in some detail the representation of double-couple (DC) earthquake sources by normalized quaternions. Non-DC earthquake sources like the CLVD focal mechanism are also considered. We obtain the characterization of the earthquake complex source caused by summation of disoriented DC sources. We show that commonly defined geometrical fault barriers correspond to sources without any CLVD component. We analyze the CMT global earthquake catalog to examine whether the focal mechanism distribution suggests that the CLVD component is likely to be zero in tectonic earthquakes. Although some indications support this conjecture, we need more extensive and significantly more accurate data to answer this question fully.
- Published
- 2009
18. Simplified algorithms for calculating double-couple rotation
- Author
-
Kagan, Yan Y.
- Subjects
earthquake-source mechanism ,fault-plane solutions ,inverse problem ,seismic moment ,seismotectonics ,statistical methods - Abstract
We derive new, simplified formulae for evaluating the 3-D angle of earthquake double-couple (DC) rotation. The complexity of the derived equations depends on both the accuracy requirements for the angle evaluation and the completeness of the desired solutions. The solutions are simpler than those of my previously proposed algorithm, designed in 1991 and based on the quaternion representation. We discuss advantages and disadvantages of both approaches. These new expressions can be written in a few lines of computer code and used to compare both DC solutions obtained by different methods and variations of earthquake focal mechanisms in space and time.
- Published
- 2007
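The paper's simplified formulae are not reproduced here; instead, a sketch of the quaternion baseline it is compared against: the minimal rotation angle between two DC sources is the smallest angle over the four symmetry-equivalent relative rotations (identity plus 180-degree flips about the source axes). The implementation below is our own, using only that standard symmetry argument.

```python
import numpy as np

def qmult(a, b):
    """Hamilton product of quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def dc_rotation_angle(q1, q2):
    """Minimal rotation angle (degrees) between two DC orientations given as
    unit quaternions, accounting for the DC's four-fold symmetry."""
    conj = q1 * np.array([1.0, -1.0, -1.0, -1.0])   # q1^(-1) for a unit quaternion
    rel = qmult(conj, q2)
    # Symmetry group: identity and 180-degree rotations about the source axes.
    flips = [np.array([1.0, 0, 0, 0]), np.array([0, 1.0, 0, 0]),
             np.array([0, 0, 1.0, 0]), np.array([0, 0, 0, 1.0])]
    angles = [2 * np.degrees(np.arccos(min(1.0, abs(qmult(rel, s)[0]))))
              for s in flips]
    return min(angles)

# Example: two orientations differing by a 30-degree rotation about the z axis.
th = np.radians(30.0) / 2
q_a = np.array([1.0, 0.0, 0.0, 0.0])
q_b = np.array([np.cos(th), 0.0, 0.0, np.sin(th)])
print(dc_rotation_angle(q_a, q_b))   # -> 30.0
```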
19. On earthquake predictability measurement: Information score and error diagram
- Author
-
Kagan, Yan Y.
- Subjects
earthquake prediction ,statistical methods ,seismicity ,renewal processes ,information score ,error diagram - Abstract
We discuss two methods for measuring the effectiveness of earthquake prediction algorithms: The information score based on the likelihood ratio and error diagrams. For both of these methods, closed form expressions are obtained for the renewal process based on the gamma and lognormal distributions. The error diagram is more informative than the likelihood ratio and uniquely specifies the information score. We derive an expression connecting the information score and error diagrams. We then obtain the estimate of the region bounds in the error diagram for any value of the information score. We discuss how these preliminary results can be extended for more realistic models of earthquake occurrence.
- Published
- 2007
20. Earthquake spatial distribution: the correlation dimension
- Author
-
Kagan, Yan Y
- Subjects
earthquake location ,fractals ,geostatistics ,seismicity ,statistical methods ,synthetic-earthquake catalogues - Abstract
We review methods for determining the fractal dimensions of earthquake epicentres and hypocentres, paying special attention to the problem of errors, biases and systematic effects. Among effects considered are earthquake location errors, boundary effects, inhomogeneity of depth distribution and temporal dependence. In particular, the correlation dimension of earthquake spatial distribution is discussed, techniques for its evaluation presented, and results for several earthquake catalogues are analysed. We show that practically any value for the correlation dimension can be obtained if many errors and inhomogeneities in observational data as well as deficiencies in data processing are not properly considered. It is likely that such technical difficulties are intensified when one attempts to evaluate multifractal measures of dimension. Taking into account possible errors and biases, we conclude that the fractal dimension for shallow seismicity asymptotically approaches 2.20 +/- 0.05 for a catalogue time span of decades and perhaps centuries. The value of the correlation dimension declines to 1.8-1.9 for intermediate events (depth interval 71-280 km) and to 1.5-1.6 for deeper ones.
- Published
- 2007
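The correlation dimension discussed above is usually estimated from the correlation integral (the Grassberger-Procaccia approach); the sketch below applies it to synthetic planar epicenters (true dimension 2) and ignores the location-error and boundary corrections that the paper emphasizes.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic epicenters on a plane (true correlation dimension = 2).
pts = rng.uniform(0.0, 100.0, size=(1500, 2))   # km

# Correlation integral C(r): fraction of point pairs closer than r.
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
pair_d = d[np.triu_indices_from(d, k=1)]
radii = np.logspace(-0.5, 1.5, 12)               # ~0.3 to ~30 km
c = np.array([(pair_d < r).mean() for r in radii])

# The correlation dimension is the log-log slope of C(r).
slope, _ = np.polyfit(np.log(radii), np.log(c), 1)
print(f"estimated correlation dimension: {slope:.2f}")  # ~2 expected
```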
21. Characteristic Earthquakes and Seismic Gaps
- Author
-
Jackson, David D., primary and Kagan, Yan Y., additional
- Published
- 2019
- Full Text
- View/download PDF
22. Regression problems for magnitudes
- Author
-
Castellaro, Silvia, Mulargia, Francesco, and Kagan, Yan Y
- Subjects
magnitude conversion ,orthogonal regression ,seismology ,statistical methods - Abstract
Least-squares linear regression is so popular that it is sometimes applied without checking whether its basic requirements are satisfied. In particular, in studying earthquake phenomena, the conditions (a) that the uncertainty on the independent variable is at least one order of magnitude smaller than the one on the dependent variable, (b) that both data and uncertainties are normally distributed, and (c) that the residuals are constant, are at times disregarded. This may easily lead to wrong results. As an alternative to least squares, when the ratio between errors on the independent and the dependent variable can be estimated, orthogonal regression can be applied. We test the performance of orthogonal regression in its general form against Gaussian and non-Gaussian data and error distributions and compare it with standard least-squares regression. General orthogonal regression is found to be superior or equal to the standard least squares in all the cases investigated and its use is recommended. We also compare the performance of orthogonal regression versus standard regression when, as often happens in the literature, the ratio between errors on the independent and the dependent variables cannot be estimated and is arbitrarily set to 1. We apply these results to magnitude scale conversion, which is a common problem in seismology, with important implications in seismic hazard evaluation, and analyse it through specific tests. Our analysis concludes that the commonly used standard regression may induce systematic errors in magnitude conversion as high as 0.3-0.4, and, even more importantly, this can introduce apparent catalogue incompleteness, as well as a heavy bias in estimates of the slope of the frequency-magnitude distributions. All this can be avoided by using the general orthogonal regression in magnitude conversions.
- Published
- 2006
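For a known error-variance ratio, the general orthogonal (Deming) regression slope has a closed form; the sketch below compares it with ordinary least squares on synthetic magnitudes. The conversion coefficients and error levels are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 500
true_slope, true_icpt = 0.85, 0.9         # hypothetical magnitude conversion

m_true = rng.uniform(4.0, 7.0, n)
sx, sy = 0.2, 0.2                         # assumed error levels on both scales
x = m_true + rng.normal(0, sx, n)         # e.g. a body-wave magnitude, with error
y = true_icpt + true_slope * m_true + rng.normal(0, sy, n)

# Ordinary least squares ignores the error on x, biasing the slope toward zero.
b_ols = np.polyfit(x, y, 1)[0]

# General orthogonal (Deming) regression with error-variance ratio eta.
eta = sy**2 / sx**2
sxx, syy = x.var(ddof=1), y.var(ddof=1)
sxy = np.cov(x, y, ddof=1)[0, 1]
b_orth = (syy - eta * sxx
          + np.sqrt((syy - eta * sxx) ** 2 + 4 * eta * sxy**2)) / (2 * sxy)

print(f"true slope {true_slope}, OLS {b_ols:.3f}, orthogonal {b_orth:.3f}")
```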
23. Comment on 'Testing earthquake prediction methods: "The West Pacific short-term forecast of earthquakes with magnitude MwHRV >= 5.8"' by V. G. Kossobokov
- Author
-
Kagan, Yan Y and Jackson, David D
- Abstract
In his paper Kossobokov investigates the efficiency of our short-term forecast for two western Pacific regions. Although we agree with the basic results of his evaluation, namely that the forecast statistics are much better than a random guess, we have reservations about his definition of earthquake prediction, some of his tests, and his interpretation of the test results. We distinguish between deterministic earthquake predictions and statistical forecasts. We argue that some techniques used by Kossobokov may not be appropriate for testing our forecasts and discuss other testing methods based on the likelihood function. We demonstrate that Kossobokov's null hypothesis may be biased, and this bias can influence some of his conclusions. We show that, contrary to Kossobokov's statement, our algorithm predicts mainshocks when they are preceded by foreshocks.
- Published
- 2006
24. Why does theoretical physics fail to explain and predict earthquake occurrence?
- Author
-
Kagan, Yan Y
- Subjects
Earthquakes ,seismicity ,earthquake size ,time ,space and focal mechanisms ,scale-invariance ,stochastic point process ,statistical analysis ,quaternions ,earthquake forecasting - Abstract
Several reasons for the failure can be proposed: 1. The multidimensional character of seismicity: time, space, and earthquake focal mechanism need to be modeled. The latter is a symmetric second-rank tensor of a special kind. 2. The intrinsic randomness of earthquake occurrence, necessitating the use of stochastic point processes and appropriate complex statistical techniques. 3. The scale-invariant or fractal properties of earthquake processes; the theory of random stable or heavy-tailed variables is significantly more difficult than that of Gaussian variables and is only now being developed. Earthquake process theory should be capable of being renormalized. 4. Statistical distributions of earthquake sizes, earthquake temporal interactions, spatial patterns and focal mechanisms are largely universal. The values of major parameters are similar for earthquakes in various tectonic zones. The universality of these distributions will enable a better foundation for earthquake process theory. 5. The quality of current earthquake data statistical analysis is low. Since little or no study of random and systematic errors is performed, most published statistical results are artifacts. 6. During earthquake rupture propagation, focal mechanisms sometimes undergo large 3-D rotations. These rotations require non-commutative algebra (e.g., quaternions and gauge theory) for accurate models of earthquake occurrence. 7. These phenomenological and theoretical difficulties are not limited to earthquakes: any fracture of brittle materials, tensile or shear, would encounter similar problems.
- Published
- 2006
25. Relation between mainshock rupture process and Omori's law for aftershock moment release rate
- Author
-
Kagan, Yan Y and Houston, H
- Subjects
aftershocks ,rupture propagation ,seismic coda ,seismic-event rates ,seismic moment ,statistical methods - Abstract
We compare the source time functions (i.e., moment release rates) of three large California mainshocks with the seismic moment release rates during their aftershock sequences. Aftershock moment release rates, computed by summing aftershock moments in time intervals, follow a power-law time dependence similar to Omori's law from minutes to months after the mainshock; furthermore, in contrast to the previously observed saturation in numbers of aftershocks shortly after the mainshock rupture, no such saturation is seen in the aftershock moment release rates, which are dominated by the largest aftershocks. We argue that the observed saturation in aftershock numbers described by the 'time offset' parameter c in Omori's law is likely an artefact due to the underreporting of small aftershocks, which is related to the difficulty of detecting large numbers of small aftershocks in the mainshock coda. We further propose that it is more natural for c to be negative (i.e. singularity follows the onset of mainshock rupture) than positive (singularity precedes onset of rupture). To make a more general comparison of mainshock rupture process and aftershock moment rates, we then scale mainshock time functions to equalize the effects of the varied seismic moments. For the three California mainshocks, we compare the scaled time functions with similarly scaled aftershock moment rates. Finally, we compare global averages of scaled time functions of many shallow events to the average scaled aftershock moment release rate for six California mainshocks. In each of these comparisons, the extrapolation, using Omori's law, of the aftershock moment rates back in time to the onset of the mainshock rupture indicates that the temporal intensity of the aftershock moment release is about 1.5 orders of magnitude less than the maximum reached by the mainshock rupture. This may be due to the differing amplitudes and relative importance of static and dynamic stresses in aftershock initiation compared to mainshock rupture propagation.
- Published
- 2005
26. Double-couple earthquake focal mechanism: random rotation and display
- Author
-
Kagan, Yan Y
- Subjects
earthquake-source mechanism ,fault-plane solutions ,seismic moment ,seismotectonics ,statistical methods - Abstract
This paper addresses two problems: the random rotation of double-couple (DC) earthquake sources and the display of earthquake focal mechanisms. We consider several equivalent representations for DC sources and their properties and provide mathematical expressions for their mutual transformation. Obviously, a 3-D rotation of any object is more intricate than a 2-D rotation. Any rotation of a DC source is further complicated by its symmetry properties. Applying statistical tests to a DC distribution often requires one to compare it to a completely (or uniform) random DC pattern. We review several methods for obtaining random distribution of DC orientation; some of these seemingly natural techniques yield an incorrect result. The DC random rotation problem is closely connected to displays of focal mechanisms. In such displays, a strike or an azimuth of a focal mechanism can be neglected; hence, we are confronted with mapping a 2-D distribution onto a flat surface. We review different methods for such displays and discuss more specifically how to project a random focal mechanism distribution on a flat 2-D display with uniform probability density. Such displays can be used to analyse earthquake patterns statistically in various tectonic regions.
- Published
- 2005
27. Stress and earthquakes in southern California, 1850-2004
- Author
-
Kagan, Yan Y, Jackson, D D, and Liu, Z
- Abstract
We compute the stress tensor in the upper crust of southern California as a function of time and compare observed seismicity with the estimated stress at the time of each earthquake. Several recent developments make it possible to do this much more realistically than before: (1) a wealth of new geodetic and geologic data for southern California and (2) a catalog of moment tensors for all earthquakes with magnitudes larger than 6 since 1850 and larger than 5 since 1910. We model crustal deformation using both updated geodetic data and geologically determined fault slip rates. We subdivide the crust into elastic blocks, delineated by faults which move freely at a constant rate below a locking depth, with a rate determined by the relative block motion. We compute normal and shear stresses on nodal planes for each earthquake in the catalog. We consider stress increments from previous earthquakes ("seismic stress") and aseismic tectonic stress, both separately and in combination. The locations and mechanisms of earthquakes are best correlated with the aseismic shear stress. Including the cumulative coseismic effects from past earthquakes does not significantly improve the correlation. Correlations between normal stress and earthquakes are always very sensitive to the start date of the catalog, whether we exclude earthquakes very close to others, and whether we evaluate stress at the hypocenter or throughout the rupture surface of an earthquake. Although the correlation of tectonic stress with earthquake triggering is robust, other results are unstable, apparently because the catalog has so few earthquakes.
- Published
- 2005
28. Earthquake slip distribution: A statistical model
- Author
-
Kagan, Yan Y
- Abstract
The purpose of this paper is to interpret slip statistics in a framework of extended earthquake sources. We first discuss the deformation pattern of the Earth's surface from earthquakes and suggest that the continuum versus block motion controversy can be reconciled by a model of the fractal distribution of seismic sources. We consider earthquake slip statistical distributions as they can be inferred from seismic moment-frequency relations and geometrical scaling for earthquakes. Using various assumptions on temporal earthquake occurrence, these distributions are synthesized to evaluate the accuracy of geologic fault slip determinations and to estimate uncertainties in long-term earthquake patterns based on paleoseismic data. Because the seismic moment distribution is a power law (Pareto), a major part of the total seismic moment is released by major earthquakes, M >= 10^19.5 N m (moment magnitude m >= 7); for these large earthquakes the rupture is confined to the upper brittle crust layer. We review the various moment-frequency and earthquake scaling relationships and apply them to infer the slip distribution at area- and site-specific regions. Simulating the seismic moment and strain accumulation process demonstrates that some synthetics can be interpreted as examples of a quasiperiodic sequence. We demonstrate the application of the derived slip statistical relations by analyzing the slip distribution and history of the San Andreas fault at Wrightwood, California.
- Published
- 2005
29. Importance of small earthquakes for stress transfers and earthquake triggering
- Author
-
Helmstetter, Agnes, Kagan, Yan Y, and Jackson, David D
- Abstract
We estimate the relative importance of small and large earthquakes for static stress changes and for earthquake triggering, assuming that earthquakes are triggered by static stress changes and that earthquakes are located on a fractal network of dimension D. This model predicts that both the number of events triggered by an earthquake of magnitude m and the stress change induced by this earthquake at the location of other earthquakes increase with m as 10^(Dm/2). The stronger the spatial clustering, the larger the influence of small earthquakes on stress changes at the location of a future event, as well as on earthquake triggering. If earthquake magnitudes follow the Gutenberg-Richter law with b > D/2, small earthquakes collectively dominate stress transfer and earthquake triggering because their greater frequency overcomes their smaller individual triggering potential. Using a southern California catalog, we observe that the rate of seismicity triggered by an earthquake of magnitude m increases with m as 10^(alpha m), where alpha = 1.05 +/- 0.05. We also find that the magnitude distribution of triggered earthquakes is independent of the triggering earthquake's magnitude m. When alpha is approximately equal to b, small earthquakes are roughly as important to earthquake triggering as larger ones. We evaluate the fractal correlation dimension D of hypocenters using two relocated catalogs for southern California. The value of D measured for distances 0.1 < r < 5 km is D = 1.54 for the Shearer et al. catalog and D = 1.73 for the Hauksson et al. catalog. The value of D reflects both the structure of the fault network and the nature of earthquake interactions. By considering only those earthquake pairs with interevent times larger than 1000 days, we can largely remove the effects of short-term clustering. Then D is approximately 2, close to the value D = 2 alpha = 2.1 predicted by assuming that earthquake triggering is due to static stress. The value D approximately equal to 2b implies that small earthquakes are as important as larger ones for stress transfers between earthquakes and that considering stress changes induced by small earthquakes should improve models of earthquake interactions.
- Published
- 2005
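The b > D/2 criterion above reduces to a geometric series: each magnitude bin contributes in proportion to 10^((D/2 - b)m). The few lines below, with b = 1 assumed and D values near those quoted in the abstract, make the crossover explicit; this is our arithmetic, not the paper's analysis.

```python
import numpy as np

mags = np.arange(2.0, 7.5, 0.5)     # magnitude bins

def triggering_share(b, D):
    """Relative contribution of each bin: frequency 10^(-b m)
    times per-event triggering potential 10^(D m / 2)."""
    w = 10.0 ** ((D / 2.0 - b) * mags)
    return w / w.sum()

for D in (1.7, 2.1):                # below vs above the b > D/2 threshold
    share = triggering_share(b=1.0, D=D)
    print(f"D={D}: share of triggering from m<4 bins = {share[mags < 4].sum():.2f}")
```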
30. Approximating the Distribution of Pareto Sums
- Author
-
Zaliapin, Ilya, Kagan, Yan Y, and Schoenberg, Federic P.
- Subjects
Pareto distribution ,Pareto truncated distribution ,Seismic moment distribution - Abstract
Heavy-tailed random variables (rvs) have proven to be an essential element in modeling a wide variety of natural and human-induced processes, and the sums of heavy-tailed rvs represent a particularly important construct in such models. Oriented toward both geophysical and statistical audiences, this paper discusses the appearance of the Pareto law in seismology and addresses the problem of the statistical approximation for the sums of independent rvs with common Pareto distribution F(x) = 1 - x^(-α) for 1/2 < α < 2. Such variables have infinite second moment, which prevents one from using the Central Limit Theorem to solve the problem. This paper presents five approximation techniques for the Pareto sums and discusses their respective accuracy. The main focus is on the median and the upper and lower quantiles of the sum's distribution. Two of the proposed approximations are based on the Generalized Central Limit Theorem, which establishes the general limit for the sums of independent identically distributed rvs in terms of stable distributions; these approximations work well for large numbers of summands. Another approximation, which replaces the sum with its maximal summand, has less than 10% relative error for the upper quantiles when α < 1. A more elaborate approach considers the two largest observations separately from the rest of the observations, and yields a relative error under 1% for the upper quantiles and less than 5% for the median. The last approximation is specially tailored for the lower quantiles, and involves reducing the non-Gaussian problem to its Gaussian equivalent; it too yields errors of less than 1%. Approximation of the observed cumulative seismic moment in California illustrates the developed methods.
- Published
- 2003
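The max-summand approximation described above is easy to check by simulation; alpha = 0.7 and the sample sizes below are arbitrary choices, and the printed relative errors are only a rough illustration of the abstract's accuracy claim.

```python
import numpy as np

rng = np.random.default_rng(9)
alpha, n_terms, n_reps = 0.7, 100, 20_000

# Sums of iid Pareto variables with P(X > x) = x^(-alpha), x >= 1.
x = rng.uniform(size=(n_reps, n_terms)) ** (-1.0 / alpha)
sums = x.sum(axis=1)
maxes = x.max(axis=1)

# For alpha < 1 the sum's upper quantiles track the maximal summand.
for q in (0.90, 0.95, 0.99):
    qs, qm = np.quantile(sums, q), np.quantile(maxes, q)
    print(f"q={q}: sum {qs:.0f}, max term {qm:.0f}, rel. error {abs(qs - qm) / qs:.1%}")
```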
31. Seismic gaps and earthquakes
- Author
-
Rong, Yufang F, Jackson, David D, and Kagan, Yan Y
- Subjects
seismic gap ,earthquakes ,characteristic earthquakes ,statistical test ,likelihood ,circum Pacific - Abstract
McCann et al. [1979] published a widely cited "seismic gap" model ascribing earthquake potential categories to 125 zones surrounding the Pacific Rim. Nishenko [1991] published an updated and revised version including probability estimates of characteristic earthquakes with specified magnitudes within each zone. These forecasts are now more than 20 and 10 years old, respectively, and sufficient data now exist to test them rather conclusively. For the McCann et al. forecast, we count the number of qualifying earthquakes in the several categories of zones. We assume a hypothetical probability consistent with the gap model (e.g., red zones have twice the probability of green zones) and test against the null hypothesis that all zones have equal probability. The gap hypothesis can be rejected at a high confidence level. Contrary to the forecast of McCann et al., the data suggest that the real seismic potential is lower in the gaps than in other segments, and plate boundary zones are not made safer by recent earthquakes. For the 1991 Nishenko hypothesis, we test the number of filled zones, the likelihood scores of the observed and simulated catalogs, and the likelihood ratio of the gap hypothesis to a Poissonian null hypothesis. For earthquakes equal to or larger than the characteristic magnitude, the new seismic gap hypothesis failed at the 95% confidence level in both the number and ratio tests. If we lower the magnitude threshold by 0.5 for qualifying earthquakes, the new gap hypothesis passes the number test but fails in both the likelihood and likelihood ratio tests at the 95% confidence level.
- Published
- 2003
32. Accuracy of modern global earthquake catalogs
- Author
-
Kagan, Yan Y
- Subjects
earthquake catalogs ,catalog completeness ,origin time and location errors ,magnitude accuracy ,earthquake focal mechanisms and their uncertainties - Abstract
We compare several modern (1977-present) worldwide earthquake catalogs to infer their completeness, earthquake origin time and hypocenter location accuracy, magnitude/scalar seismic moment errors, and differences between individual focal mechanism/moment tensor solutions. The Harvard centroid moment tensor (CMT), US Geological Survey (USGS) MT, USGS first-motion (FM) focal mechanism, PDE and ISC catalogs have been analyzed and compared. The catalogs' completeness and accuracy vary in time and depend on earthquake depth and tectonic environment. We propose a new statistical method for evaluating catalog completeness and show the results for the CMT dataset. A difference in the frequency range of seismic waves used in earthquake processing leads to varying degrees of catalog completeness for foreshocks and aftershocks close in time. Earthquake origin time versus centroid time, as well as hypocenter location versus centroid location, can be explained well by earthquake scaling relations. Comparing moment magnitudes and regular earthquake magnitudes yields estimated magnitude uncertainties and shows that the latter poorly estimate earthquake size for large events. Moment errors reported in the CMT solutions are well correlated with the CMT/GS-MT magnitude difference, and hence indicate magnitude uncertainty well. A normalized seismic moment tensor has 4 d.f., and its accuracy can be represented as the non-double-couple (non-DC) component value, the 3-D angle (Phi) of DC source rotation, and a position of the rotation pole. Our results suggest that a routinely determined non-DC component is in most cases only an artifact. The distribution of the Phi-value varies over catalog time, earthquake depth, focal mechanism, and magnitude. The seismic moment errors and the value of the non-DC component are indicative of the Phi-value; for the best solutions, the 3-D angle in the CMT catalog is on the order of 5-7 degrees. The CMT catalog is obviously the best dataset in completeness and accuracy of its detailed solutions. Our results, specifying uncertainties and completeness of global earthquake catalogs, can be used in studies of geodynamic processes, tectonic deformation associated with earthquakes, earthquake stress analysis, and many other applications of earthquake catalog data. Seismogram interpretation techniques can be reviewed and possibly revised in light of these results.
- Published
- 2003
33. Cannot Earthquakes Be Predicted?
- Author
-
Wyss, Max, Aceves, Richard L., Park, Stephen K., Geller, Robert J., Jackson, David D., Kagan, Yan Y., and Mulargia, Francesco
- Published
- 1997
34. Estimation of the upper cutoff parameter for the tapered Pareto distribution
- Author
-
Kagan, Yan Y and Schoenberg, Frederick P.
- Subjects
Parameter estimation ,tapered Pareto distribution ,Gutenberg--Richter relation ,maximum likelihood estimation ,method of moments ,earthquakes - Abstract
The tapered (or generalized) Pareto distribution, also called the modified Gutenberg--Richter law, has been used to model the sizes of earthquakes. Unfortunately, maximum likelihood estimates of the cutoff parameter are substantially biased. Alternative estimates for the cutoff parameter are presented, and their properties discussed.
- Published
- 2001
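A sketch of tapered Pareto simulation and fitting, assuming only the form of the survival function S(x) = (xm/x)^beta exp((xm - x)/theta). Because this survival function is a product, the variable can be simulated as the minimum of a Pareto and a shifted exponential; the joint maximum-likelihood fit shown is the estimator whose cutoff-parameter bias the paper addresses, and the parameter values are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(10)
xm, beta_true, theta_true = 1.0, 0.65, 1000.0   # threshold, exponent, cutoff

# S(x) = (xm/x)^beta * exp((xm - x)/theta) is a product of a Pareto survival
# and a shifted-exponential survival, so X = min(Pareto, xm + Exponential).
n = 5000
pareto = xm * rng.uniform(size=n) ** (-1.0 / beta_true)
expo = xm + rng.exponential(theta_true, size=n)
x = np.minimum(pareto, expo)

def negloglik(params):
    """Minus log-likelihood of the tapered Pareto density
    f(x) = (beta/x + 1/theta) * (xm/x)^beta * exp((xm - x)/theta)."""
    beta, theta = params
    if beta <= 0.0 or theta <= 0.0:
        return np.inf
    return -np.sum(np.log(beta / x + 1.0 / theta)
                   + beta * np.log(xm / x) + (xm - x) / theta)

res = minimize(negloglik, [0.5, 500.0], method="Nelder-Mead")
print("beta_hat, theta_hat:", res.x)   # theta_hat is the biased cutoff estimate
```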
35. Worldwide earthquake forecasts
- Author
-
Kagan, Yan Y.
- Published
- 2017
- Full Text
- View/download PDF
36. Characteristic Earthquakes and Seismic Gaps
- Author
-
Jackson, David D., Kagan, Yan Y., and Gupta, Harsh K., editor
- Published
- 2011
- Full Text
- View/download PDF
37. Earthquakes
- Author
-
Kagan, Yan Y., primary
- Published
- 2014
- Full Text
- View/download PDF
38. Plate Tectonics and Earthquake Potential of Spreading Ridges and Oceanic Transform Faults
- Author
-
Bird, Peter, primary, Kagan, Yan Y., additional, and Jackson, David D., additional
- Published
- 2013
- Full Text
- View/download PDF
39. Characteristic Earthquakes and Seismic Gaps
- Author
-
Jackson, David D., primary and Kagan, Yan Y., additional
- Published
- 2011
- Full Text
- View/download PDF
40. Earthquakes Cannot Be Predicted
- Author
-
Geller, Robert J., Jackson, David D., Kagan, Yan Y., and Mulargia, Francesco
- Published
- 1997
41. Earthquake number forecasts testing
- Author
-
Kagan, Yan Y., primary
- Published
- 2017
- Full Text
- View/download PDF
42. Comment on 'Testing earthquake prediction methods: 'The West Pacific short-term forecast of earthquakes with magnitude MwHRV >= 5.8'' by V. G. Kossobokov
- Author
-
Kagan, Yan Y and Jackson, David D
- Abstract
In his paper Kossobokov investigates the efficiency of our short-term forecast for two western Pacific regions. Although we agree with the basic results of his evaluation, namely that the forecast statistics are much better than a random guess, we have reservations about his definition of earthquake prediction, some of his tests, and his interpretation of the test results. We distinguish between deterministic earthquake predictions and statistical forecasts. We argue that some techniques used by Kossobokov may not be appropriate for testing our forecasts and discuss other testing methods based on the likelihood function. We demonstrate that Kossobokov's null hypothesis may be biased, and this bias can influence some of his conclusions. We show that, contrary to Kossobokov's statement, our algorithm predicts mainshocks when they are preceded by foreshocks.
- Published
- 2006
43. Worldwide earthquake forecasts
- Author
-
Kagan, Yan Y., primary
- Published
- 2016
- Full Text
- View/download PDF
44. Earthquake rate and magnitude distributions of great earthquakes for use in global forecasts
- Author
-
Kagan, Yan Y., primary and Jackson, David D., additional
- Published
- 2016
- Full Text
- View/download PDF
45. Likelihood analysis of earthquake focal mechanism distributions
- Author
-
Kagan, Yan Y., primary and Jackson, David D., additional
- Published
- 2015
- Full Text
- View/download PDF
46. Statistical earthquake focal mechanism forecasts
- Author
-
Kagan, Yan Y., primary and Jackson, David D., additional
- Published
- 2014
- Full Text
- View/download PDF
47. Double-couple earthquake source: symmetry and rotation
- Author
-
Kagan, Yan Y., primary
- Published
- 2013
- Full Text
- View/download PDF
48. Whole Earth high-resolution earthquake forecasts
- Author
-
Kagan, Yan Y., primary and Jackson, David D., additional
- Published
- 2012
- Full Text
- View/download PDF
49. Random stress and Omori's law
- Author
-
Kagan, Yan Y., primary
- Published
- 2011
- Full Text
- View/download PDF
50. Global earthquake forecasts
- Author
-
Kagan, Yan Y., primary and Jackson, David D., additional
- Published
- 2010
- Full Text
- View/download PDF