182 results
Search Results
2. Expert elicitation of the timing and uncertainty to establish a geologic sequestration well for CO2 in the United States.
- Author
-
Moore, Emily J., Karplus, Valerie J., and Morgan, M. Granger
- Subjects
CARBON sequestration, DISTRIBUTION (Probability theory), CARBON dioxide mitigation, UNITED States economy
- Abstract
Many studies anticipate that carbon capture and sequestration (CCS) will be essential to decarbonizing the U.S. economy. However, prior work has not estimated the time required to develop, approve, and implement a geologic sequestration site in the United States. We generate such an estimate by identifying six clearance points that must be passed before a sequestration site can become operational. For each clearance point (CP), we elicit expert judgments of the time required in the form of probability distributions and then use stochastic simulation to combine and sum the results. We find that, on average, there is a 90% chance that the time required lies between 5.5 and 9.6 y, with an upper bound of 12 y. Even using the most optimistic expert judgements, the lower bound on time is 2.7 y, and the upper bound is 8.3 y. Using the most pessimistic judgements, the lower bound is 3.5 y and the upper bound is 19.2 y. These estimates suggest that strategies must be found to safely accelerate the process. We conclude the paper by discussing seven potential strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
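The aggregation step described in result 2 — sampling a duration for each clearance point and summing across stochastic draws — can be illustrated with a short Python sketch. The clearance-point labels and triangular distributions below are hypothetical placeholders, not the paper's elicited expert judgments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical (NOT the paper's elicited) min / mode / max years for six clearance points
clearance_points = {
    "site_characterization": (0.5, 1.5, 3.0),
    "permit_preparation":    (0.5, 1.0, 2.0),
    "regulatory_review":     (1.0, 2.0, 4.0),
    "public_comment":        (0.2, 0.5, 1.0),
    "well_construction":     (0.5, 1.0, 2.0),
    "operational_testing":   (0.3, 0.8, 1.5),
}

n = 100_000
total_years = np.zeros(n)
for lo, mode, hi in clearance_points.values():
    # Sample each clearance point independently and sum across points
    total_years += rng.triangular(lo, mode, hi, size=n)

print("mean total time (y):", round(total_years.mean(), 2))
print("5th-95th percentile (y):", np.percentile(total_years, [5, 95]).round(2))
```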
3. Portfolio value‐at‐risk estimation for spot chartering decisions under changing trade patterns: A copula approach.
- Author
-
Bai, Xiwen and Lam, Jasmine Siu Lee
- Subjects
VALUE at risk, DISTRIBUTION (Probability theory), CHINA-United States relations, PORTFOLIO management (Investments), GARCH model
- Abstract
Evolving geopolitical relationships between countries (especially between China and the United States) in recent years have highlighted dynamically changing trade patterns across the globe, all of which elevate risk and uncertainty for transport service providers. In order to mitigate risks, shipowners and operators must be able to estimate risks appropriately; one potentially promising method of doing so is through the value‐at‐risk (VaR) method. VaR describes the worst loss a portfolio is likely to sustain, which will not be exceeded over a target time horizon at a given level of confidence. This article proposes a copula‐based GARCH model to estimate the joint multivariate distribution, which is a key component in VaR estimation. We show that the copula model can capture the VaR more successfully, as compared with the traditional method of calculation. As an empirical study, the expected portfolio VaR is examined when a shipowner chooses among Panamax soybean trading routes under a condition of reduced trade volumes between the United States and China due to the ongoing trade turmoil. This study serves as one of the very few papers in the literature on shipping portfolio VaR analysis. The results have significant implications for shipowners regarding fleet repositioning, decision making, and risk management. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
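Result 3 estimates portfolio value-at-risk from a simulated joint return distribution. The sketch below shows only the final VaR-from-simulations step, with a multivariate normal stand-in for the copula-GARCH scenario generator and hypothetical route weights and covariances.

```python
import numpy as np

def value_at_risk(pnl, confidence=0.95):
    """Empirical VaR: the loss that is not exceeded with the given confidence."""
    return -np.percentile(pnl, 100 * (1 - confidence))

rng = np.random.default_rng(1)
# Stand-in for copula-GARCH output: correlated daily freight-rate returns on two routes
cov = [[0.0004, 0.00015],
       [0.00015, 0.0009]]
scenarios = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
weights = np.array([0.6, 0.4])       # hypothetical exposure to each route
portfolio_returns = scenarios @ weights

print("95% one-day VaR:", round(value_at_risk(portfolio_returns), 4))
```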
4. Distributional robustness and lateral transshipment for disaster relief logistics planning under demand ambiguity.
- Author
-
Wang, Duo, Yang, Kai, Yang, Lixing, and Li, Shukai
- Subjects
EMERGENCY management, TRANSSHIPMENT, DISTRIBUTION (Probability theory), ROBUST optimization, STOCHASTIC programming
- Abstract
This paper considers facility location, inventory pre-positioning, and vehicle routing as strategic and operational decisions corresponding to the preparedness and response phases in disaster relief logistics planning. For balancing surpluses and shortages, an effective lateral transshipment strategy is proposed to evenly distribute the relief resources between warehouses after the disaster occurs. To handle ambiguity in the probability distribution of demand, we develop a risk-averse two-stage distributionally robust optimization (DRO) model for the disaster relief logistics planning problem, which specifies the worst-case mean-conditional value-at-risk (CVaR) as a risk measure. For computational tractability, we transform the robust counterpart into its equivalent linear mixed-integer programming model under the discrepancy-based ambiguity set centered at the nominal (empirical) distributions of the observed demand from the historical data. We verify the effectiveness of the proposed DRO model and the value of the lateral transshipment strategy by an illustrative small-scale example. The numerical results show that the proposed DRO model has an advantage in avoiding over-conservative solutions compared to the classic robust optimization model. We also illustrate the applicability of the proposed DRO model by a real-world case study of hurricanes in the southeastern United States. The computational results demonstrate that the proposed DRO model has superior out-of-sample performance and can mitigate the adverse effects of the Optimizers' Curse compared with the traditional stochastic programming model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Non-parametric generalised newsvendor model.
- Author
-
Ghosh, Soham and Mukhoti, Sujay
- Subjects
NEWSVENDOR model, DISTRIBUTION (Probability theory), PERISHABLE goods, COST functions, COVID-19 testing
- Abstract
In the present paper we generalise the classical newsvendor problem for critical perishable commodities whose costs are more severe than the linear alternative. Piecewise polynomial cost functions are introduced to accommodate the excess severity. Stochastic demand is assumed to follow a completely unknown probability distribution. A non-parametric estimator of the optimal order quantity is developed from an estimating equation using a random sample. Strong consistency of the estimator is proved for a unique optimal order quantity, and the result is extended to multiple solutions. Simulation results indicate that the non-parametric estimator is efficient in terms of mean square error. Real-life application of the proposed non-parametric estimator is demonstrated with avocado demand in the United States of America and COVID-19 test kit demand during the second wave of the SARS-CoV-2 pandemic across 86 countries. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
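The classical newsvendor rule underlying result 5 sets the order quantity at the empirical demand quantile corresponding to the critical ratio. This minimal sketch shows that linear-cost base case with a hypothetical demand sample; the paper's generalisation to piecewise-polynomial costs is not implemented here.

```python
import numpy as np

def newsvendor_order(demand_sample, underage_cost, overage_cost):
    """Nonparametric linear-cost rule: empirical demand quantile at the critical ratio."""
    critical_ratio = underage_cost / (underage_cost + overage_cost)
    return np.quantile(demand_sample, critical_ratio)

rng = np.random.default_rng(2)
demand = rng.gamma(shape=5.0, scale=40.0, size=500)   # hypothetical weekly demand sample
print("order quantity:", round(newsvendor_order(demand, underage_cost=3.0, overage_cost=1.0), 1))
```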
6. A theory for the relationship between lake surface area and maximum depth.
- Author
-
Cael, Brendan B. and Seekell, David
- Subjects
SURFACE area, DISTRIBUTION (Probability theory), LAKES
- Abstract
Maximum depth is crucial for many lake processes and biota, but attempts to explain its variation have achieved little predictive power. In this paper, we describe the probability distribution of maximum depths based on recent developments in the theory of fractal Brownian motions. The theoretical distribution is right‐tailed and adequately captures variations in maximum depth in a dataset of 8164 lakes (maximum depths 0.1–135 m) from the northeastern United States. Maximum depth increases with surface area, but with substantial random variation—the 95% prediction interval spans more than an order of magnitude for lakes with any specific surface area. Our results explain the observed variability in lake maximum depths, capture the link between topographic characteristics and lake bathymetry, and provide a means to upscale maximum depth‐dependent processes, which we illustrate by upscaling the diffusive flux of methane from northern lakes to the atmosphere. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
7. AN ANALYSIS OF AIR FORCE EOQ DATA WITH AN APPLICATION TO REORDER POINT CALCULATION.
- Author
-
Mitchell, C. R., Rappold, R. A., and Faulkner, W. B.
- Subjects
PRODUCTION management (Manufacturing), PROJECT management, INVENTORY control, LEAD time (Supply chain management), SUPPLY chain management, POISSON processes, DISTRIBUTION (Probability theory), PROBABILITY theory, SUPPLY & demand, MATHEMATICAL optimization
- Abstract
One of the important uses of an EOQ item's distribution of lead time demand is to set its reorder point. This paper shows that a realistic model of observed demand patterns can be chosen from the compound Poisson family of distributions. Actual historical data from several U.S. Air Force bases are analyzed using the geometric-Poisson and constant-Poisson distributions. The control discipline is order quantity, reorder point with continuous review. The service level is based on percent of demand supplied during lead time and is consistent with current USAF methodology. The reorder point is based on independent calculations and no attempt is made to jointly optimize the order quantity and reorder point. Lead time is assumed to be known and constant. [ABSTRACT FROM AUTHOR]
- Published
- 1983
- Full Text
- View/download PDF
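Result 7 models lead-time demand with a compound (geometric-)Poisson distribution and sets a reorder point from it. The sketch below simulates that demand distribution and picks a reorder point as a simple cycle-service-level quantile, a stand-in for the paper's fill-rate criterion; all parameters are hypothetical, not the USAF data.

```python
import numpy as np

rng = np.random.default_rng(3)

def geometric_poisson_demand(order_rate, lead_time, p, n_sims):
    """Lead-time demand when orders arrive as a Poisson process and each
    order's size is geometric (the 'geometric-Poisson' or stuttering Poisson model)."""
    n_orders = rng.poisson(order_rate * lead_time, size=n_sims)
    return np.array([rng.geometric(p, size=k).sum() if k > 0 else 0 for k in n_orders])

# Hypothetical parameters (orders/day, days, geometric parameter), not the USAF values
demand = geometric_poisson_demand(order_rate=2.0, lead_time=30, p=0.4, n_sims=20_000)
reorder_point = np.quantile(demand, 0.95)   # cover lead-time demand in 95% of cycles
print("reorder point:", int(round(reorder_point)))
```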
8. Analysis of human-factor-caused freight train accidents in the United States.
- Author
-
Zhang, Zhipeng, Turla, Tejashree, and Liu, Xiang
- Subjects
RAILROAD accidents, RAILROAD trains, TRENDS, RAILROAD management, DISTRIBUTION (Probability theory), AUTOMOBILES
- Abstract
Human factors are major causes of train accidents in the United States. Understanding the safety risk of these accidents can provide insights into safety evaluation and improvement. This paper focuses on analyzing the train derailments and collisions due to human factors using 2000–2016 accident data on mainlines from the US Federal Railroad Administration. This research methodology involves three main sections. First, we analyze the statistical trend of annual accident rates by accident type and year. Based on the cause-specific distribution of accident frequency, the major causes are determined for each common accident type such as derailments and collisions. Next, we calculate accident severity (e.g., derailed cars, casualties) due to each specific human-factor accident cause. Finally, we compute annual accident risk and cause-specific accident risk using mean and alternative risk measures. The detailed accident data analysis approach herein can also be adapted to other types of train accidents, in support of decisions for rail safety improvement. The analysis of human-factor-caused train accidents can provide key information for the development and evaluation of potential safety improvement strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
9. A marginal modelling approach for predicting wildfire extremes across the contiguous United States.
- Author
-
D'Arcy, Eleanor, Murphy-Barltrop, Callum J. R., Shooter, Rob, and Simpson, Emma S.
- Subjects
NEGATIVE binomial distribution, MARGINAL distributions, DISTRIBUTION (Probability theory), MISSING data (Statistics), EXTREME value theory, WILDFIRES, POISSON regression, SELF-tuning controllers
- Abstract
This paper details a methodology proposed for the EVA 2021 conference data challenge. The aim of this challenge was to predict the number and size of wildfires over the contiguous US between 1993 and 2015, with more importance placed on extreme events. In the data set provided, over 14% of both wildfire count and burnt area observations are missing; the objective of the data challenge was to estimate a range of marginal probabilities from the distribution functions of these missing observations. To enable this prediction, we make the assumption that the marginal distribution of a missing observation can be informed using non-missing data from neighbouring locations. In our method, we select spatial neighbourhoods for each missing observation and fit marginal models to non-missing observations in these regions. For the wildfire counts, we assume the compiled data sets follow a zero-inflated negative binomial distribution, while for burnt area values, we model the bulk and tail of each compiled data set using non-parametric and parametric techniques, respectively. Cross validation is used to select tuning parameters, and the resulting predictions are shown to significantly outperform the benchmark method proposed in the challenge outline. We conclude with a discussion of our modelling framework, and evaluate ways in which it could be extended. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
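Result 9 fits wildfire counts with a zero-inflated negative binomial. As a reference point, the probability mass function of that mixture can be written directly; the parameters below are illustrative, not fitted values from the challenge data.

```python
import numpy as np
from scipy.stats import nbinom

def zinb_pmf(k, pi_zero, mean, alpha):
    """Zero-inflated negative binomial: structural zero with probability pi_zero,
    otherwise NB with the given mean and dispersion alpha (variance = mean + alpha*mean^2)."""
    r = 1.0 / alpha
    p = r / (r + mean)
    base = (1.0 - pi_zero) * nbinom.pmf(k, r, p)
    return np.where(np.asarray(k) == 0, pi_zero + base, base)

counts = np.arange(6)
print(zinb_pmf(counts, pi_zero=0.3, mean=2.5, alpha=0.8).round(3))  # illustrative parameters
```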
10. The variance of the discrete frequency transmission function of a reverberant room.
- Author
-
Davy, John L.
- Subjects
SOUND measurement, RAYLEIGH model, WIGNER distribution, GAUSSIAN distribution, DISTRIBUTION (Probability theory)
- Abstract
This paper first shows experimentally that the distribution of modal spacings in a reverberation room is well modeled by the Rayleigh or Wigner distribution. Since the Rayleigh or Wigner distribution is a good approximation to the Gaussian orthogonal ensemble (GOE) distribution, this paper confirms the current wisdom that the GOE distribution is a good model for the distribution of modal spacings. Next this paper gives the technical arguments that the author used successfully to support the pragmatic arguments of Baade and the Air-conditioning and Refrigeration Institute of USA for retention of the pure tone qualification procedure and to modify a constant in the International Standard ISO 3741:1999(E) for measurement of sound power in a reverberation room. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
11. Climate and Landscape Controls of Regional Patterns of Flow Duration Curves Across the Continental United States: Statistical Approach.
- Author
-
Ghotbi, Saba, Wang, Dingbao, Singh, Arvind, Mayo, Talea, and Sivapalan, Murugesu
- Subjects
ENVIRONMENTAL engineering, EPHEMERAL streams, CURVES, DISTRIBUTION (Probability theory), PARAMETER estimation, LANDSCAPES
- Abstract
The flow duration curve (FDC) is a hydrologically meaningful representation of the statistical distribution of daily streamflows. The complexity of processes contributing to the FDC introduces challenges for the direct exploration of physical controls on FDC. In this paper, the controls of climate and catchment characteristics on FDC are explored using a stochastic framework that enables construction of the FDC from three components of streamflow: fast and slow flow (during wet days) and slow flow during dry days. The FDC during wet days (FDCw) is computed as the statistical sum of the fast flow duration curve (FFDC) and the slow flow duration curve (SFDCw), considering their dependency. FDC is modeled as the mixture distribution of FDCw and the slow flow duration curve during dry days (SFDCd), by considering the fraction of wet days (δ) for perennial streams and both δ and the fraction of days of zero streamflow for ephemeral streams. The Kappa distribution is employed to fit the FFDC, SFDCw, and SFDCd for 300 catchments from Model Parameter Estimation Experiment (MOPEX) across the United States. Results show that the 0–20th percentile of FDC is controlled by FFDC and SFDCw, the 90–100th percentile of FDC is controlled by SFDCd, and the 20–90th percentile of FDC is controlled by three components. The relationships between estimated Kappa distribution parameters and climate and catchment characteristics reveal that the aridity index, the coefficient of variation of daily precipitation, timing of precipitation, time interval between storms, snow, topographic slope, and slope of recession slope curve are dominant controlling factors. Key Points: Flow duration curve (FDC) is modeled from three components: fast flow and slow flow during wet days and slow flow during dry days. The duration curves for each component are modeled by Kappa distribution, and the climatic and landscape controls on FDCs are explored. Aridity index, coefficient of variation of daily precipitation, slope, and recession slope are the dominant controls on FDCs. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
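Result 11 builds flow duration curves before decomposing and fitting them with Kappa distributions. The sketch below only constructs an empirical FDC (exceedance probability versus sorted daily flow) from a synthetic flow series; the component decomposition and Kappa fitting are not reproduced here.

```python
import numpy as np

def flow_duration_curve(daily_flows):
    """Empirical FDC: exceedance probability for each observed daily flow."""
    flows = np.sort(np.asarray(daily_flows))[::-1]                  # descending
    exceedance = np.arange(1, len(flows) + 1) / (len(flows) + 1.0)  # Weibull plotting position
    return exceedance, flows

rng = np.random.default_rng(4)
flows = rng.lognormal(mean=1.0, sigma=1.2, size=3650)   # hypothetical 10 years of daily flow
p, q = flow_duration_curve(flows)
print("flow at 20%/50%/90% exceedance:", np.interp([0.2, 0.5, 0.9], p, q).round(2))
```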
12. Managing Price Risk in a Multimarket Environment.
- Author
-
Min Liu and Wu, Felix F.
- Subjects
ELECTRIC industries, ELECTRIC utilities, ELECTRIC power consumption, ELECTRICITY, COMPETING risks, UTILITY theory, DISTRIBUTION (Probability theory)
- Abstract
In a competitive electricity market, a generation company (Genco) can manage its trading risk through trading electricity among multiple markets such as spot markets and contract markets. The question is how to decide the trading proportion of each market in order to maximize the Genco's profit and minimize the associated risk. Based on the mean-variance portfolio theory, this paper proposes a sequential optimization approach to electric energy allocation between spot and contract markets, taking into consideration the risks of electricity price, congestion charge, and fuel price. Especially, the impact of the fuel market on electric energy allocation is analyzed and simulated with historical data in respect of the electricity market and other fuel markets in the U.S. Simulation results confirm that the proposed analytic approach is consistent with intuition and therefore reasonable and feasible for a Genco to make a trading plan involving risks in an electricity market. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
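Result 12 rests on mean-variance portfolio theory for splitting energy between spot and contract markets. A minimal sketch of the unconstrained mean-variance solution, with hypothetical expected margins and covariances standing in for the paper's electricity and fuel market data:

```python
import numpy as np

def mean_variance_weights(mu, cov, risk_aversion):
    """Unconstrained mean-variance solution w = (1/lambda) * Sigma^{-1} mu,
    normalized here so the allocations sum to one."""
    w = np.linalg.solve(cov, mu) / risk_aversion
    return w / w.sum()

mu = np.array([0.08, 0.05])          # hypothetical expected margins: spot, contract
cov = np.array([[0.040, 0.006],
                [0.006, 0.010]])     # spot trading assumed riskier than contracts
print("allocation (spot, contract):", mean_variance_weights(mu, cov, risk_aversion=3.0).round(3))
```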
13. International comparisons of poverty intensity: Index decomposition and bootstrap inference.
- Author
-
Osberg, Lars and Kuan Xu
- Subjects
POVERTY, WEALTH, POOR people, ESTIMATES, STATISTICAL bootstrapping, DISTRIBUTION (Probability theory), STATISTICAL sampling, CONFIDENCE intervals
- Abstract
This paper proposes an alternative formulation for the Sen-Shorrocks-Thon (SST) index of poverty intensity that is appropriate for survey data with sampling weights. It also decomposes the SST index into the poverty rate, the average poverty gap ratio among the poor, and the overall Gini index of poverty gap ratios. To account for sampling variation in estimates of poverty intensity, this paper uses the bootstrap method to compute confidence intervals and presents international comparisons using Luxembourg Income Study (LIS) data from the 1970s to the 1990s. Cross-sectional and longitudinal analyses indicate that the percentage change in poverty intensity can be approximated by the sum of percentage changes in the poverty rate and average poverty gap ratio, since changes in the overall Gini index of poverty gap ratios are negligible. In the early 1970s poverty intensity in Canada and the United States was almost indistinguishable, but in the 1970s Canadian poverty intensity decreased. Large increases in poverty intensity occurred in the 1980s in the United States, the United Kingdom, and Sweden. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
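Result 13 decomposes the Sen-Shorrocks-Thon index into the poverty rate, the average poverty gap ratio among the poor, and the Gini index of poverty gap ratios, with bootstrap confidence intervals. A sketch of that decomposition and a percentile bootstrap on synthetic income data (not LIS data, and without sampling weights):

```python
import numpy as np

def gini(x):
    """Gini coefficient of a nonnegative array (0 if all values are zero)."""
    x = np.sort(np.asarray(x, dtype=float))
    n, total = len(x), x.sum()
    if total == 0:
        return 0.0
    return 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * total) - (n + 1.0) / n

def sst_index(income, poverty_line):
    """SST index = poverty rate x average gap ratio among the poor x (1 + Gini of gap ratios)."""
    gaps = np.clip((poverty_line - income) / poverty_line, 0.0, None)  # 0 for the non-poor
    rate = np.mean(gaps > 0)
    avg_gap_poor = gaps[gaps > 0].mean() if rate > 0 else 0.0
    return rate * avg_gap_poor * (1.0 + gini(gaps))

rng = np.random.default_rng(5)
income = rng.lognormal(mean=10.0, sigma=0.7, size=5000)    # hypothetical incomes
line = 0.5 * np.median(income)                             # half-median poverty line
boot = [sst_index(rng.choice(income, income.size, replace=True), line) for _ in range(500)]
print("SST index:", round(sst_index(income, line), 4),
      " 95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]).round(4))
```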
14. A Voyage of Discovery.
- Author
-
Billard, Lynne
- Subjects
STATISTICS, ASSOCIATIONS, institutions, etc., PERIODICALS, CENSUS, ARITHMETIC mean, VARIATIONAL principles, DISTRIBUTION (Probability theory), STATISTICAL correlation
- Abstract
This article highlights the historical events that took place within 50 years since the American Statistical Association was founded in 1839 and the Journal of the American Statistical Association (JASA) was first published in 1888. For the first years, JASA contained almost exclusively nonmathematical papers. Many were mere repositories of extensive data sets, including many compilations from census counts with interpretations of what these data purportedly revealed. Others were from investigations undertaken by sociologists, economists, political scientists and historians. One area that attracted theoretical attention dealt with the concepts of averages, variation and distributions. The second area that received theoretical attention during these years was correlation and related concepts.
- Published
- 1997
- Full Text
- View/download PDF
15. Nonparametric spatial models for extremes: application to extreme temperature data.
- Author
-
Fuentes, Montserrat, Henry, John, and Reich, Brian
- Subjects
NONPARAMETRIC statistics, EXTRAPOLATION, DISTRIBUTION (Probability theory), COPULA functions, DIRICHLET forms
- Abstract
Estimating the probability of extreme temperature events is difficult because of limited records across time and the need to extrapolate the distributions of these events, as opposed to just the mean, to locations where observations are not available. Another related issue is the need to characterize the uncertainty in the estimated probability of extreme events at different locations. Although the tools for statistical modeling of univariate extremes are well-developed, extending these tools to model spatial extreme data is an active area of research. In this paper, in order to make inference about spatial extreme events, we introduce a new nonparametric model for extremes. We present a Dirichlet-based copula model that is a flexible alternative to parametric copula models such as the normal and t-copula. The proposed modelling approach is fitted using a Bayesian framework that allows us to take into account different sources of uncertainty in the data and models. We apply our methods to annual maximum temperature values in the east-south-central United States. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
16. Maximum likelihood estimation of the double exponential jump-diffusion process.
- Author
-
Ramezani, Cyrus and Zeng, Yong
- Subjects
STOCK price indexes, RATE of return, DISTRIBUTION (Probability theory), STANDARD & Poor's 500 Index, STOCK exchanges
- Abstract
The double exponential jump-diffusion (DEJD) model, recently proposed by Kou (Manage Sci 48(8), 1086–1101, 2002) and Ramezani and Zeng (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=606361, 1998), generates a highly skewed and leptokurtic distribution and is capable of matching key features of stock and index returns. Moreover, DEJD leads to tractable pricing formulas for exotic and path dependent options (Kou and Wang Manage Sci 50(9), 1178–1192, 2004). Accordingly, the double exponential representation has gained wide acceptance. However, estimation and empirical assessment of this model has received little attention to date. The primary objective of this paper is to fill this gap. We use daily returns for the S&P-500 and the NASDAQ indexes and individual stocks, in conjunction with maximum likelihood estimation (MLE) to fit the DEJD model. We utilize the BIC criterion to assess the performance of DEJD relative to log-normally distributed jump-diffusion (LJD) and the geometric brownian motion (GBM). We find that DEJD performs better than these alternatives for both indexes and individual stocks. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
17. An Extreme Value Approach to Estimating Interest-Rate Volatility: Pricing Implications for Interest-Rate Options.
- Author
-
Bali, Turan G.
- Subjects
EXTREME value theory, DISTRIBUTION (Probability theory), INTEREST rates, MARKET volatility, MARKETS
- Abstract
This paper proposes an extreme value approach to estimating interest-rate volatility, and shows that during the extreme movements of the U.S. Treasury market the volatility of interest-rate changes is underestimated by the standard approach that uses the thin-tailed normal distribution. The empirical results indicate that (1) the volatility of maximal and minimal changes in interest rates declines as time-to-maturity rises, yielding a downward-sloping volatility curve for the extremes; (2) the minimal changes are more volatile than the maximal changes for all data sets and for all asymptotic distributions used; (3) the minimal changes in Treasury yields have fatter tails than the maximal changes; and (4) for both the maxima and minima, the extreme changes in short-term rates have thicker tails than the extreme changes in long-term rates. This paper extends the standard option-pricing models with lognormal forward rates to accommodate significant kurtosis observed in the interest-rate data. This paper introduces a closed-form option-pricing model based on the generalized extreme value distribution that successfully removes the well-known pricing bias of the lognormal distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
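Result 17 fits extreme-value distributions to interest-rate changes. The sketch below fits a generalized extreme value distribution to block maxima of a synthetic fat-tailed change series using scipy; the Treasury data and the option-pricing extension in the paper are not reproduced.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(6)
# Hypothetical daily yield changes in basis points; Student-t gives fat tails
changes = 5.0 * rng.standard_t(df=4, size=252 * 20)
annual_maxima = changes.reshape(20, 252).max(axis=1)       # block maxima, one per "year"

shape, loc, scale = genextreme.fit(annual_maxima)          # maximum likelihood GEV fit
print("GEV shape/loc/scale:", np.round([shape, loc, scale], 3))
print("estimated 100-year maximal change (bp):",
      round(genextreme.ppf(0.99, shape, loc=loc, scale=scale), 1))
```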
18. Comparison of NASA Team2 and AES-York Ice Concentration Algorithms Against Operational Ice Charts From the Canadian Ice Service.
- Author
-
Shokr, Mohammed and Markus, Thorsten
- Subjects
AERONAUTICS, ALGORITHMS, DISTRIBUTION (Probability theory), ANALYSIS of variance, GEOLOGY
- Abstract
Ice concentration retrieved from spaceborne passive-microwave observations is a prime input to operational sea-ice-monitoring programs, numerical weather prediction models, and global climate models. Atmospheric Environment Service (AES)-York and the Enhanced National Aeronautics and Space Administration Team (NT2) are two algorithms that calculate ice concentration from Special Sensor Microwave/Imager observations. This paper furnishes a comparison between ice concentrations (total, thin, and thick types) output from NT2 and AES-York algorithms against the corresponding estimates from the operational analysis of Radarsat images in the Canadian Ice Service (CIS). A new data fusion technique, which incorporates the actual sensor's footprint, was developed to facilitate this study. Results have shown that the NT2 and AES-York algorithms underestimate total ice concentration by 18.35% and 9.66% concentration counts on average, with 16.8% and 15.35% standard deviation, respectively. However, the retrieved concentrations of thin and thick ice show much larger discrepancies with the operational CIS estimates when either one of these two types dominates the viewing area. This is more likely to occur when the total ice concentration approaches 100%. If thin and thick ice types coexist in comparable concentrations, the algorithms' estimates agree with CIS's estimates. In terms of ice concentration retrieval, thin ice is more problematic than thick ice. The concept of using a single tie point to represent a thin ice surface is not realistic and provides the largest error source for retrieval accuracy. While AES-York provides total ice concentration in slightly more agreement with CIS's estimates, NT2 provides better agreement in retrieving thin and thick ice concentrations. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
19. Detection of temporal changes in the spatial distribution of cancer rates using local Moran's I and geostatistically simulated spatial neutral models.
- Author
-
Goovaerts, Pierre and Jacquez, Geoffrey M.
- Subjects
CANCER, MORTALITY, GAUSSIAN processes, DISTRIBUTION (Probability theory)
- Abstract
This paper presents the first application of spatially correlated neutral models to the detection of changes in mortality rates across space and time using the local Moran's I statistic. Sequential Gaussian simulation is used to generate realizations of the spatial distribution of mortality rates under increasingly stringent conditions: 1) reproduction of the sample histogram, 2) reproduction of the pattern of spatial autocorrelation modeled from the data, 3) incorporation of regional background obtained by geostatistical smoothing of observed mortality rates, and 4) incorporation of smooth regional background observed at a prior time interval. The simulated neutral models are then processed using two new spatio-temporal variants of the Moran's I statistic, which allow one to identify significant changes in mortality rates above and beyond past spatial patterns. Last, the results are displayed using an original classification of clusters/outliers tailored to the space-time nature of the data. Using this new methodology the space-time distribution of cervix cancer mortality rates recorded over all US State Economic Areas (SEA) is explored for 9 time periods of 5 years each. Incorporation of spatial autocorrelation leads to fewer significant SEA units than obtained under the traditional assumption of spatial independence, confirming earlier claims that Type I errors may increase when tests using the assumption of independence are applied to spatially correlated data. Integration of regional background into the neutral models yields substantially different spatial clusters and outliers, highlighting local patterns which were blurred when the local Moran's I was applied under the null hypothesis of constant risk. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
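Result 19 relies on the local Moran's I statistic; significance is then judged against geostatistically simulated neutral models. The sketch below computes only the statistic itself, for a four-area toy example with a hypothetical contiguity matrix.

```python
import numpy as np

def local_morans_i(rates, w):
    """Local Moran's I for each area: I_i = (z_i / m2) * sum_j w_ij * z_j,
    with z the deviations from the mean rate and w row-standardized weights."""
    z = rates - rates.mean()
    m2 = (z ** 2).mean()
    w = w / w.sum(axis=1, keepdims=True)   # row-standardize the contiguity matrix
    return (z / m2) * (w @ z)

# Toy example: areas 1-2 are adjacent high-rate areas, areas 3-4 adjacent low-rate areas
rates = np.array([12.0, 14.0, 5.0, 4.0])
w = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(local_morans_i(rates, w).round(2))   # positive values flag local clusters
```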
20. Risk-averse two-stage distributionally robust optimisation for logistics planning in disaster relief management.
- Author
-
Wang, Duo, Yang, Kai, and Yang, Lixing
- Subjects
EMERGENCY management, DISASTER relief, LOCATION problems (Programming), DISTRIBUTION (Probability theory), ROBUST programming, STOCHASTIC programming, LOGISTICS
- Abstract
Relief logistics is vital to disaster relief management. Herein, a risk-averse two-stage distributionally robust programming model is proposed to provide decision support for planning disaster relief logistics. It is distinct from the conventional disaster relief logistics planning problem in that (i) the facility location-inventory model and the multi-commodity network flow formulation are integrated; (ii) the probability distribution information of the supply, demand, and road link capacity is partially known, and (iii) the two-stage distributionally robust optimisation (DRO) method based on the worst-case mean-conditional value-at-risk criterion is developed. For tractability, we reformulate the proposed DRO model as equivalent mixed-integer linear programs for box and polyhedral ambiguity sets, which can be directly solved to optimality using the CPLEX software. To evaluate the validity of the proposed DRO model, we conduct numerical experiments based on a real-world case study addressing hurricane threats in the Gulf of Mexico region of the United States. Furthermore, we compare the performance of the proposed DRO model with that of the conventional two-stage stochastic programming model. Finally, we report the managerial implications and insights of using the risk-averse two-stage DRO approach for disaster relief management. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
21. Empirical measures of inflation uncertainty: a cautionary note.
- Author
-
Batchelor, Roy and Dua, Pami
- Subjects
MATHEMATICAL models of inflation, PRICE inflation, ECONOMIC forecasting, FINANCE, DISTRIBUTION (Probability theory), ANALYSIS of variance
- Abstract
The article comments on inflation uncertainty in the U.S. The causes of changes in inflation uncertainty are not well understood, and theoretical predictions about their consequences are often ambiguous. The aim of this paper is to assess the quality of commonly used proxy measures of inflation uncertainty by comparing them with a direct measure of uncertainty, the root mean subjective variance of the probability distributions for future inflation constructed by respondents to the ASA/NBER surveys of U.S. economic forecasters. The authors start with a formal definition of inflation uncertainty, and what would constitute a rational expectation of uncertainty. The authors then introduce their benchmark measure of inflation uncertainty, and review alternative proxy measures. They compute these measures for the U.S. in the years 1968-89 and test whether there are statistically significant differences among them. Comparison of one proxy measure, the root mean square error of the inflation forecasts of the ASA/NBER forecasters, with their root mean subjective variance, constitutes a test of whether forecasters are rational in their assessments of inflation uncertainty.
- Published
- 1996
- Full Text
- View/download PDF
22. When and How Can You Specify a Probability Distribution When You Don't Know Much?
- Author
-
Haimes, Yacov Y., Barry, Timothy, and Lambert, James H.
- Subjects
ADULT education workshops, DISTRIBUTION (Probability theory), PROBABILITY theory, UNCERTAINTY
- Abstract
This article highlights a workshop on specifying a probability distribution, organized by the U.S. Environmental Protection Agency (EPA) and the University of Virginia on April 18-20, 1993. A meeting was held to formulate the agenda for the workshop and to identify topical areas for white papers. During the meeting some 24 basic questions were generated for consideration during the workshop. Some of the questions belong to more than one group, and it was decided that some overlapping was advisable. On the basis of the recommendation made by the Steering Committee and prospective contributors, five white papers were commissioned. The contributors were asked to address the following topics: Relate the themes of white papers to the selection of probability distributions for Monte Carlo analysis in EPA problems; Generate straw man guidelines for the selection of probability distributions that will be useful across a broad scope of regulatory problems; Articulate and discuss the significance to the workshop of points of view that oppose or contradict those adopted in the papers; Relate the state of the art of uncertainty analysis to the discussion.
- Published
- 1994
- Full Text
- View/download PDF
23. THE STATISTICAL PROPERTIES OF THE BLACK-SCHOLES OPTION PRICE.
- Author
-
Ncube, Mthuli and Satchell, Stephen
- Subjects
FINANCE, CONTRACTS, FINANCIAL markets, DISTRIBUTION (Probability theory), SECURITIES trading
- Abstract
This paper investigates the statistical properties of the Black-Scholes option price, considered as a random variable. The option is conditioned on the current price and/or the estimated volatility of the underlying security. In both cases, some exact results for the distribution functions of the true option price and the predicted option price are derived. Extensions to puts and American contracts are considered. Numerical results are presented for option prices based on parameters appropriate for the FTSE 100 Index. [ABSTRACT FROM AUTHOR]
- Published
- 1997
- Full Text
- View/download PDF
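Result 23 treats the Black-Scholes price as a random variable through the estimated volatility. A sketch of that idea: propagate the sampling distribution of an estimated volatility through the standard Black-Scholes call formula. The parameters are illustrative, not the paper's exact conditioning or its FTSE 100 calibration.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

rng = np.random.default_rng(7)
true_sigma, n_obs = 0.20, 60
# Sampling error of the volatility estimate (chi-square distribution of the sample s.d.)
sigma_hat = true_sigma * np.sqrt(rng.chisquare(n_obs - 1, size=20_000) / (n_obs - 1))
prices = bs_call(S=100.0, K=105.0, T=0.5, r=0.03, sigma=sigma_hat)

print("price at true volatility:", round(bs_call(100.0, 105.0, 0.5, 0.03, true_sigma), 3))
print("mean and 95% interval of the estimated price:",
      round(prices.mean(), 3), np.percentile(prices, [2.5, 97.5]).round(3))
```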
24. A Stochastic Statistical Model for U.S. Outbreak-Level Tornado Occurrence Based on the Large-Scale Environment.
- Author
-
Malloy, Kelsey and Tippett, Michael K.
- Subjects
TORNADOES, EL Nino, STATISTICAL models, DISTRIBUTION (Probability theory), OCEAN temperature, STOCHASTIC models
- Abstract
Tornado outbreaks—when multiple tornadoes occur within a short period of time—are rare yet impactful events. Here we developed a two-part stochastic tornado outbreak index for the contiguous United States (CONUS). The first component produces a probability map for outbreak tornado occurrence based on spatially resolved values of convective precipitation, storm relative helicity (SRH), and convective available potential energy. The second part of the index provides a probability distribution for the total number of tornadoes given the outbreak tornado probability map. Together these two components allow stochastic simulation of location and number of tornadoes that is consistent with environmental conditions. Storm report data from the Storm Prediction Center for the 1979–2021 period are used to train the model and evaluate its performance. In the first component, the probability of an outbreak-level tornado is most sensitive to SRH changes. In the second component, the total number of CONUS tornadoes depends on the sum and gridpoint maximum of the probability map. Overall, the tornado outbreak index represents the climatology, seasonal cycle, and interannual variability of tornado outbreak activity well, particularly over regions and seasons when tornado outbreaks occur most often. We found that El Niño–Southern Oscillation (ENSO) modulates the tornado outbreak index such that La Niña is associated with enhanced U.S. tornado outbreak activity over the Ohio River Valley and Tennessee River Valley regions during January–March, similar to the behavior seen in storm report data. We also found an upward trend in U.S. tornado outbreak activity during winter and spring for the 1979–2021 period using both observations and the index. Significance Statement: Tornado outbreaks are when multiple tornadoes happen in a short time span. Because of the rare, sporadic nature of tornadoes, it can be challenging to use observational tornado reports directly to assess how climate affects tornado and tornado outbreak activity. Here, we developed a statistical model that produces a U.S. map of the likelihood that an outbreak-level tornado would occur based on environmental conditions. In addition, using that likelihood map, the model predicts a range of how many tornadoes could occur in these events. We found that "storm relative helicity" (a proxy for potential rotation in a storm's updraft) is especially important for predicting outbreak tornado likelihood, and the sum and maximum value of the likelihood map is important for predicting total numbers for an event. Overall, this model can represent the typical behavior and fluctuations in tornado outbreak activity well. Both the tornado outbreak model and observations agree that the state of sea surface temperature in the tropical Pacific (El Niño–Southern Oscillation) is linked to tornado outbreak activity over the Ohio River Valley and Tennessee River Valley in winter through early spring and that there are upward trends in tornado outbreak activity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. Scenario-robust pre-disaster planning for multiple relief items.
- Author
-
Yang, Muer, Kumar, Sameer, Wang, Xinfang, and Fry, Michael J.
- Subjects
DISASTER relief, STOCHASTIC programming, EMERGENCY management, DISTRIBUTION (Probability theory), SUPPLY chains
- Abstract
The increasing vulnerability of the population from frequent disasters requires quick and effective responses to provide the required relief through effective humanitarian supply chain distribution networks. We develop scenario-robust optimization models for stocking multiple disaster relief items at strategic facility locations for disaster response. Our models improve the robustness of solutions by easing the difficult, and usually impossible, task of providing exact probability distributions for uncertain parameters in a stochastic programming model. Our models allow decision makers to specify uncertainty parameters (i.e., point and probability estimates) based on their degrees of knowledge, using distribution-free uncertainty sets in the form of ranges. The applicability of our generalized approach is illustrated via a case study of hurricane preparedness in the Southeastern United States. In addition, we conduct simulation studies to show the effectiveness of our approach when conditions deviate from the model assumptions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Examining Procedural Choice in the House Rules Committee.
- Author
-
Moffett, Kenneth W.
- Subjects
INFORMATION theory, PARTISANSHIP, DISTRIBUTION (Probability theory), LEGISLATIVE bills
- Abstract
Examines the distributive, informational, and partisan theories that explain the conditions under which the U.S. House Rules Committee will place restrictive rules on bills. Application of the theories to the restriction procedures of the House Rules Committee; Review of literature on rule assignment; Hypotheses on Rules Committee rule assignment.
- Published
- 2004
- Full Text
- View/download PDF
27. Evaluation and Analysis of Remote Sensing-Based Approach for Salt Marsh Monitoring.
- Author
-
Richards IV, David F., Milewski, Adam M., Becker, Steffan, Donaldson, Yonesha, Davidson, Lea J., Zowam, Fabian J., Mrazek, Jay, and Durham, Michael
- Subjects
SALT marshes, NORMALIZED difference vegetation index, DISTRIBUTION (Probability theory), NATIONAL monuments
- Abstract
In the United States (US), salt marshes are especially vulnerable to the effects of projected sea level rise, increased storm frequency, and climatic changes. Sentinel-2 data offer the opportunity to observe the land surface at high spatial resolutions (10 m). The Sentinel-2 data, encompassing Cumberland Island National Seashore, Fort Pulaski National Monument, and Canaveral National Seashore, were analyzed to identify temporal changes in salt marsh presence from 2016 to 2020. ENVI-derived unsupervised and supervised classification algorithms were applied to determine the most appropriate procedure to measure distant areas of salt marsh increases and decreases. The Normalized Difference Vegetation Index (NDVI) was applied to describe the varied vegetation biomass spatially. The results from this approach indicate that the ENVI-derived maximum likelihood classification provides a statistical distribution and calculation of the probability (>90%) that the given pixels represented both water and salt marsh environments. The salt marshes captured by the maximum likelihood classification indicated an overall decrease in salt marsh area presence. The NDVI results displayed how the varied vegetation biomass was analogous to the occurrence of salt marsh changes. Areas representing the lowest NDVI values (−0.1 to 0.1) corresponded to bare soil areas where a salt marsh decrease was detected. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Accounting for hydroclimatic properties in flood frequency analysis procedures.
- Author
-
Reinders, Joeri B. and Munoz, Samuel E.
- Subjects
KOPPEN climate classification, DISTRIBUTION (Probability theory), FLOODS, FLOOD risk, ESTIMATES
- Abstract
Flood hazard is typically evaluated by computing extreme flood probabilities from a flood frequency distribution following nationally defined procedures in which observed peak flow series are fit to a parametric probability distribution. These procedures, also known as flood frequency analysis, typically recommend only one probability distribution family for all watersheds within a country or region. However, large uncertainties associated with extreme flood probability estimates (>50-year flood or Q50) can be further biased when fit to an inappropriate distribution model because of differences in the tails between distribution families. Here, we demonstrate that hydroclimatic parameters can aid in the selection of a parametric flood frequency distribution. We use L-moment diagrams to visually show the fit of gaged annual maxima series across the United States, grouped by their Köppen climate classification and the precipitation intensities of the basin, to a general extreme value (GEV), log normal 3 (LN3), and Pearson 3 (P3) distribution. Our results show that in real space basic hydroclimatic properties of a basin exert a significant influence on the statistical distribution of the annual maxima. The best-fitted family distribution shifts from a GEV towards an LN3 distribution across a gradient from colder and wetter climates (Köppen group D, continental climates) towards more arid climates (Köppen group B, dry climates). Due to the diversity of hydrologic processes and flood-generating mechanisms among watersheds within large countries like the United States, we recommend that the selection of distribution model be guided by the hydroclimatic properties of the basin rather than relying on a single national distribution model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
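Result 28 uses L-moment diagrams, whose coordinates are the sample L-skewness and L-kurtosis. A sketch of computing sample L-moment ratios via probability-weighted moments on a synthetic annual-maxima series (the MOPEX-style gaged data are not used here):

```python
import numpy as np

def l_moment_ratios(x):
    """Sample L-moments via probability-weighted moments.
    Returns (L-mean, L-scale, L-skewness t3, L-kurtosis t4); t3 and t4 are the
    coordinates plotted on an L-moment diagram."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) * x) / (n * (n - 1))
    b2 = np.sum((j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    l1, l2 = b0, 2 * b1 - b0
    l3, l4 = 6 * b2 - 6 * b1 + b0, 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2

rng = np.random.default_rng(8)
annual_maxima = rng.gumbel(loc=100.0, scale=30.0, size=80)   # hypothetical peak-flow series
l1, l2, t3, t4 = l_moment_ratios(annual_maxima)
print(f"L-mean={l1:.1f}  L-scale={l2:.1f}  L-skewness={t3:.3f}  L-kurtosis={t4:.3f}")
```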
29. A generalized Grubbs-Beck test statistic for detecting multiple potentially influential low outliers in flood series.
- Author
-
Cohn, T. A., England, J. F., Berenbrock, C. E., Mason, R. R., Stedinger, J. R., and Lamontagne, J. R.
- Subjects
DISTRIBUTION (Probability theory), FLOODS, RAINFALL anomalies, FLOOD risk, BAYESIAN analysis
- Abstract
The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as 'less-than' values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
30. Distribution of Earthquake Cluster Sizes in the Western United States and in Japan.
- Author
-
Anderson, John G. and Nanjo, Kazuyoshi
- Subjects
EARTHQUAKE zones, CLUSTER analysis (Statistics), SEISMOLOGY, EARTHQUAKE swarms, SEISMIC waves, DISTRIBUTION (Probability theory)
- Abstract
Earthquakes occur in clusters, which classically are described as foreshock-mainshock-aftershock sequences or swarms. In this paper, every earthquake in a seismicity catalog is assigned to a cluster if it is separated from at least one other event in the cluster by less than Δt in time and less than Δr in space. The minimum cluster size is one earthquake. For catalogs that are complete to small magnitudes, this approach is successful in capturing the full spatial extent of an extensive cluster even for Δr much smaller than the actual cluster dimension. The declustered catalogs are much closer to a Poissonian distribution than the originals. This was applied to seismicity catalogs for Japan, Southern California, and Nevada. Cluster sizes measured by the number of earthquakes in the cluster exhibit an approximate power-law frequency distribution. An upper bound to cluster durations is proportional to K^0.5, where K is the number of earthquakes in the cluster. This paper demonstrates an analytical approach suitable for selecting values of Δt and Δr that are appropriate for the earthquake catalog. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
31. Scale depending variations of distribution and dynamic features of US Dollar/Georgian Lari exchange rate.
- Author
-
Matcharashvili, Teimuraz N., Jibladze, Nodar I., Iluridze, Tamar G., Matcharashvili, Tamar T., and Topchishvili, Alexander L.
- Subjects
FOREIGN exchange rates, DEPENDENCE (Statistics), DISTRIBUTION (Probability theory), U.S. dollar, DOMESTIC economic assistance
- Abstract
The paper is devoted to the investigation of statistical, distributional, and dynamical features of the exchange rate variation of the Georgian currency (Georgian Lari, GEL) against the US Dollar over the last 15 years, from 1996 (when the GEL was put into operation) up to the end of 2010. Dispersion characteristics, third and fourth statistical moments, and the probability distribution function of all existing currency exchange data sets, as well as annual exchange rate time series, have been calculated. A probability distribution fitting procedure has been used to fit observed exchange rate data to numerous hypothesized statistical models, and statistical goodness-of-fit methods were used to assess the quality of fitting and to identify the best candidate distribution for a given scenario. Scaling and dynamical features of exchange rate variation have been assessed using modern linear and nonlinear methods of time series analysis. For this, wavelet coefficients and detrended fluctuation exponents were calculated. Recurrence quantification analysis has been used to quantify the extent of order in exchange rate variation. The same scaling and dynamical analysis has been carried out for the whole timeframe and for one-year exchange rate data sets as well. Obtained results of the exchange rate variation calculations have been tested by the shuffled surrogate method. Time-scale dependence of national currency exchange rate variation was confirmed for the USD/GEL exchange rate data sets. Increased long-range correlations and extent of order in the variation of the USD/GEL exchange rate were found in time periods when the Georgian economy was affected by strong internal and international crises. This indicates a clear relationship between internal conditions, external influences, and Georgian domestic economic processes. According to the conducted analysis, the general features of statistical and dynamical changes in the processes of a small economy are quite similar whether they are caused by internal development, external influences, or their combination. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
32. Overflow Risk Analysis for Designing a Nonpoint Sources Control Detention.
- Author
-
Chi Hyun Choi, Seonju Cho, Moo Jong Park, and Sangdan Kim
- Subjects
RISK assessment, PARAMETER estimation, DISTRIBUTION (Probability theory), NATURE reserves, RUNOFF
- Abstract
This paper presents a design method by which the overflow risk related to a detention for managing nonpoint pollutant sources in urban areas can be evaluated. The overall overflow risk of a nonpoint pollutant sources control detention can be estimated by inherent overflow risk and operational overflow risk. For the purpose of calculating overflow risk, the 3-parameter mixed exponential distribution is applied to describe the probability distribution of rainfall event depth. As a rainfall-runoff calculation procedure required for deriving a rainfall capture curve, the U.S. Natural Resources Conservation Service runoff curve number method is applied to consider the nonlinearity of the rainfall-runoff relation. Finally, the detention overflow risk is assessed with respect to the detention design capacity and drainage time. The proposed overflow risk assessment is expected to provide a baseline to determine quantitative parameters in designing a nonpoint sources control detention. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
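Result 32 uses the U.S. NRCS (SCS) curve number method for its rainfall-runoff step. The standard curve number equations are short enough to show directly; the curve number and storm depth below are hypothetical.

```python
def scs_runoff_depth(rainfall_in, curve_number):
    """NRCS curve number method: S = 1000/CN - 10 (inches of potential retention),
    initial abstraction Ia = 0.2*S, and Q = (P - Ia)^2 / (P - Ia + S) for P > Ia."""
    S = 1000.0 / curve_number - 10.0
    Ia = 0.2 * S
    if rainfall_in <= Ia:
        return 0.0
    return (rainfall_in - Ia) ** 2 / (rainfall_in - Ia + S)

# Hypothetical urban catchment: CN = 85 and a 2.5-inch storm
print("runoff depth (in):", round(scs_runoff_depth(2.5, 85), 2))
```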
33. Las características educativas de los emigrantes mexicanos a Estados Unidos. [The educational characteristics of Mexican emigrants to the United States.]
- Author
-
Mendoza, Alfredo Cuecuecha
- Subjects
EMIGRATION & immigration, MEXICANS, ECONOMETRICS, STATISTICAL correlation, DISTRIBUTION (Probability theory)
- Abstract
Copyright of EconoQuantum is the property of Universidad de Guadalajara, Centro Universitario de Ciencias Economico Administrativas and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2010
- Full Text
- View/download PDF
34. LÉVY-STABLE PRODUCTIVITY SHOCKS.
- Author
-
EDOARDO GAFFEO
- Subjects
GROWTH rate, ECONOMIC sectors, BUSINESS cycles, TECHNOLOGY, DISTRIBUTION (Probability theory)
- Abstract
In this paper, we analyze the distribution of TFP growth rates at the four-digit sectoral level for the United States. We find that, contrary to the usual assumption employed in the literature on business cycles theory, technological shocks are not normally distributed. Instead, a Lévy-stable distribution with a divergent variance returns a better fit to the data. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
35. Identification of Search Models using Record Statistics.
- Author
-
BARLEVY, GADI
- Subjects
SOCIAL surveys, WAGES, MATHEMATICAL statistics, DISTRIBUTION (Probability theory), YOUTH employment, YOUTH, SURVEYS
- Abstract
This paper shows how record-value theory, a branch of statistics that deals with the timing and magnitude of extreme values in sequences of random variables, can be used to recover features of the wage offer distribution in conventional search models. Using National Longitudinal Survey of Youth (NLSY) wage data, I show that the data are not compatible with specifications for the offer distribution characterized by extreme negative skewness. In addition, I show that my approach can be used to construct a bound on the returns to job seniority. My results suggest that job seniority plays only a minor role in the wage growth of the workers surveyed in the NLSY. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
36. Effects of Tides on Maximum Tsunami Wave Heights: Probability Distributions.
- Author
-
Mofjeld, Harold O., González, Frank I., Titov, Vasily V., Venturato, Angie J., and Newman, Jean C.
- Subjects
DISTRIBUTION (Probability theory), OCEAN waves, TSUNAMIS, TIDES, STATISTICAL correlation, ANALYSIS of variance, NUMERICAL analysis, SCIENTIFIC experimentation
- Abstract
A theoretical study was carried out to understand how the probability distribution for maximum wave heights (η_m) during tsunamis depends on the initial tsunami amplitude (A) and the tides. It was assumed that the total wave height is the linear sum of the tides and tsunami time series in which the latter is decaying exponentially in amplitude with an e-folding time of 2.0 days, based on the behavior of observed Pacific-wide tsunamis. Direct computations were made to determine the statistics of maximum height for a suite of different arrival times and initial tsunami amplitudes. Using predicted tides for 1992 when the lunar nodal f factors were near unity during the present National Tidal Datum Epoch 1983–2001, the results show that when A is small compared with the tidal range the probability density function (PDF) of the difference η_m − A is closely confined in height near mean higher high water (MHHW). The η_m − A PDF spreads in height and its mean height η_o − A decreases, approaching the PDF of the tides and MSL, respectively, when A becomes large compared with the tidal range. A Gaussian form is found to be a close approximation to the η_m − A PDF over much of the amplitude range; associated parameters for 30 coastal stations along the U.S. West Coast, Alaska, and Hawaii are given in the paper. The formula should prove useful in probabilistic mapping of coastal tsunami flooding. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
37. A retail sampling approach to assess impact of geographic concentrations on probative value of comparative bullet lead analysis.
- Author
-
Cole, Simon A., Tobin, William A., Boggess, Lyndsay N., and Stern, Hal S.
- Subjects
BULLETS, LEAD, DISTRIBUTION (Probability theory), DISPERSION (Chemistry)
- Abstract
The probative value of comparative bullet lead analysis (CBLA), a now discontinued technique that was used by the Federal Bureau of Investigation for more than 30 years, has been hotly debated over the last several years. One issue that has received relatively little attention concerns the degree of geographic dispersion of bullets as they pass from manufacturers to retailers. Proponents and critics of CBLA alike agree that geographic distribution is such a major consideration, if not a predominant one, that it could significantly diminish, or completely erode, the probative value of a CBLA ‘match’ or, in some cases, even make a match counter-probative. The inattention to this issue to date appears to be a consequence of lack of data, rather than lack of importance. Until now, no datum concerning bullet distribution has been presented in the public domain, critically hampering the proper estimation of the probative value of a CBLA match. In this paper, we use manufacturer packing codes on boxes of bullets in retail outlets at four sites in the United States as a surrogate measure of bullet lead compositions to gauge local retail bullet distribution. Using a weighted average packing code match probability, we found very high degrees of geographic concentration of bullet packing codes. Although these findings can only offer a rough estimate of the degree of geographic concentration of actual chemical compositions of bullets, they are sufficient to establish that geographic concentration does, in fact, exist. Such a concentration would have a significant impact on the probative value of any claimed CBLA match. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
38. Monitoring spatial maxima.
- Author
-
Rogerson, Peter
- Subjects
CUSUM technique, DISTRIBUTION (Probability theory), MAPS, DEVIATION (Statistics), CANCER
- Abstract
When assessing maps consisting of comparable regional values, it is of interest to know whether the peak, or maximum value, is higher than it would likely be by chance alone. Peaks on maps of crime or disease might be attributable to random fluctuation, or they might be due to an important deviation from the baseline process that produces the regional values. This paper addresses the situation where a series of such maps are observed over time, and it is of interest to detect statistically significant deviations between the observed and expected peaks as quickly as possible. The Gumbel distribution is used as a model for the statistical distribution of extreme values; this distribution does not require the underlying distributions of regional values to be either normal, known, or identical. Cumulative sum surveillance methods are used to monitor these Gumbel variates, and these methods are also extended for use when monitoring smoothed regional values (where the quantity monitored is a weighted sum of values in the immediate geographical neighborhood). The new methods are illustrated by using data on breast cancer mortality for the 217 counties of the northeastern United States, and prostate cancer mortality for the entire United States, during the period 1968-1998. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
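A compact sketch of the monitoring idea in entry 38: standardize each map's observed maximum against a Gumbel null distribution and track a one-sided CUSUM that signals when maxima drift above what chance alone would produce. The Gumbel parameters, reference value K, and threshold H below are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.stats import gumbel_r, norm

# Assumed Gumbel null for the map maximum (location, scale) -- illustrative values.
MU, BETA = 3.0, 0.8
K, H = 0.5, 4.0          # CUSUM reference value and signalling threshold (assumed)

def gumbel_to_z(x):
    """Map an observed maximum to a standard-normal score under the Gumbel null."""
    u = gumbel_r.cdf(x, loc=MU, scale=BETA)
    return norm.ppf(u)

def cusum(maxima):
    """One-sided CUSUM on Gumbel-standardized maxima; returns the index of the first signal."""
    s = 0.0
    for i, x in enumerate(maxima):
        s = max(0.0, s + gumbel_to_z(x) - K)
        if s > H:
            return i
    return None

# Simulated yearly map maxima: in control for 10 years, then shifted upward.
rng = np.random.default_rng(1)
in_control = gumbel_r.rvs(loc=MU, scale=BETA, size=10, random_state=rng)
shifted = gumbel_r.rvs(loc=MU + 1.5, scale=BETA, size=10, random_state=rng)
print("signal at index:", cusum(np.concatenate([in_control, shifted])))
```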
39. A Method for Proxying a Respondent's Religious Background.
- Author
-
Hofrenning, Stella Koutroumanes and Chiswick, Barry R.
- Subjects
RELIGION ,ALGORITHMS ,PRIVATE schools ,PUBLIC schools ,DISTRIBUTION (Probability theory) ,PROBABILITY theory ,HOUSEHOLDS - Abstract
This paper develops an algorithm for the probability distribution of a respondent's religion in microdata (including the decennial census) in which there are data on ancestry but not on religion. A frequency distribution of religion by ancestry is generated from the General Social Survey and matched by ancestry groups in the U.S. decennial census. The fruitfulness of the procedure is demonstrated through an analysis of the effect of alternative measures of religion on the household's choice of public versus private schooling for children. This method is useful to any researcher wanting to distinguish religious affiliation when only ancestry data are available. [ABSTRACT FROM AUTHOR]
- Published
- 1999
- Full Text
- View/download PDF
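A toy sketch of the proxying algorithm entry 39 describes: build a religion-by-ancestry frequency table from a survey that records both variables, then attach that conditional distribution to census-style records that report only ancestry. The survey rows, ancestry groups, and religion labels below are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical survey records containing both ancestry and religion.
survey = [
    ("Greek", "Orthodox"), ("Greek", "Orthodox"), ("Greek", "None"),
    ("Irish", "Catholic"), ("Irish", "Catholic"), ("Irish", "Protestant"),
]

# Step 1: religion frequency distribution conditional on ancestry.
table = defaultdict(Counter)
for ancestry, religion in survey:
    table[ancestry][religion] += 1

def religion_distribution(ancestry):
    counts = table[ancestry]
    total = sum(counts.values())
    return {rel: n / total for rel, n in counts.items()}

# Step 2: attach the proxy distribution to records that report only ancestry.
census_records = [{"id": 1, "ancestry": "Greek"}, {"id": 2, "ancestry": "Irish"}]
for rec in census_records:
    rec["p_religion"] = religion_distribution(rec["ancestry"])
    print(rec)
```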
40. An Operational Critique of Detection Laws.
- Author
-
Koopman, B. O.
- Subjects
DISTRIBUTION (Probability theory) ,MATHEMATICAL optimization ,MATHEMATICAL analysis ,SEARCH theory ,OPERATIONS research ,ALGEBRA - Abstract
This paper applies the test of operational meaningfulness to the many detection laws that have been formulated and used in the mathematical optimization of search, and given as generalizations of those first developed in the United States Navy during and immediately following World War II. The word "operational" is used both in the sense of practical O.R. and in that of P. W. Bridgman's well-known requirement: that in applying mathematics to the material world, the physical preconditions implied in the quantities used, the operations for measuring them, and their laws of consistency must be set forth explicitly. It is submitted that this criterion will orient O.R. work toward useful developments in the search theory and may serve to identify as such those theories that are of chief interest as developments in pure mathematics. Finally, we note that the requirement that every numerical assumption of probability distributions in a study of the outside world must be based on objective evidence is perfectly consistent with the "subjective" conception of probability, viz., that its meaning is apprehended intuitively--the former measures what the latter defines. [ABSTRACT FROM AUTHOR]
- Published
- 1979
- Full Text
- View/download PDF
41. An Alternative Approach to Dietary Exposure Assessment.
- Author
-
Harrison, Stanley L., Muenz, Larry R., Petersen, Barbara J., and Barraj, Leila M.
- Subjects
ENVIRONMENTAL risk assessment ,DISTRIBUTION (Probability theory) ,ENVIRONMENTAL protection - Abstract
The method of dietary exposure assessment currently used by the Environmental Protection Agency (EPA), the Dietary Residue Evaluation System (DRES), combines a consumption distribution derived from the United States Department of Agriculture (USDA) 1977-1978 Nationwide Food Consumption Survey (NFCS) with a single estimate of residue level. The National Academy of Sciences recommended that EPA incorporate both the distribution of residues and the distribution of consumption into its exposure assessment methodology and proposed using a Monte Carlo approach. This paper presents an alternative method, the Joint Distributional Analysis (JDA), that combines the consumption and residue distributions without relying on random sampling or on fitting theoretical distributions, as the Monte Carlo method does. This method permits simultaneous analysis of the entire diet, including assessing exposure from residues in different foods. [ABSTRACT FROM AUTHOR]
- Published
- 1994
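A minimal sketch of the joint-distribution idea behind entry 41: instead of sampling, enumerate every combination of discretized consumption and residue levels for each food, form the exposure products with their joint probabilities, and convolve across foods. The food names, levels, and probabilities are illustrative, and foods are treated as independent here purely for simplicity; the paper's method need not make that assumption.

```python
from collections import defaultdict
from itertools import product

# Hypothetical discretized distributions: (value, probability) pairs.
foods = {
    "apple":  {"consumption": [(0.0, 0.5), (100.0, 0.5)],        # g/day
               "residue":     [(0.00, 0.8), (0.02, 0.2)]},        # mg/g
    "potato": {"consumption": [(0.0, 0.3), (200.0, 0.7)],
               "residue":     [(0.00, 0.9), (0.01, 0.1)]},
}

def exposure_pmf(food):
    """Exposure distribution for one food: all consumption x residue combinations."""
    pmf = defaultdict(float)
    for (c, pc), (r, pr) in product(food["consumption"], food["residue"]):
        pmf[c * r] += pc * pr
    return dict(pmf)

def convolve(pmf_a, pmf_b):
    """Distribution of the sum of two discrete exposures (independence assumed)."""
    out = defaultdict(float)
    for a, pa in pmf_a.items():
        for b, pb in pmf_b.items():
            out[a + b] += pa * pb
    return dict(out)

total = {0.0: 1.0}
for food in foods.values():
    total = convolve(total, exposure_pmf(food))

for exposure, p in sorted(total.items()):
    print(f"exposure {exposure:5.2f} mg/day: p = {p:.3f}")
```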
42. SYSTEM RELIABILITY PREDICTION BASED ON HISTORICAL DATA.
- Author
-
Usher, John S. and Alexander, Suraj M.
- Subjects
STATISTICAL correlation ,WEIBULL distribution ,DISTRIBUTION (Probability theory) ,FACTORIZATION ,MATHEMATICAL models - Abstract
This paper describes the development and implementation of a computerized reliability prediction model at the IBM facility located in Research Triangle Park, North Carolina. Through the analysis of historical life-test data, the model provides maximum likelihood estimates of the assumed Weibull life distributions of various types of components. The resulting component life distribution estimates are used to predict the reliability of new system configurations. This approach is based upon the well-known theory of competing risks. Our model, however, is unique in that it allows for the analysis of a pooled set of life data, i.e., life data from different types of systems, to obtain component estimates. This feature greatly generalizes the competing risks framework and hence offers advantages over the more traditional approach. We present the model and discuss various issues that were found to be critical to its successful implementation at IBM. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
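A brief sketch of the competing-risks bookkeeping that entry 42 builds on: when a system fails due to one component type, that time is an observed failure for that type and a right-censoring time for the other types, and a censored Weibull likelihood is then maximized per component type. The data below are simulated, and the optimizer setup is one reasonable choice, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, times, failed):
    """Negative Weibull log-likelihood with right censoring.
    failed[i] is True for an observed failure, False for a censored time."""
    shape, scale = np.exp(params)          # optimize on log scale to keep both positive
    z = times / scale
    log_pdf = np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape
    log_surv = -z ** shape
    return -np.sum(np.where(failed, log_pdf, log_surv))

def fit_weibull(times, failed):
    res = minimize(neg_log_lik, x0=np.log([1.0, np.mean(times)]),
                   args=(np.asarray(times, float), np.asarray(failed, bool)),
                   method="Nelder-Mead")
    return np.exp(res.x)                   # (shape, scale)

# Simulated pooled life data for one component type: some failures, some censored.
rng = np.random.default_rng(2)
true_shape, true_scale = 1.8, 1000.0
t_fail = true_scale * rng.weibull(true_shape, size=60)
censor = rng.uniform(300, 1500, size=60)
times = np.minimum(t_fail, censor)
failed = t_fail <= censor

print("estimated (shape, scale):", fit_weibull(times, failed))
```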
43. The Luria-Delbrück distribution.
- Author
-
Qi, Zheng
- Subjects
DISTRIBUTION (Probability theory) ,ESCHERICHIA coli ,GENETIC mutation ,BIOLOGICAL evolution - Abstract
The article presents an explanation of the origins of the Salvador Luria and Max Delbrück probability distribution and its role in studying evolutionary change in Escherichia coli in the U.S. Luria's experiment, known as the fluctuation test, estimates mutation rates and is introduced to biology students as one of the important biological experiments of the 20th century. The two biologists received the Nobel Prize in Physiology or Medicine on December 10, 1969.
- Published
- 2010
- Full Text
- View/download PDF
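One standard way to compute Luria-Delbrück probabilities, which underlies modern mutation-rate estimation from fluctuation-test data, is the Ma-Sandri-Sarkar recursion; the sketch below implements it and uses a crude grid search for the expected number of mutations per culture, m. The mutant counts and grid are illustrative inputs, not data from the article.

```python
import numpy as np

def luria_delbruck_pmf(m, n_max):
    """Luria-Delbruck probabilities p_0..p_n_max via the Ma-Sandri-Sarkar recursion,
    where m is the expected number of mutations per culture."""
    p = np.zeros(n_max + 1)
    p[0] = np.exp(-m)
    for n in range(1, n_max + 1):
        j = np.arange(n)                       # 0 .. n-1
        p[n] = (m / n) * np.sum(p[j] / ((n - j) * (n - j + 1)))
    return p

def log_likelihood(m, mutant_counts, n_max=500):
    """Log-likelihood of observed mutant counts for a given m (grid-search target)."""
    pmf = luria_delbruck_pmf(m, n_max)
    return float(np.sum(np.log(pmf[np.asarray(mutant_counts)])))

# Example: crude maximum-likelihood estimate of m over a grid (illustrative counts).
counts = [0, 1, 0, 3, 12, 0, 2, 1, 0, 5]
grid = np.linspace(0.1, 5.0, 50)
m_hat = grid[np.argmax([log_likelihood(m, counts) for m in grid])]
print("estimated m:", round(float(m_hat), 2))
```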
44. Development of an Extreme Wind-Driven Rain Climatology for the Southeastern United States Using 1-Min Rainfall and Peak Wind Speed Data.
- Author
-
Belcher, Brian N., DeGaetano, Arthur T., Masters, Forrest J., Crandell, Jay, and Morrison, Murray J.
- Subjects
WIND speed ,CLIMATOLOGY ,DISTRIBUTION (Probability theory) ,INSURED losses ,RAINFALL ,ENGINEERING standards ,HURRICANES - Abstract
A method is presented to obtain the climatology of extreme wind speeds coincident with the occurrence of rain. The simultaneous occurrence of wind and rain can force water through building wall components such as windows, resulting in building damage and insured loss. To quantify this hazard, extreme value distributions are fit to peak 3-s wind speed data recorded during 1-min intervals with specific reported rain intensities. This improves upon previous attempts to quantify the wind-driven rain hazard, which computed wind speed and rainfall-intensity probabilities independently and used hourly data that cannot assure simultaneity, since the peak wind represents only a several-second interval within the hour while the rain is accumulated over the entire hour. The method is applied across the southeastern United States, where the wind-driven rain hazard is most pronounced. For the lowest rainfall intensities, the computed wind speed extremes agree with published values that ignore rainfall occurrence. Such correspondence is desirable for aligning the rain-intensity-dependent wind speed return periods with established extreme wind statistics. Maximum 50-yr return-period wind speeds in conjunction with rainfall intensities ≥0.254 mm min⁻¹ exceed 45 m s⁻¹ in a swath from Oklahoma to the Gulf Coast and at stations along the immediate Atlantic coast. For rainfall intensities >2.54 mm min⁻¹, maximum 50-yr return-period wind speeds decrease to 35 m s⁻¹ but occur over a similar area. The methodology is also applied to stations outside the Southeast to demonstrate its applicability for incorporating the wind-driven rain hazard in U.S. building standards. Significance Statement: Rainfall driven horizontally by strong winds can penetrate building components and cladding. If unmanaged, this can directly damage the building and its contents and become a substantial component of insured losses to buildings. A climatology of wind-driven rain is developed from recently available 1-min weather observations that better represent the joint occurrence of the extremes that define wind-driven rain than hourly data do. This work is a first implementation of 1-min data in extreme-value statistical models, providing a basis for including wind-driven rain in United States building codes. This inclusion would be most significant in the hurricane-prone regions of the southeastern United States. The omission of wind-driven rain from U.S. building codes contrasts with its inclusion in Europe and Canada. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
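A minimal sketch of the extreme-value step described in entry 44: take the annual maxima of peak gusts recorded only during minutes at or above a chosen rain-intensity threshold, fit a Gumbel distribution, and read off a 50-yr return level. The data are simulated and the Gumbel choice is one common option; the paper's exact distributional form and fitting procedure may differ.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(3)

# Simulated stand-in for 30 years of annual maximum 3-s peak gusts (m/s) taken only
# from 1-min records whose rain intensity met the threshold (e.g. >= 0.254 mm/min).
annual_maxima = gumbel_r.rvs(loc=28.0, scale=4.0, size=30, random_state=rng)

loc, scale = gumbel_r.fit(annual_maxima)

def return_level(T_years, loc, scale):
    """Wind speed exceeded on average once every T_years under the fitted Gumbel."""
    return gumbel_r.ppf(1.0 - 1.0 / T_years, loc=loc, scale=scale)

print("fitted loc, scale:", round(loc, 1), round(scale, 1))
print("50-yr return-period gust (m/s):", round(return_level(50, loc, scale), 1))
```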
45. Revisiting the multifractality in stock returns and its modeling implications.
- Author
-
He, Shanshan and Wang, Yudong
- Subjects
- *
RATE of return on stocks , *ECONOMIC models , *ECONOMIC impact analysis , *STOCK exchanges , *DISTRIBUTION (Probability theory) - Abstract
In this paper, we investigate the multifractality of the Chinese and U.S. stock markets using a multifractal detrending moving average algorithm. The results show that stock returns in both markets are multifractal to a similar extent. We detect the source of multifractality and find that long-range correlations are one of the major sources of multifractality in the U.S. market but not in the Chinese market. The fat-tailed distribution plays a crucial role in the multifractality of both markets. As an innovation, we quantify the effect of extreme events on multifractality and find strong evidence of their contribution to multifractality. Furthermore, we investigate the usefulness of popular ARFIMA-GARCH models with skew-t distribution in capturing multifractality. Our results indicate that these models can capture only a fraction of multifractality. More complex models do not necessarily perform better than simple GARCH models in describing multifractality in stock returns. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
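A compact sketch of a moving-average detrending multifractal analysis in the spirit of entry 45: detrend the cumulative profile of returns with a centered moving average, compute q-order fluctuation functions over a range of window sizes, and estimate the generalized Hurst exponents h(q) from log-log slopes. This is a simplified illustration on simulated heavy-tailed data, not the exact algorithm or data of the paper.

```python
import numpy as np

def moving_average(y, n):
    """Centered moving average of length n (n assumed odd here for simplicity)."""
    kernel = np.ones(n) / n
    return np.convolve(y, kernel, mode="valid")   # length len(y) - n + 1

def hurst_q(x, q_values, window_sizes):
    """Generalized Hurst exponents h(q) from a simplified moving-average detrending."""
    profile = np.cumsum(x - np.mean(x))
    log_n, log_F = [], {q: [] for q in q_values}
    for n in window_sizes:
        trend = moving_average(profile, n)
        resid = profile[(n - 1) // 2 : (n - 1) // 2 + len(trend)] - trend
        n_seg = len(resid) // n
        seg = resid[: n_seg * n].reshape(n_seg, n)
        F2 = np.mean(seg ** 2, axis=1)             # per-segment mean squared residual
        log_n.append(np.log(n))
        for q in q_values:
            log_F[q].append(np.log(np.mean(F2 ** (q / 2.0)) ** (1.0 / q)))
    return {q: np.polyfit(log_n, log_F[q], 1)[0] for q in q_values}

# Illustrative input: simulated heavy-tailed "returns" (Student-t), not real market data.
rng = np.random.default_rng(4)
returns = rng.standard_t(df=3, size=5000)
h = hurst_q(returns, q_values=[-3, -1, 2, 3], window_sizes=[11, 21, 41, 81, 161])
print({q: round(v, 2) for q, v in h.items()})
# A spread of h(q) across q indicates multifractality; a flat h(q) indicates monofractality.
```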
46. INTERVAL METHODS IN KNOWLEDGE REPRESENTATION.
- Author
-
Kreinovich, Vladik
- Subjects
INTERVAL analysis ,KNOWLEDGE representation (Information theory) ,STOCHASTIC analysis ,UNCERTAINTY (Information theory) ,STRUCTURAL engineering ,DISTRIBUTION (Probability theory) ,ALGORITHMS - Abstract
Please send your abstracts (or copies of papers that you want to see reviewed here) to vladik@utep.edu, or by regular mail to Vladik Kreinovich, Department of Computer Science, University of Texas at El Paso, El Paso, TX 79968, USA. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
47. Fluvial Flood Losses in the Contiguous United States Under Climate Change.
- Author
-
Rashid, M. M., Wahl, T., Villarini, G., and Sharma, A.
- Subjects
FLOOD damage ,GREENHOUSE gases ,FLOODS ,FLOOD risk ,CLIMATE change models ,DISTRIBUTION (Probability theory) ,PROPERTY damage - Abstract
Flooding is one of the most devastating natural disasters causing significant economic losses. One of the dominant drivers of flood losses is heavy precipitation, with other contributing factors such as built environments and socio‐economic conditions superimposed on it. To better understand the risk profile associated with this hazard, we develop probabilistic models to quantify the future likelihood of fluvial flood‐related property damage exceeding a critical threshold (i.e., high property damage) at the state level across the conterminous United States. The model is conditioned on indicators representing heavy precipitation amount and frequency derived from observed and downscaled precipitation. The likelihood of high property damage is estimated from the conditional probability distribution of annual total property damage, which is derived from the joint probability of the property damage and heavy precipitation indicators. Our results indicate an increase in the probability of high property damage (i.e., exceedance of the 70th percentile of observed annual property damage for each state) in the future. A higher probability of high property damage is projected to be clustered in the states across the western and south‐western United States, and parts of the U.S. Northwest and the northern Rockies and Plains. Depending on the state, the mean annual probability of high property damage in these regions could range from 38% to 80% and from 46% to 95% at the end of the century (2090s) under RCP4.5 and RCP8.5 scenarios, respectively. This is equivalent to a 20%–40% increase in the probability compared to the historical period 1996–2005. Results show that uncertainty in the projected probability of high property damage ranges from 14% to 35% across the states. The spatio‐temporal variability of the uncertainty across the states and three future decades (i.e., 2050s, 2070s, and 2090s) exhibits nonstationarity, which is driven by the uncertainty associated with the probabilistic prediction models and climate change scenarios. Plain Language Summary: Floods create significant economic losses in the United States and many other places across the world. Floods and flood‐related losses are expected to change due to changes in heavy precipitation in a warmer climate. Inferring how (including when and where) flood‐related losses could change in the future is crucial because of significant implications for flood risk management, insurance, and infrastructure resilience. We develop probabilistic models to project the likelihood of (fluvial) flood‐related high property damage (annual total property damage exceeding a critical threshold) conditioned on precipitation indicators under two greenhouse gas emission scenarios. We estimate a relatively higher probability of high property damage for the states across the western and south‐western U.S. and parts of the U.S. Northwest and the northern Rockies and Plains, where projected changes range from 46% to 95% for a high‐emission scenario. In these regions, future changes in the probability of high property damage compared to the historical period vary from 20% to 40%. Overall, our results identify regions with higher likelihood of high property damage in the future, and they are useful for developing long‐term planning and resource mobilization, adaptation, and insurance instruments.
Key Points: Future likelihood of flood‐related property damage is quantified using probabilistic models conditioned on precipitation indicators. Increase in the probability of flood‐related property damage is projected toward the mid and end of the century across the US. Nonstationary uncertainties in the projected flood‐related property damage originate from the probabilistic models and climate scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
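A small sketch of the conditional-probability framing in entry 47: define "high damage" as annual damage above the 70th percentile of the observed record and estimate its probability conditional on a heavy-precipitation indicator by simple binning. The synthetic data and the binning approach are illustrative; the paper fits joint probability distributions rather than using raw bins.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stand-ins: annual heavy-precipitation indicator and annual property damage,
# constructed so that damage tends to rise with the indicator (illustrative only).
years = 60
precip_index = rng.gamma(shape=4.0, scale=25.0, size=years)            # e.g. mm from heavy-rain days
damage = np.exp(0.02 * precip_index + rng.normal(0, 0.5, size=years))  # arbitrary units

threshold = np.percentile(damage, 70)        # "high damage" = above the 70th percentile
high = damage > threshold

# Conditional probability of high damage within precipitation-indicator terciles.
bins = np.digitize(precip_index, np.quantile(precip_index, [0.33, 0.66]))
for b in range(3):
    in_bin = bins == b
    print(f"indicator tercile {b + 1}: P(high damage) = {high[in_bin].mean():.2f}")
```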
48. Closure to “Probability Distribution of Low Streamflow Series in the United States” by Charles N. Kroll and Richard M. Vogel.
- Author
-
Kroll, Charles N. and Vogel, Richard M.
- Subjects
STREAMFLOW ,DISTRIBUTION (Probability theory) - Abstract
Clarifies issues raised concerning an article on the probability distribution of low streamflow series in the U.S. published in the 'Journal of Hydrologic Engineering.' Choice of distribution; Issues of distributional fit addressed in the paper.
- Published
- 2003
- Full Text
- View/download PDF
49. Sensitivity to Madden-Julian Oscillation variations on heavy precipitation over the contiguous United States.
- Author
-
Jones, Charles and Carvalho, Leila M. V.
- Subjects
- *
SENSITIVITY analysis , *MADDEN-Julian oscillation , *PRECIPITATION forecasting , *CONTIGUOUS zones (Law of the sea) , *STORMS , *DISTRIBUTION (Probability theory) - Abstract
The Madden-Julian Oscillation (MJO) is the most prominent mode of tropical intraseasonal variability in the climate system and has worldwide influences on the occurrences and forecasts of heavy precipitation. This paper investigates the sensitivity of precipitation over the contiguous United States (CONUS) to MJO variations in a case study (boreal 2004-05 winter). Several major storms affected the western and eastern CONUS, producing substantial economic and social impacts including loss of life. The Weather Research and Forecasting (WRF) model is used to perform experiments to test the significance of the MJO amplitude. The control simulation uses the MJO amplitude observed by reanalysis, whereas the amplitude is modified in the perturbation experiments. WRF realistically simulates the precipitation variability over the CONUS, although large biases occur over the western and midwestern United States. Daily precipitation is aggregated into western, central, and eastern sectors, and the frequency distribution is analyzed. Increases in MJO amplitude produce moderate increases in the median and interquartile range and large and robust increases in extreme (90th and 95th percentiles) precipitation. The MJO amplitude clearly affects the transport of moisture from the tropical Pacific and Gulf of Mexico into North America, providing moisture-rich air masses and the dynamical forcing that contributes to heavy precipitation. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
50. Extreme Financial cycles.
- Author
-
Candelon, Bertrand, Gaulier, Guillaume, and Hurlin, Christophe
- Subjects
INVESTORS ,FINANCE ,EXTREME value theory ,CALCULUS ,ECONOMIC policy ,DISTRIBUTION (Probability theory) ,BUSINESSPEOPLE - Abstract
Copyright of Revue d'Economie Politique is the property of Editions Dalloz Sirrey and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2012
- Full Text
- View/download PDF