2,183 results
Search Results
2. A Note on a Paper by Wong and Heyde
- Author
-
Mikhail Urusov and Aleksandar Mijatović
- Subjects
Statistics and Probability ,Statistics::Theory ,Pure mathematics ,60G44, 60G48, 60H10, 60J60 ,General Mathematics ,Applied probability ,01 natural sciences ,FOS: Economics and business ,010104 statistics & probability ,60G48 ,FOS: Mathematics ,60G44 ,0101 mathematics ,60J60 ,Mathematics ,Local martingales versus true martingales ,010102 general mathematics ,Probability (math.PR) ,stochastic exponential ,Exponential function ,Mathematik ,60H10 ,Statistics, Probability and Uncertainty ,Martingale (probability theory) ,Quantitative Finance - General Finance ,General Finance (q-fin.GN) ,Mathematics - Probability ,Counterexample - Abstract
In this note we re-examine the analysis of the paper "On the martingale property of stochastic exponentials" by B. Wong and C.C. Heyde, Journal of Applied Probability, 41(3):654-664, 2004. Some counterexamples are presented and alternative formulations are discussed., Comment: To appear in Journal of Applied Probability, 11 pages
- Published
- 2011
3. Analysis of the early COVID-19 epidemic curve in Germany by regression models with change points
- Author
-
Andreas Bender, Michael Höhle, Helmut Küchenhoff, and Felix Günther
- Subjects
Male ,Coronavirus disease 2019 (COVID-19) ,Epidemiology ,Detailed data ,Discount points ,Poisson distribution ,01 natural sciences ,symbols.namesake ,010104 statistics & probability ,03 medical and health sciences ,Age Distribution ,0302 clinical medicine ,Germany ,Change point ,Humans ,Turning point ,030212 general & internal medicine ,0101 mathematics ,Aged ,Aged, 80 and over ,Original Paper ,Series (stratigraphy) ,SARS-CoV-2 ,COVID-19 ,Outbreak ,Robert koch institute ,Bayes Theorem ,Regression analysis ,Infectious Diseases ,Geography ,symbols ,Change points ,Regression Analysis ,Female ,sense organs ,Demography ,Federal state - Abstract
We analyze the Covid-19 epidemic curve from March to end of April 2020 in Germany. We use statistical models to estimate the number of cases with disease onset on a given day and use back-projection techniques to obtain the number of new infections per day. The respective time series are analyzed by a Poisson trend regression model with change points. The change points are estimated directly from the data without further assumptions. We carry out the analysis for the whole of Germany and the federal state of Bavaria, where we have more detailed data. Both analyses show a major change between March 9th and 13th for the time series of infections: from a strong increase to a stagnation or a slight decrease. Another change was found between March 24th and March 31st, where the decline intensified. These two major changes can be related to different governmental measures. On March, 11th, Chancellor Merkel appealed for social distancing in a press conference with the Robert Koch Institute (RKI) and a ban on major events with more than 1000 visitors (March 10th) was issued. The other change point at the end of March could be related to the shutdown in Germany. Our results differ from those by other authors as we take into account the reporting delay, which turned out to be time dependent and therefore changes the structure of the epidemic curve compared to the curve of newly reported cases.
- Published
- 2021
4. Nightlife clusters of coronavirus disease in Tokyo between March and April 2020
- Author
-
K. Kanda, M. Ujiie, Masahiro Ishikane, Saho Takaya, K. Nakamura, T. Suzuki, A. Kawashima, K. Yamamoto, T. Baba, Y. Akiyama, H. Nomoto, S. Hikida, T. Nakamoto, S. Morioka, Sho Saito, Norio Ohmagari, N. Kinoshita, Shinya Tsuzuki, T. Ito, Y. Miyazato, J. Tanuma, K. Ohara, A. Okuhama, S. Ide, Satoshi Kutsuna, and Kayoko Hayakawa
- Subjects
Adult ,Male ,2019-20 coronavirus outbreak ,Epidemiology ,Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) ,Pneumonia, Viral ,Disease ,medicine.disease_cause ,01 natural sciences ,010104 statistics & probability ,03 medical and health sciences ,Betacoronavirus ,0302 clinical medicine ,Emerging infections ,Pcr test ,medicine ,Humans ,Short Paper ,030212 general & internal medicine ,0101 mathematics ,Tokyo ,Pandemics ,Coronavirus ,Nightlife ,business.industry ,SARS-CoV-2 ,emerging infections ,infectious disease epidemiology ,Commerce ,COVID-19 ,Middle Aged ,Infectious Diseases ,Propensity score matching ,Female ,Human medicine ,business ,Coronavirus Infections ,Demography - Abstract
We analysed associations between exposure to nightlife businesses and severe acute respiratory syndrome coronavirus 2 PCR test results at a tertiary hospital in Tokyo between March and April 2020. A nightlife group was defined as those who had worked at or visited the businesses. We included 1517 individuals; 196 (12.9%) were categorised as the nightlife group. After propensity score matching, the proportion of positive PCR tests in the nightlife group was significantly higher than that in the non-nightlife group (nightlife, 63.8%; non-nightlife, 23.0%; P < 0.001). An inclusive approach to mitigate risks related to the businesses needs to be identified.
- Published
- 2020
- Full Text
- View/download PDF
5. Is there association between human development index and tuberculosis mortality risk? Evidence from a spatial analysis study in the south of Brazil
- Author
-
Ricardo Alexandre Arcêncio, B. M.A. Gabardo, Danielle Talita dos Santos, Aylana de Souza Belchior, Thaís Zamboni Berra, Ione Carvalho Pinto, A Queiroz, Luana Seles Alves, Marcela Paschoal Popolin, M. Yamamura, Pedro Fredemir Palha, Antônio Carlos Vieira Ramos, Luiz Henrique Arroyo, Carla Nunes, Marcos Augusto Moraes Arcoverde, Marina Jorge de Miranda, and Elma Mathias Dessunti
- Subjects
Risk ,Epidemiology ,Bayesian probability ,Growth ,Bivariate analysis ,01 natural sciences ,010104 statistics & probability ,03 medical and health sciences ,0302 clinical medicine ,Risk Factors ,Humans ,Tuberculosis ,030212 general & internal medicine ,Social determinants of health ,Human Development Index ,0101 mathematics ,Spatial analysis ,Spatial Analysis ,Original Paper ,Mortality rate ,Ecological study ,Bayes Theorem ,Infectious Diseases ,Geography ,Socioeconomic Factors ,Relative risk ,DESENVOLVIMENTO HUMANO ,Brazil ,Demography - Abstract
The goal of this study was to analyse the spatial pattern of tuberculosis (TB) mortality using different approaches, namely: mortality rates (MR), spatial relative risks (RR) and Bayesian rates (Global and Local) and their association with human development index (HDI), Global and its three dimensions: education, longevity and income. An ecological study was developed in Curitiba, Brazil based on data from Mortality Information System (2008–2014). Spatial scan statistics were used to compute RR and identify high-risk clusters. Bivariate Local Indicator of Spatial Associations was used to assess associations. MR ranged between 0 and 25.24/100.000 with a mean (standard deviation) of 1.07 (2.66). Corresponding values for spatial RR were 0–27.46, 1.2 (2.99) and for Bayesian rates (Global and Local) were 0.49–1.66, 0.90 (0.19) and 0–6.59, 0.98 (0.80). High-risk clusters were identified for all variables, except for HDI-income and Global Bayesian rate. Significant negative spatial relations were found between MR and income; between RR and HDI global, longevity and income; and Bayesian rates with all variables. Some areas presented different patterns: low social development/low risk and high risk/high development. These results demonstrate that social development variables should be considered, in mortality due TB.
- Published
- 2018
6. Different latent class models were used and evaluated for assessing the accuracy of campylobacter diagnostic tests: overcoming imperfect reference standards?
- Author
-
J Asselineau, P Perez, Emilie Bessède, Cécile Proust-Lima, A Paye, Bordeaux population health (BPH), and Université de Bordeaux (UB)-Institut de Santé Publique, d'Épidémiologie et de Développement (ISPED)-Institut National de la Santé et de la Recherche Médicale (INSERM)
- Subjects
sparseness ,Epidemiology ,Computer science ,biostatistics ,Enzyme-Linked Immunosorbent Assay ,Residual ,medicine.disease_cause ,Polymerase Chain Reaction ,Sensitivity and Specificity ,01 natural sciences ,Diagnosis, Differential ,Feces ,010104 statistics & probability ,03 medical and health sciences ,0302 clinical medicine ,Campylobacter Infections ,Statistics ,medicine ,Humans ,030212 general & internal medicine ,0101 mathematics ,Reference standards ,Parametric statistics ,Immunoassay ,Original Paper ,Models, Statistical ,Diagnostic Tests, Routine ,imperfect gold standard ,Campylobacter ,Reference Standards ,Class (biology) ,Latent class model ,3. Good health ,Infectious Diseases ,Conditional independence ,Latent Class Analysis ,USMR ,Gastrointestinal Infection ,[SDV.SPEE]Life Sciences [q-bio]/Santé publique et épidémiologie ,diagnostic accuracy ,France ,Imperfect ,latent class model ,Software - Abstract
In the absence of perfect reference standard, classical techniques result in biased diagnostic accuracy and prevalence estimates. By statistically defining the true disease status, latent class models (LCM) constitute a promising alternative. However, LCM is a complex method which relies on parametric assumptions, including usually a conditional independence between tests and might suffer from data sparseness. We carefully applied LCMs to assess new campylobacter infection detection tests for which bacteriological culture is an imperfect reference standard. Five diagnostic tests (culture, polymerase chain reaction and three immunoenzymatic tests) of campylobacter infection were collected in 623 patients from Bordeaux and Lyon Hospitals, France. Their diagnostic accuracy were estimated with standard and extended LCMs with a thorough examination of models goodness-of-fit. The model including a residual dependence specific to the immunoenzymatic tests best complied with LCM assumptions. Asymptotic results of goodness-of-fit statistics were substantially impaired by data sparseness and empirical distributions were preferred. Results confirmed moderate sensitivity of the culture and high performances of immunoenzymatic tests. LCMs can be used to estimate diagnostic tests accuracy in the absence of perfect reference standard. However, their implementation and assessment require specific attention due to data sparseness and limitations of existing software.
- Published
- 2018
7. Zero-inflated negative binomial mixed model: an application to two microbial organisms important in oesophagitis
- Author
-
Brandie D. Wagner, Sophie Fillon, Rui Fang, and J.K. Harris
- Subjects
0301 basic medicine ,Mixed model ,Haemophilus Infections ,Epidemiology ,Haemophilus ,Negative binomial distribution ,Biology ,01 natural sciences ,010104 statistics & probability ,03 medical and health sciences ,Statistics ,Esophagitis ,Humans ,0101 mathematics ,Models, Statistical ,Small number ,Confounding ,Multilevel model ,Fusobacterium ,Random effects model ,Original Papers ,030104 developmental biology ,Infectious Diseases ,Fusobacterium Infections ,Identification (biology) ,Zero inflated negative binomial - Abstract
SUMMARYAltered microbial communities are thought to play an important role in eosinophilic oesophagitis, an allergic inflammatory condition of the oesophagus. Identification of the majority of organisms present in human-associated microbial communities is feasible with the advent of high throughput sequencing technology. However, these data consist of non-negative, highly skewed sequence counts with a large proportion of zeros. In addition, hierarchical study designs are often performed with repeated measurements or multiple samples collected from the same subject, thus requiring approaches to account for within-subject variation, yet only a small number of microbiota studies have applied hierarchical regression models. In this paper, we describe and illustrate the use of a hierarchical regression-based approach to evaluate multiple factors for a small number of organisms individually. More specifically, the zero-inflated negative binomial mixed model with random effects in both the count and zero-inflated parts is applied to evaluate associations with disease state while adjusting for potential confounders for two organisms of interest from a study of human microbiota sequence data in oesophagitis.
- Published
- 2016
8. Spatio-temporal modelling of foot-and-mouth disease outbreaks
- Author
-
Z. Abas, Chrisovalantis Malesios, K. Dadousis, Polychronis Kostoulas, Theodoros Koutroumanidis, and Nikolaos Demiris
- Subjects
Swine ,040301 veterinary sciences ,Epidemiology ,Cattle Diseases ,Sheep Diseases ,Disease ,Environment ,Models, Biological ,01 natural sciences ,Disease Outbreaks ,0403 veterinary science ,010104 statistics & probability ,medicine ,Animals ,0101 mathematics ,Environmental noise ,Socioeconomics ,Swine Diseases ,Goat Diseases ,Sheep ,Greece ,Foot-and-mouth disease ,Goats ,Outbreak ,04 agricultural and veterinary sciences ,medicine.disease ,Original Papers ,Infectious Diseases ,Geography ,Foot-and-Mouth Disease Virus ,Multicollinearity ,Foot-and-Mouth Disease ,Cattle - Abstract
SUMMARYWe present and analyse data collected during a severe epidemic of foot-and-mouth disease (FMD) that occurred between July and September 2000 in a region of northeastern Greece with strategic importance since it represents the southeastern border of Europe and Asia. We implement generic Bayesian methodology, which offers flexibility in the ability to fit several realistically complex models that simultaneously capture the presence of ‘excess’ zeros, the spatio-temporal dependence of the cases, assesses the impact of environmental noise and controls for multicollinearity issues. Our findings suggest that the epidemic was mostly driven by the size and the animal type of each farm as well as the distance between farms while environmental and other endemic factors were not important during this outbreak. Analyses of this kind may prove useful to informing decisions related to optimal control measures for potential future FMD outbreaks as well as other acute epidemics such as FMD.
- Published
- 2016
9. Latent class regression models for simultaneously estimating test accuracy, true prevalence and risk factors for Brucella abortus
- Author
-
Amely Campe, D.A. Abernethy, Matthias Greiner, and Fraser Menzies
- Subjects
Male ,Pathology ,medicine.medical_specialty ,040301 veterinary sciences ,Epidemiology ,Brucella abortus ,Northern Ireland ,Biology ,Sensitivity and Specificity ,01 natural sciences ,Serology ,0403 veterinary science ,Brucellosis, Bovine ,010104 statistics & probability ,Predictive Value of Tests ,Risk Factors ,Statistics ,Covariate ,Prevalence ,medicine ,Animals ,Serologic Tests ,0101 mathematics ,Diagnostic Tests, Routine ,Sampling (statistics) ,Diagnostic test ,04 agricultural and veterinary sciences ,Gold standard (test) ,Original Papers ,Test (assessment) ,Cross-Sectional Studies ,Infectious Diseases ,Population study ,Cattle ,Female - Abstract
SUMMARYIn 2003/2004 a field trial was conducted in Northern Ireland to assess the diagnostic accuracy of six serological tests for bovine brucellosis caused by Brucella abortus. Whereas between-test comparisons have been used to calculate test performances so far, the present study used a latent class approach to estimate diagnostic test accuracy parameters in the absence of a gold standard for these six tests simultaneously and to estimate the true prevalence, while accounting for clustering in the study population and risk factors for true prevalence. Results obtained in this study with regard to prevalence, sensitivity and specificity were largely in accordance with previous findings. Screening tests (SAT and EDTA) appeared to be the most sensitive; however, at low prevalences the EDTA and CFT showed the highest positive predictive values of all investigated tests. The specificities and negative predictive values of all diagnostic tests were found to be very high. Differences of prevalence between three groups of the study population with different risk of exposure could be attributed to the mode of sampling indicating that a more risk-based sampling will result in a higher prevalence than a cross-sectional sampling mode. Age, dairy status and history of abortion were shown to influence the prediction of the latent true infection status.
- Published
- 2016
10. Variable selection and regression analysis for the prediction of mortality rates associated with foodborne diseases
- Author
-
S. R. Wild, L. A. Hanson, Dörte Döpfer, E. Amene, and E. A. Zahn
- Subjects
Adult ,Male ,Elastic net regularization ,Adolescent ,Epidemiology ,Bayesian probability ,Feature selection ,Biostatistics ,Global Health ,World Health Organization ,01 natural sciences ,Foodborne Diseases ,Young Adult ,010104 statistics & probability ,03 medical and health sciences ,0302 clinical medicine ,Pregnancy ,Statistics ,Animals ,Cluster Analysis ,Humans ,030212 general & internal medicine ,0101 mathematics ,Child ,Cluster analysis ,Survival analysis ,Aged ,Mathematics ,Aged, 80 and over ,Mortality rate ,Multilevel model ,Infant, Newborn ,Infant ,Regression analysis ,Middle Aged ,Survival Analysis ,Original Papers ,Infectious Diseases ,Child, Preschool ,Female ,Epidemiologic Methods - Abstract
SUMMARYThe purpose of this study was to apply a novel statistical method for variable selection and a model-based approach for filling data gaps in mortality rates associated with foodborne diseases using the WHO Vital Registration mortality dataset. Correlation analysis and elastic net regularization methods were applied to drop redundant variables and to select the most meaningful subset of predictors. Whenever predictor data were missing, multiple imputation was used to fill in plausible values. Cluster analysis was applied to identify similar groups of countries based on the values of the predictors. Finally, a Bayesian hierarchical regression model was fit to the final dataset for predicting mortality rates. From 113 potential predictors, 32 were retained after correlation analysis. Out of these 32 predictors, eight with non-zero coefficients were selected using the elastic net regularization method. Based on the values of these variables, four clusters of countries were identified. The uncertainty of predictions was large for countries within clusters lacking mortality rates, and it was low for a cluster that had mortality rate information. Our results demonstrated that, using Bayesian hierarchical regression models, a data-driven clustering of countries and a meaningful subset of predictors can be used to fill data gaps in foodborne disease mortality.
- Published
- 2016
11. On a paper by Doeblin on non-homogeneous Markov chains
- Author
-
Harry Cohn
- Subjects
Statistics and Probability ,Pure mathematics ,Class (set theory) ,Markov chain ,Applied Mathematics ,010102 general mathematics ,Structure (category theory) ,Mathematical proof ,01 natural sciences ,Set (abstract data type) ,010104 statistics & probability ,Chain (algebraic topology) ,Examples of Markov chains ,0101 mathematics ,Representation (mathematics) ,Mathematics - Abstract
In [5] Doeblin considered some classes of finite non-homogeneous Markov chains and gave without proofs several results concerning their asymptotic behaviour. In the present paper we first attempt to make Doeblin's results precise and try to reconstruct his arguments. Subsequently we investigate more general situations, where a state space decomposition is provided by the sets occurring in the representation of the atomic sets of the tail or-field. We show that Doeblin's notion of an associated chain, as well as considerations regarding the tail ar-field structure of the chain, can be used to solve such cases. FINITE MARKOV CHAIN; FINAL CLASS; CYCLICALLY MOVING SUBCLASS; TAIL ar-FIELD; ATOMIC SET; RECURRENCE; WHIRLPOOL
- Published
- 1981
12. Epidemics with carriers: A note on a paper of Dietz
- Author
-
F. Downton
- Subjects
Statistics and Probability ,Entire population ,education.field_of_study ,General Mathematics ,010102 general mathematics ,Population ,01 natural sciences ,Short interval ,010104 statistics & probability ,0101 mathematics ,Statistics, Probability and Uncertainty ,education ,Demography ,Mathematics - Abstract
In a recent paper Weiss (1965) has suggested a simple model for a carrier-borne epidemic such as typhoid. He considers a population (of size m) of susceptibles into which a number (k) of carriers is introduced. These carriers exhibit no overt symptoms and are only detectable by the discovery of infected persons. He supposed that after the initial introduction of the carriers, the population remains entirely closed and no new carriers arise. The epidemic then progresses until either all the carriers have been traced and isolated or until the entire population has succumbed to the disease.
- Published
- 1967
13. Some remarks on a paper of Kingman
- Author
-
R. K. Getoor
- Subjects
Discrete mathematics ,Statistics and Probability ,Zero set ,Subordinator ,Applied Mathematics ,010102 general mathematics ,Markov process ,Fixed point ,01 natural sciences ,symbols.namesake ,010104 statistics & probability ,Probability theory ,Joint probability distribution ,symbols ,State space ,0101 mathematics ,Finite set ,Mathematics - Abstract
We illustrate a technique for computing certain integrals that arise in probability theory by giving a new derivation of a formula of Kingman. This formula contains the joint distribution of the processes F(t) = inf {s: X(t + s) = b} and B(t) = inf{s: X(t - s) = b} where X is a time homogeneous, continuous parameter, Markov process and b is a fixed point in its state space. We then extend this formula to the situation in which b is replaced by a finite set {b 1, …, b n }.
- Published
- 1974
14. TREND EXTRACTION FROM ECONOMIC TIME SERIES WITH MISSING OBSERVATIONS BY GENERALIZED HODRICK–PRESCOTT FILTERS
- Author
-
Hiroshi Yamada
- Subjects
010104 statistics & probability ,Economics and Econometrics ,Trend extraction ,Series (mathematics) ,0502 economics and business ,05 social sciences ,Econometrics ,050207 economics ,0101 mathematics ,01 natural sciences ,Social Sciences (miscellaneous) ,Mathematics - Abstract
The Hodrick–Prescott (HP) filter has been a popular method of trend extraction from economic time series. However, it is impractical without modification if some observations are not available. This paper improves the HP filter so that it can be applied in such situations. More precisely, this paper introduces two alternative generalized HP filters that are applicable for this purpose. We provide their properties and a way of specifying those smoothing parameters that are required for their application. In addition, we numerically examine their performance. Finally, based on our analysis, we recommend one of them for applied studies.
- Published
- 2021
15. The modern rule of releases
- Author
-
Derek Whayman
- Subjects
0106 biological sciences ,Interpretation (philosophy) ,Compromise ,media_common.quotation_subject ,Equity (finance) ,Mistake ,01 natural sciences ,010601 ecology ,010104 statistics & probability ,Economics ,medicine ,0101 mathematics ,medicine.symptom ,Law ,Law and economics ,Confusion ,media_common - Abstract
This paper considers the history and nature of the ‘modern rule of releases’, concerning compromises to settle or preclude litigation. The rule holds that only matters the parties had contemplated as well as what they intended to release will in fact be released, even if the compromise has been made in the most general terms. Thus the rule is engaged when the releasor executes a general release but does not appreciate the existence of some of the claims the words used purport to release. This paper shows how the rule is a confusion of different conceptual bases and lines of authority and was created by accidentally muddling them together. It argues that, despite this, it successfully straddles both bases, functions well conceptually and serves a vital role.
- Published
- 2021
16. Extracting information from textual descriptions for actuarial applications
- Author
-
Kaixu Yang, Gee Y. Lee, Scott Manski, and Tapabrata Maiti
- Subjects
Statistics and Probability ,Economics and Econometrics ,business.industry ,05 social sciences ,computer.software_genre ,01 natural sciences ,010104 statistics & probability ,0502 economics and business ,Artificial intelligence ,0101 mathematics ,Statistics, Probability and Uncertainty ,business ,computer ,Natural language processing ,050205 econometrics ,Mathematics - Abstract
Initial insurance losses are often reported with a textual description of the claim. The claims manager must determine the adequate case reserve for each known claim. In this paper, we present a framework for predicting the amount of loss given a textual description of the claim using a large number of words found in the descriptions. Prior work has focused on classifying insurance claims based on keywords selected by a human expert, whereas in this paper the focus is on loss amount prediction with automatic word selection. In order to transform words into numeric vectors, we use word cosine similarities and word embedding matrices. When we consider all unique words found in the training dataset and impose a generalised additive model to the resulting explanatory variables, the resulting design matrix is high dimensional. For this reason, we use a group lasso penalty to reduce the number of coefficients in the model. The scalable, analytical framework proposed provides for a parsimonious and interpretable model. Finally, we discuss the implications of the analysis, including how the framework may be used by an insurance company and how the interpretation of the covariates can lead to significant policy change. The code can be found in the TAGAM R package (github.com/scottmanski/TAGAM).
- Published
- 2021
17. ESTIMATION OF TIME-VARYING COVARIANCE MATRICES FOR LARGE DATASETS
- Author
-
Yiannis Dendramis, Liudas Giraitis, and George Kapetanios
- Subjects
Estimation ,Economics and Econometrics ,05 social sciences ,Covariance ,Regularization (mathematics) ,Thresholding ,01 natural sciences ,Minimum variance portfolio ,010104 statistics & probability ,0502 economics and business ,Applied mathematics ,Statistics::Methodology ,0101 mathematics ,Algorithm ,Social Sciences (miscellaneous) ,Shrinkage ,050205 econometrics ,Mathematics - Abstract
Time variation is a fundamental problem in statistical and econometric analysis of macroeconomic and financial data. Recently, there has been considerable focus on developing econometric modelling that enables stochastic structural change in model parameters and on model estimation by Bayesian or nonparametric kernel methods. In the context of the estimation of covariance matrices of large dimensional panels, such data requires taking into account time variation, possible dependence and heavy-tailed distributions. In this paper, we introduce a nonparametric version of regularization techniques for sparse large covariance matrices, developed by Bickel and Levina (2008) and others. We focus on the robustness of such a procedure to time variation, dependence and heavy-tailedness of distributions. The paper includes a set of results on Bernstein type inequalities for dependent unbounded variables which are expected to be applicable in econometric analysis beyond estimation of large covariance matrices. We discuss the utility of the robust thresholding method, comparing it with other estimators in simulations and an empirical application on the design of minimum variance portfolios.
- Published
- 2021
18. LEAST SQUARES ESTIMATION FOR NONLINEAR REGRESSION MODELS WITH HETEROSCEDASTICITY
- Author
-
Qiying Wang
- Subjects
Statistics::Theory ,Economics and Econometrics ,Heteroscedasticity ,05 social sciences ,01 natural sciences ,010104 statistics & probability ,0502 economics and business ,Statistics ,Statistics::Methodology ,0101 mathematics ,Nonlinear regression ,Social Sciences (miscellaneous) ,050205 econometrics ,Mathematics - Abstract
This paper develops an asymptotic theory of nonlinear least squares estimation by establishing a new framework that can be easily applied to various nonlinear regression models with heteroscedasticity. As an illustration, we explore an application of the framework to nonlinear regression models with nonstationarity and heteroscedasticity. In addition to these main results, this paper provides a maximum inequality for a class of martingales, which is of interest in its own right.
- Published
- 2021
19. AN IMPROVEMENT OF MARKOVIAN INTEGRATION BY PARTS FORMULA AND APPLICATION TO SENSITIVITY COMPUTATION
- Author
-
Yue Liu, Zhiyan Shi, Ying Tang, Xincheng Zhu, and Jingjing Yao
- Subjects
Statistics and Probability ,Computation ,010102 general mathematics ,Markov process ,Management Science and Operations Research ,01 natural sciences ,Industrial and Manufacturing Engineering ,010104 statistics & probability ,symbols.namesake ,symbols ,Applied mathematics ,Integration by parts ,Sensitivity (control systems) ,0101 mathematics ,Statistics, Probability and Uncertainty ,Mathematics - Abstract
This paper establishes a new version of integration by parts formula of Markov chains for sensitivity computation, under much lower restrictions than the existing researches. Our approach is more fundamental and applicable without using Girsanov theorem or Malliavin calculus as did by past papers. Numerically, we apply this formula to compute sensitivity regarding the transition rate matrix and compare with a recent research by an IPA (infinitesimal perturbation analysis) method and other approaches.
- Published
- 2021
20. NOWCASTING ‘TRUE’ MONTHLY U.S. GDP DURING THE PANDEMIC
- Author
-
Stuart McIntyre, Aubrey Poon, James Mitchell, and Gary Koop
- Subjects
Observational error ,Stochastic volatility ,Nowcasting ,Computer science ,HB ,05 social sciences ,Bayesian probability ,Discount points ,01 natural sciences ,Gross domestic product ,Vector autoregression ,010104 statistics & probability ,Economic data ,0502 economics and business ,Outlier ,Pandemic ,Econometrics ,Economics ,050207 economics ,0101 mathematics ,General Economics, Econometrics and Finance - Abstract
Expenditure side and income side GDP are both measured at the quarterly frequency in the US and contain measurement error. They are noisy proxies of `true’ GDP. Several econometric methods exist for producing estimates of true GDP which reconcile these noisy estimates. Recently, the authors of this paper developed a mixed frequency reconciliation model which produces monthly estimates of true GDP. In the present paper, we investigate whether this model continues to work well in the face of the extreme observations that occurred during the pandemic year of 2020 and consider several extensions of it. These extensions include stochastic volatility and error distributions that are fat tailed or explicitly allow for outliers. We also investigate the performance of conditional forecasting, where we estimate our models using data through 2019 and then use these to nowcast throughout 2020. Nowcasts are updated each month of 2020 conditionally on the new data releases which occur each month, but the parameters are not re-estimated. In total we compare the real-time performance of 12 nowcasting approaches over the pandemic months. We find that our original model with Normal homoskedastic errors produces point nowcasts as good or better than any of the other approaches. A property of Normal homoskedastic models that is often considered bad (i.e. that they are not robust to outliers), actually benefits the KMMP model as it reacts confidently to the rapidly evolving economic data. In terms of nowcast densities, we find many of the extensions lead to larger predictive variances reflecting the great uncertainty of the pandemic months.
- Published
- 2021
21. On component failure in coherent systems with applications to maintenance strategies
- Author
-
Majid Asadi and Marzieh Hashemi
- Subjects
Statistics and Probability ,Independent and identically distributed random variables ,Mathematical optimization ,021103 operations research ,Corrective maintenance ,Applied Mathematics ,Computation ,0211 other engineering and technologies ,Optimal maintenance ,02 engineering and technology ,01 natural sciences ,Stochastic ordering ,Preventive maintenance ,Signature (logic) ,010104 statistics & probability ,Component (UML) ,0101 mathematics ,Mathematics - Abstract
Providing optimal strategies for maintaining technical systems in good working condition is an important goal in reliability engineering. The main aim of this paper is to propose some optimal maintenance policies for coherent systems based on some partial information about the status of components in the system. For this purpose, in the first part of the paper, we propose two criteria under which we compute the probability of the number of failed components in a coherent system with independent and identically distributed components. The first proposed criterion utilizes partial information about the status of the components with a single inspection of the system, and the second one uses partial information about the status of component failure under double monitoring of the system. In the computation of both criteria, we use the notion of the signature vector associated with the system. Some stochastic comparisons between two coherent systems have been made based on the proposed concepts. Then, by imposing some cost functions, we introduce new approaches to the optimal corrective and preventive maintenance of coherent systems. To illustrate the results, some examples are examined numerically and graphically.
- Published
- 2020
22. On a new stochastic model for cascading failures
- Author
-
Hyunju Lee
- Subjects
Statistics and Probability ,Stochastic modelling ,General Mathematics ,010102 general mathematics ,Residual ,01 natural sciences ,Stochastic ordering ,Cascading failure ,010104 statistics & probability ,Control theory ,Component (UML) ,Life test ,0101 mathematics ,Statistics, Probability and Uncertainty ,Mathematics - Abstract
In this paper, to model cascading failures, a new stochastic failure model is proposed. In a system subject to cascading failures, after each failure of the component, the remaining component suffers from increased load or stress. This results in shortened residual lifetimes of the remaining components. In this paper, to model this effect, the concept of the usual stochastic order is employed along with the accelerated life test model, and a new general class of stochastic failure models is generated.
- Published
- 2020
23. NONSTATIONARY LINEAR PROCESSES WITH INFINITE VARIANCE GARCH ERRORS
- Author
-
Rongmao Zhang and Ngai Hang Chan
- Subjects
Economics and Econometrics ,Stochastic process ,Autoregressive conditional heteroskedasticity ,05 social sciences ,Estimator ,Variance (accounting) ,Random walk ,01 natural sciences ,010104 statistics & probability ,Distribution (mathematics) ,0502 economics and business ,Ordinary least squares ,Applied mathematics ,Limit (mathematics) ,0101 mathematics ,Social Sciences (miscellaneous) ,050205 econometrics ,Mathematics - Abstract
Recently, Cavaliere, Georgiev, and Taylor (2018, Econometric Theory 34, 302–348) (CGT) considered the augmented Dickey–Fuller (ADF) test for a unit-root model with linear noise driven by i.i.d. infinite variance innovations and showed that ordinary least squares (OLS)-based ADF statistics have the same distribution as in Chan and Tran (1989, Econometric Theory 5, 354–362) for i.i.d. infinite variance noise. They also proposed an interesting question to extend their results to the case with infinite variance GARCH innovations as considered in Zhang, Sin, and Ling (2015, Stochastic Processes and their Applications 125, 482–512). This paper addresses this question. In particular, the limit distributions of the ADF for random walk models with short-memory linear noise driven by infinite variance GARCH innovations are studied. We show that when the tail index $\alpha , the limit distributions are completely different from that of CGT and the estimator of the parameters of the lag terms used in the ADF regression is not consistent. This paper provides a broad treatment of unit-root models with linear GARCH noises, which encompasses the commonly entertained unit-root IGARCH model as a special case.
- Published
- 2020
24. On moderate deviations in Poisson approximation
- Author
-
Qingwei Liu and Aihua Xia
- Subjects
Statistics and Probability ,Random graph ,Matching (graph theory) ,Distribution (number theory) ,General Mathematics ,Probability (math.PR) ,010102 general mathematics ,Poisson distribution ,01 natural sciences ,Birthday problem ,Normal distribution ,010104 statistics & probability ,symbols.namesake ,FOS: Mathematics ,Rare events ,symbols ,Applied mathematics ,Moderate deviations ,0101 mathematics ,Statistics, Probability and Uncertainty ,Primary 60F05, secondary 60E15 ,Mathematics - Probability ,Mathematics - Abstract
In this paper, we first use the distribution of the number of records to demonstrate that the right tail probabilities of counts of rare events are generally better approximated by the right tail probabilities of Poisson distribution than {those} of normal distribution. We then show the moderate deviations in Poisson approximation generally require an adjustment and, with suitable adjustment, we establish better error estimates of the moderate deviations in Poisson approximation than those in \cite{CFS}. Our estimates contain no unspecified constants and are easy to apply. We illustrate the use of the theorems in six applications: Poisson-binomial distribution, matching problem, occupancy problem, birthday problem, random graphs and 2-runs. The paper complements the works of \cite{CC92,BCC95,CFS}., 29 pages and 5 figures
- Published
- 2020
25. AI in actuarial science – a review of recent advances – part 1
- Author
-
Ronald Richman
- Subjects
Statistics and Probability ,Economics and Econometrics ,Actuarial science ,Artificial neural network ,business.industry ,Computer science ,Heuristic (computer science) ,Deep learning ,Supplementary appendix ,02 engineering and technology ,01 natural sciences ,010104 statistics & probability ,0202 electrical engineering, electronic engineering, information engineering ,Code (cryptography) ,020201 artificial intelligence & image processing ,Telematics ,Artificial intelligence ,0101 mathematics ,Statistics, Probability and Uncertainty ,business - Abstract
Rapid advances in artificial intelligence (AI) and machine learning are creating products and services with the potential not only to change the environment in which actuaries operate but also to provide new opportunities within actuarial science. These advances are based on a modern approach to designing, fitting and applying neural networks, generally referred to as “Deep Learning.” This paper investigates how actuarial science may adapt and evolve in the coming years to incorporate these new techniques and methodologies. Part 1 of this paper provides background on machine learning and deep learning, as well as an heuristic for where actuaries might benefit from applying these techniques. Part 2 of the paper then surveys emerging applications of AI in actuarial science, with examples from mortality modelling, claims reserving, non-life pricing and telematics. For some of the examples, code has been provided on GitHub so that the interested reader can experiment with these techniques for themselves. Part 2 concludes with an outlook on the potential for actuaries to integrate deep learning into their activities. Finally, a supplementary appendix discusses further resources providing more in-depth background on machine learning and deep learning.
- Published
- 2020
26. AI in actuarial science – a review of recent advances – part 2
- Author
-
Ronald Richman
- Subjects
Statistics and Probability ,Economics and Econometrics ,050208 finance ,Actuarial science ,Artificial neural network ,business.industry ,Computer science ,Heuristic (computer science) ,Deep learning ,05 social sciences ,Supplementary appendix ,01 natural sciences ,010104 statistics & probability ,0502 economics and business ,Code (cryptography) ,Telematics ,Artificial intelligence ,0101 mathematics ,Statistics, Probability and Uncertainty ,business - Abstract
Rapid advances in artificial intelligence (AI) and machine learning are creating products and services with the potential not only to change the environment in which actuaries operate, but also to provide new opportunities within actuarial science. These advances are based on a modern approach to designing, fitting and applying neural networks, generally referred to as “Deep Learning”. This paper investigates how actuarial science may adapt and evolve in the coming years to incorporate these new techniques and methodologies. Part 1 of this paper provides background on machine learning and deep learning, as well as an heuristic for where actuaries might benefit from applying these techniques. Part 2 of the paper then surveys emerging applications of AI in actuarial science, with examples from mortality modelling, claims reserving, non-life pricing and telematics. For some of the examples, code has been provided on GitHub so that the interested reader can experiment with these techniques for themselves. Part 2 concludes with an outlook on the potential for actuaries to integrate deep learning into their activities. Finally, a supplementary appendix discusses further resources providing more in-depth background on machine learning and deep learning.
- Published
- 2020
27. Samples with a limit shape, multivariate extremes, and risk
- Author
-
Natalia Nolde and Guus Balkema
- Subjects
Statistics and Probability ,Multivariate random variable ,Applied Mathematics ,010102 general mathematics ,Tail dependence ,Sample (statistics) ,01 natural sciences ,010104 statistics & probability ,Convergence of random variables ,Limit (mathematics) ,Statistical physics ,0101 mathematics ,Limit set ,Random variable ,Quantile ,Mathematics - Abstract
Large samples from a light-tailed distribution often have a well-defined shape. This paper examines the implications of the assumption that there is a limit shape. We show that the limit shape determines the upper quantiles for a large class of random variables. These variables may be described loosely as continuous homogeneous functionals of the underlying random vector. They play an important role in evaluating risk in a multivariate setting. The paper also looks at various coefficients of tail dependence and at the distribution of the scaled sample points for large samples. The paper assumes convergence in probability rather than almost sure convergence. This results in an elegant theory. In particular, there is a simple characterization of domains of attraction.
- Published
- 2020
28. ON THE COMPARISON OF PERFORMANCE-PER-COST FOR COHERENT AND MIXED SYSTEMS
- Author
-
Francisco J. Samaniego, Bo Henry Lindqvist, and Nana Wang
- Subjects
Statistics and Probability ,Independent and identically distributed random variables ,021103 operations research ,Computer science ,0211 other engineering and technologies ,Comparison results ,02 engineering and technology ,Management Science and Operations Research ,01 natural sciences ,Industrial and Manufacturing Engineering ,Reliability engineering ,010104 statistics & probability ,Mixed systems ,Component (UML) ,0101 mathematics ,Statistics, Probability and Uncertainty ,Naval research ,Reliability (statistics) ,Refining (metallurgy) - Abstract
The present paper is concerned with reliability economics, considering a certain performance-per-cost criterion for coherent and mixed systems, as introduced in [Dugas, M.R. & Samaniego, F.J. (2007). On optimal system designs in reliability-economics frameworks. Naval Research Logistics 54, 568–582]. We first present a new comparison result for performance-per-cost of systems with independent and identically distributed component lifetimes under certain stochastic orderings. We then consider optimization of the performance-per-cost criterion, first reconsidering and refining results from the above cited paper, and then considering mixtures of given subsets of coherent systems.
- Published
- 2020
29. RELIABILITY COMPARISON OF TWO UNIT REDUNDANCY SYSTEMS UNDER THE LOAD REQUIREMENT
- Author
-
Kyungmee O. Kim
- Subjects
Statistics and Probability ,021103 operations research ,Computer science ,Hot spare ,0211 other engineering and technologies ,Load sharing ,02 engineering and technology ,Management Science and Operations Research ,Design load ,Lifetime distribution ,01 natural sciences ,Industrial and Manufacturing Engineering ,Reliability engineering ,System requirements ,010104 statistics & probability ,Redundancy (engineering) ,Cold standby ,0101 mathematics ,Statistics, Probability and Uncertainty ,Weibull distribution - Abstract
This paper compares the reliability functions of the cold standby, hot standby, and load-sharing redundancy configurations, each of which is composed of two identical components for meeting a given system requirement. Thus far, no research has been done into the conditions that make one configuration more reliable than another because their reliability functions have no closed forms even when the component follows a Weibull lifetime distribution. In this paper, two analytical results are obtained given that the reliability of each configuration is expressed in terms of the design and operational loads of the component. First, higher reliability can be achieved in a cold standby configuration than in a load-sharing configuration if the increase in the component reliability obtained from the reduction in the operational load is not significant. Second, a cold standby configuration exhibits better reliability and carries a higher load than a hot standby configuration if the design load can be increased with a less decrease in the component reliability.
- Published
- 2020
30. MEAN–VARIANCE EQUILIBRIUM ASSET-LIABILITY MANAGEMENT STRATEGY WITH COINTEGRATED ASSETS
- Author
-
Mei Choi Chiu
- Subjects
050208 finance ,Cointegration ,05 social sciences ,Institutional investor ,Liability ,Financial market ,01 natural sciences ,Profit (economics) ,Microeconomics ,010104 statistics & probability ,Management strategy ,Mathematics (miscellaneous) ,0502 economics and business ,Economics ,Mean variance ,Dynamic inconsistency ,0101 mathematics - Abstract
This paper investigates asset-liability management problems in a continuous-time economy. When the financial market consists of cointegrated risky assets, institutional investors attempt to make profit from the cointegration feature on the one hand, while on the other hand they need to maintain a stable surplus level, that is, the company’s wealth less its liability. Challenges occur when the liability is random and cannot be fully financed or hedged through the financial market. For mean–variance investors, an additional concern is the rational time-consistency issue, which ensures that a decision made in the future will not be restricted by the current surplus level. By putting all these factors together, this paper derives a closed-form feedback equilibrium control for time-consistent mean–variance asset-liability management problems with cointegrated risky assets. The solution is built upon the Hamilton–Jacobi–Bellman framework addressing time inconsistency.
- Published
- 2020
31. KOLMOGOROV STORIES
- Author
-
Yu. K. Belyaev and Asaf H. Hajiyev
- Subjects
Statistics and Probability ,010104 statistics & probability ,021103 operations research ,0211 other engineering and technologies ,02 engineering and technology ,0101 mathematics ,Management Science and Operations Research ,Statistics, Probability and Uncertainty ,01 natural sciences ,Industrial and Manufacturing Engineering - Abstract
Authors of this paper were educated and have spent many years in Lomonosov Moscow State University (MSU), in the Interfaculty Laboratory of Statistical Methods (ILSM). Prof. Yu. K. Belyaev was student of Kolmogorov, and for several years his deputy in the ILSM, created and led by the great A. N. Kolmogorov. Now Prof. Yuri Belyaev is Emeritus Professor at Umea University (Sweden). Asaf Hajiyev was a PhD student in Kolmogorov's ILSM. In this paper some unusual incidents about the legendary Kolmogorov, and people around him are presented. Some stories were taken from the books and the internet (see references) dedicated to the memory of Kolmogorov; some were told by his students, and some happened with authors of this paper.Men are cruel but Man is kind.– A. N. Kolmogorov
- Published
- 2020
32. Martingale decomposition of an L2 space with nonlinear stochastic integrals
- Author
-
Clarence Simard
- Subjects
Statistics and Probability ,Optimization problem ,General Mathematics ,010102 general mathematics ,Stochastic calculus ,01 natural sciences ,010104 statistics & probability ,Nonlinear system ,Integrator ,Bounded function ,Applied mathematics ,0101 mathematics ,Statistics, Probability and Uncertainty ,Lp space ,Martingale (probability theory) ,Brownian motion ,Mathematics - Abstract
This paper generalizes the Kunita–Watanabe decomposition of an $L^2$ space. The generalization comes from using nonlinear stochastic integrals where the integrator is a family of continuous martingales bounded in $L^2$ . This result is also the solution of an optimization problem in $L^2$ . First, martingales are assumed to be stochastic integrals. Then, to get the general result, it is shown that the regularity of the family of martingales with respect to its spatial parameter is inherited by the integrands in the integral representation of the martingales. Finally, an example showing how the results of this paper, with the Clark–Ocone formula, can be applied to polynomial functions of Brownian integrals.
- Published
- 2019
33. FINDING NONSTATIONARY STATE PROBABILITIES OF OPEN MARKOV NETWORKS WITH MULTIPLE CLASSES OF CUSTOMERS AND VARIOUS FEATURES
- Author
-
Mikhail Matalytski and Dmitry Kopats
- Subjects
Statistics and Probability ,Power series ,Sequence ,Recurrence relation ,Exponential distribution ,Markov chain ,Series (mathematics) ,Computer science ,020206 networking & telecommunications ,02 engineering and technology ,Management Science and Operations Research ,01 natural sciences ,Industrial and Manufacturing Engineering ,010104 statistics & probability ,Convergence (routing) ,0202 electrical engineering, electronic engineering, information engineering ,Applied mathematics ,Radius of convergence ,0101 mathematics ,Statistics, Probability and Uncertainty - Abstract
This paper discusses a system of difference-differential equations (DDE) that is satisfied by the time-dependent state probabilities of open Markov queueing networks with various features. The number of network states in this case and the number of equations in this system is infinite. Flows of customers arriving at the network are a simple and independent, the time of customer services is exponentially distributed. The intensities of transitions between the network states are deterministic functions depending on its states.To solve the system of DDE, we propose a modified method of successive approximations, combined with the method of series. The convergence of successive approximations with time to a stationary probability distribution, the form of which is indicated in the paper has been proved. The sequence of approximations converges to a unique solution of the system of equations. Any successive approximation can be represented as a convergent power series with an infinite radius of convergence, the coefficients of which satisfy recurrence relations, which is convenient for calculations on a computer. Examples of the analysis of Markov G-networks with various features have been presented.
- Published
- 2019
34. Comparison results for M/G/1 queues with waiting and sojourn time deadlines
- Author
-
Yoshiaki Inoue
- Subjects
Statistics and Probability ,Waiting time ,Discrete mathematics ,021103 operations research ,Service time ,General Mathematics ,0211 other engineering and technologies ,Comparison results ,02 engineering and technology ,01 natural sciences ,010104 statistics & probability ,M/G/1 queue ,0101 mathematics ,Statistics, Probability and Uncertainty ,Queue ,Mathematics - Abstract
This paper considers two variants of M/G/1 queues with impatient customers, which are denoted by M/G/1+Gw and M/G/1+Gs. In the M/G/1+Gw queue customers have deadlines for their waiting times, and they leave the system immediately if their services do not start before the expiration of their deadlines. On the other hand, in the M/G/1+Gs queue customers have deadlines for their sojourn times, where customers in service also immediately leave the system when their deadlines expire. In this paper we derive comparison results for performance measures of these models. In particular, we show that if the service time distribution is new better than used in expectation, then the loss probability in the M/G/1+Gs queue is greater than that in the M/G/1+Gw queue.
- Published
- 2019
35. STOCHASTIC SETUP-COST INVENTORY MODEL WITH BACKORDERS AND QUASICONVEX COST FUNCTIONS
- Author
-
Yan Liang and Eugene A. Feinberg
- Subjects
Statistics and Probability ,Inventory control ,Relative value ,Mathematical optimization ,Sequence ,021103 operations research ,0211 other engineering and technologies ,02 engineering and technology ,Function (mathematics) ,Management Science and Operations Research ,Equicontinuity ,01 natural sciences ,Industrial and Manufacturing Engineering ,010104 statistics & probability ,Quasiconvex function ,Optimization and Control (math.OC) ,Convergence (routing) ,FOS: Mathematics ,Markov decision process ,0101 mathematics ,Statistics, Probability and Uncertainty ,Mathematics - Optimization and Control ,Mathematics - Abstract
This paper studies a periodic-review single-commodity setup-cost inventory model with backorders and holding/backlog costs satisfying quasiconvexity assumptions. We show that the Markov decision process for this inventory model satisfies the assumptions that lead to the validity of optimality equations for discounted and average-cost problems and to the existence of optimal (s,S) policies. In particular, we prove the equicontinuity of the family of discounted value functions and the convergence of optimal discounted lower thresholds to the optimal average-cost lower threshold for some sequence of discount factors converging to 1. If an arbitrary nonnegative amount of inventory can be ordered, we establish stronger convergence properties: (i) the optimal discounted lower thresholds converge to optimal average-cost lower threshold; and (ii) the discounted relative value functions converge to average-cost relative value function. These convergence results previously were known only for subsequences of discount factors even for problems with convex holding/backlog costs. The results of this paper also hold for problems with fixed lead times.
- Published
- 2019
36. CAPITAL ALLOCATION WITH MULTIVARIATE RISK MEASURES: AN AXIOMATIC APPROACH
- Author
-
Linxiao Wei and Yijun Hu
- Subjects
Statistics and Probability ,Multivariate statistics ,021103 operations research ,Risk measure ,0211 other engineering and technologies ,Univariate ,Axiomatic system ,02 engineering and technology ,Management Science and Operations Research ,01 natural sciences ,Industrial and Manufacturing Engineering ,Capital allocation line ,010104 statistics & probability ,Capital (economics) ,Subadditivity ,Economics ,Econometrics ,0101 mathematics ,Statistics, Probability and Uncertainty ,Project portfolio management - Abstract
Capital allocation is of central importance in portfolio management and risk-based performance measurement. Capital allocations for univariate risk measures have been extensively studied in the finance literature. In contrast to this situation, few papers dealt with capital allocations for multivariate risk measures. In this paper, we propose an axiom system for capital allocation with multivariate risk measures. We first recall the class of the positively homogeneous and subadditive multivariate risk measures, and provide the corresponding representation results. Then it is shown that for a given positively homogeneous and subadditive multivariate risk measure, there exists a capital allocation principle. Furthermore, the uniqueness of the capital allocation principe is characterized. Finally, examples are also given to derive the explicit capital allocation principles for the multivariate risk measures based on mean and standard deviation, including the multivariate mean-standard-deviation risk measures.
- Published
- 2019
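As a scalar point of comparison (this is the classical univariate Euler allocation for a mean-standard-deviation principle, not the paper's multivariate construction), capital can be allocated explicitly and sums exactly to the total requirement; a minimal sketch:

```python
import numpy as np

def sd_risk(losses, c=1.0):
    """Mean-standard-deviation risk measure: rho(X) = E[X] + c * sd(X)."""
    return losses.mean() + c * losses.std(ddof=0)

def euler_allocation(unit_losses, c=1.0):
    """Euler (gradient) allocation for the mean-standard-deviation measure.

    unit_losses: (n_scenarios, n_units) array of losses per business unit.
    Returns A_i = E[X_i] + c * Cov(X_i, X) / sd(X), with X the sum of all
    units; summing Cov(X_i, X) over i gives Var(X), so the A_i add up to rho(X).
    """
    total = unit_losses.sum(axis=1)
    centered = unit_losses - unit_losses.mean(axis=0)
    cov = (centered * (total - total.mean())[:, None]).mean(axis=0)
    return unit_losses.mean(axis=0) + c * cov / total.std(ddof=0)

rng = np.random.default_rng(1)
X = rng.normal(size=(10_000, 3)) * np.array([1.0, 2.0, 0.5])  # three units
alloc = euler_allocation(X)
print(alloc.sum(), sd_risk(X.sum(axis=1)))  # full allocation: the two agree
```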
37. THE GENERALIZED ENTROPY ERGODIC THEOREM FOR NONHOMOGENEOUS MARKOV CHAINS INDEXED BY A HOMOGENEOUS TREE
- Author
-
Huilin Huang
- Subjects
Statistics and Probability ,Pure mathematics ,Homogeneous tree ,Markov chain ,010102 general mathematics ,Management Science and Operations Research ,01 natural sciences ,Industrial and Manufacturing Engineering ,010104 statistics & probability ,Asymptotic equipartition property ,Law of large numbers ,Doob's martingale convergence theorems ,Ergodic theory ,0101 mathematics ,Statistics, Probability and Uncertainty ,Mathematics - Abstract
In this paper, we extend the strong laws of large numbers and the entropy ergodic theorem for partial sums of tree-indexed nonhomogeneous Markov chain fields to delayed versions of nonhomogeneous Markov chain fields indexed by a homogeneous tree. We first prove a generalized strong limit theorem for nonhomogeneous Markov chains indexed by a homogeneous tree. We then prove the generalized strong laws of large numbers and the generalized asymptotic equipartition property for delayed sums of finite nonhomogeneous Markov chains indexed by a homogeneous tree. As corollaries, we recover similar results from the existing literature. The problem settings considered here may not permit the use of Doob's martingale convergence theorem; we overcome this difficulty with the Borel–Cantelli lemma, so our proof technique also contains some new elements compared with Yang and Ye (2007).
- Published
- 2019
38. Context Matters: Measuring Nationalism in the Countries of the Former Czechoslovakia
- Author
-
Miloslav Bahna
- Subjects
Czech ,History ,media_common.quotation_subject ,05 social sciences ,Geography, Planning and Development ,Context (language use) ,01 natural sciences ,Protectionism ,language.human_language ,0506 political science ,Nationalism ,010104 statistics & probability ,General Social Survey ,Xenophobia ,Political science ,Political economy ,Political Science and International Relations ,National identity ,050602 political science & public administration ,language ,Slovak ,0101 mathematics ,media_common - Abstract
This paper compares nationalism in the two ex-Czechoslovak countries, the Czech and Slovak republics. The aim is to analyze the measurement of nationalism in the 1995, 2003, and 2013 International Social Survey Program (ISSP) National Identity surveys. According to the nationalism measures from the ISSP survey, which are frequently used by authors analyzing nationalism, both countries experienced a significant rise in nationalism between 1995 and 2013. Moreover, invariance testing of the nationalism latent variable confirms that levels of nationalism can be compared between Czechia and Slovakia over time. However, the associations between nationalism, as measured in the study, and concepts related to nationalism, such as xenophobia, protectionism, or assertive foreign policy, suggest that what was measured as nationalism in 1995 is very different from what was measured in 2013. This is explained by a change of context that occurred in both countries between 1995 and 2013: while answering the same question had a strong nationalistic connotation in 1995, this was no longer the case in 2013. Based on our findings, we advise against using the analyzed "nationalism" items as a measure of nationalism, even beyond the two countries analyzed.
- Published
- 2019
39. The study of variability in engineering design—An appreciation and a retrospective
- Author
-
Tim Davis
- Subjects
021103 operations research ,T1 ,Computer science ,Design activities ,0211 other engineering and technologies ,02 engineering and technology ,Variation (game tree) ,Variance (accounting) ,Parameter design ,TS ,01 natural sciences ,Industrial engineering ,010104 statistics & probability ,Robustness (computer science) ,Production (economics) ,TJ ,0101 mathematics ,Engineering design process - Abstract
We explore the concept of parameter design applied to the production of glass beads in the manufacture of metal-encapsulated transistors. The main motivation is to complete the analysis hinted at in the original publication by Jim Morrison in 1957, which was an early example of discussing the idea of transmitted variation in engineering design, and an influential paper in the development of analytic parameter design as a data-centric engineering activity. Parameter design is a secondary design activity focused on selecting the nominals of the design variables to achieve the required target performance and to simultaneously reduce the variance around the target. Although the 1957 paper is not recent, its approach to engineering design is modern.
- Published
- 2021
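For readers new to the term, the "transmitted variation" idea underpinning analytic parameter design is the first-order propagation-of-error approximation; stated from general knowledge rather than from the paper, for a response $Y = f(x_1, \dots, x_k)$ with independent design variables,

\[
\operatorname{Var}(Y) \approx \sum_{i=1}^{k} \left( \frac{\partial f}{\partial x_i} \Big|_{x=\mu} \right)^{2} \operatorname{Var}(x_i),
\]

so parameter design chooses nominals $\mu$ where the slopes are small, shrinking the variance transmitted to $Y$ while keeping $f(\mu)$ on target.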
40. Actuarial valuations to monitor defined benefit pension funding
- Author
-
Christopher O'Brien
- Subjects
Statistics and Probability ,Economics and Econometrics ,Pension ,Solvency ,050208 finance ,Actuarial science ,05 social sciences ,Liability ,Pension regulation ,01 natural sciences ,010104 statistics & probability ,0502 economics and business ,Expected return ,Cash flow ,Business ,0101 mathematics ,Statistics, Probability and Uncertainty ,Credit risk ,Valuation (finance) - Abstract
This paper is motivated by The Pensions Regulator (TPR)'s review of its Code of Practice on funding for defined benefit schemes, and aims to suggest how trustees and regulators should monitor the extent to which scheme assets are adequate to cover liabilities. It concludes that current practice is inadequate and needs to change. A review is carried out of papers not only on this subject but also on pensions and insurance accounting and regulation (to collect ideas rather than automatically apply them to pension solvency valuations). Current practice is "scheme-specific funding", which permits discretion in the choice of discount rates and other assumptions; the paper is concerned that this can lead to bias and that trends in a scheme's solvency can be obscured by changing assumptions. This also leaves the funding ratio communicated to scheme members with little meaning. The paper suggests that regulators should require a valuation that is based on sound principles and is objective, fair, neutral, transparent, and feasible; a prescribed methodology would replace discretion. It concludes that the benefits to be valued are those arising on discontinuance of the scheme, without allowing for future salary-related benefit increases, which are felt to no longer be a constructive obligation of employers. The valuation should, it is suggested, use market values of assets, which is largely current practice. Liabilities should reflect the trustees fulfilling their liabilities, rather than transferring them to an insurer (which may introduce artificialities). The discount rate should follow the "matching" approach, being a market-consistent risk-free rate; this is consistent with several papers to the profession in recent years. It avoids the problems of the "budgeting" approach, where the discount rate is based on the expected return on assets; such a rate can help set contribution levels but is not suitable for determining the value of liabilities, which depends on salary, service, longevity, etc., and (very largely) not on the assets held. In principle, the liability value can be adjusted for illiquidity. Credit risk of the employer should not be allowed for. Liabilities should reflect the (probability-weighted) expected value of future cash flows and should not be increased by prudent margins or risk margins, which would lead to a non-neutral figure. Risk disclosures are needed to understand and manage risks. The resulting funding ratio is a consistent measure, to be disclosed to members, which can be used to manage the scheme and used by regulators as the basis for requiring action. Scheme-specific management using information such as the employer covenant means that immediate action to ensure 100% solvency on the proposed basis would not necessarily be appropriate. The author encourages the profession to advise TPR along the above lines.
- Published
- 2020
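The "matching" versus "budgeting" contrast reduces to which rate discounts the same expected benefit cash flows; a toy numerical sketch with invented figures (not drawn from the paper):

```python
def present_value(cashflows, rate):
    """Discount a series of annual cash flows at a flat annual rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

benefits = [100.0] * 30                # hypothetical payments over 30 years
print(present_value(benefits, 0.02))   # "matching": risk-free rate -> ~2240
print(present_value(benefits, 0.06))   # "budgeting": expected return -> ~1376
# The higher budgeting rate reports a much smaller liability for identical
# obligations, illustrating why the paper rejects it for solvency measurement.
```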
41. Trusted Smart Statistics: How new data will change official statistics
- Author
-
Fabio Ricciato, Martina Hahn, and Albrecht Wirthmann
- Subjects
Official statistics ,Data collection ,business.industry ,Emerging technologies ,Computer science ,media_common.quotation_subject ,05 social sciences ,Big data ,Digital data ,050801 communication & media studies ,General Medicine ,01 natural sciences ,Democracy ,Underdevelopment ,010104 statistics & probability ,0508 media and communications ,Paradigm shift ,Statistics ,0101 mathematics ,business ,media_common - Abstract
In this discussion paper, we outline the motivations and the main principles of the Trusted Smart Statistics (TSS) concept under development in the European Statistical System. TSS represents the evolution of official statistics in response to the challenges posed by the newly datafied society. Taking stock of the availability of new digital data sources, new technologies, and new behaviors, statistical offices are now called to rethink the way they operate in order to reassert their role in modern democratic society. The issue at stake is considerably broader and deeper than merely adapting existing processes to embrace so-called Big Data. In several respects, this evolution entails a fundamental paradigm shift from the legacy model of official statistics production based on traditional data sources, for example in the relation between data and computation, between data collection and analysis, between methodological development and statistical production, and of course in the roles of the various stakeholders and their mutual relationships. Such a complex evolution must be guided by a comprehensive system-level view based on clearly spelled-out design principles. In this paper, we aim to provide a general account of the TSS concept reflecting the current state of the discussion within the European Statistical System.
- Published
- 2020
42. TESTING REGRESSION MONOTONICITY IN ECONOMETRIC MODELS
- Author
-
Denis Chetverikov
- Subjects
FOS: Computer and information sciences ,Economics and Econometrics ,Smoothness ,Comparative statics ,05 social sciences ,Nonparametric statistics ,Mathematics - Statistics Theory ,Monotonic function ,Statistics Theory (math.ST) ,Statistics - Applications ,01 natural sciences ,Regression ,010104 statistics & probability ,Econometric model ,Consistency (statistics) ,0502 economics and business ,FOS: Mathematics ,Econometrics ,Applications (stat.AP) ,Economic model ,0101 mathematics ,Social Sciences (miscellaneous) ,050205 econometrics ,Mathematics - Abstract
Monotonicity is a key qualitative prediction of a wide array of economic models derived via robust comparative statics. It is therefore important to design effective and practical econometric methods for testing this prediction in empirical analysis. This paper develops a general nonparametric framework for testing monotonicity of a regression function. Using this framework, a broad class of new tests is introduced, which gives an empirical researcher a lot of flexibility to incorporate ex ante information she might have. The paper also develops new methods for simulating critical values, which are based on the combination of a bootstrap procedure and new selection algorithms. These methods yield tests that have correct asymptotic size and are asymptotically nonconservative. It is also shown how to obtain an adaptive rate-optimal test that has the best attainable rate of uniform consistency against models whose regression function has Lipschitz-continuous first-order derivatives and that automatically adapts to the unknown smoothness of the regression function. Simulations show that the power of the new tests in many cases significantly exceeds that of some prior tests, e.g., that of Ghosal, Sen, and Van der Vaart (2000). An application of the developed procedures to the dataset of Ellison and Ellison (2011) shows some evidence of strategic entry deterrence in the pharmaceutical industry, where incumbents may use strategic investment to prevent generic entry when their patents expire.
- Published
- 2018
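To make the testing problem concrete, here is a deliberately simplified residual-bootstrap illustration of testing H0: the regression function is nondecreasing. It uses a crude kernel smoother and a null-respecting isotonic fit, and is not the statistic, critical-value method, or selection algorithms developed in the paper:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def max_decrease(x, y, bandwidth=0.1):
    """Largest peak-to-trough drop of a Nadaraya-Watson smooth on a grid."""
    grid = np.linspace(x.min(), x.max(), 50)
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / bandwidth) ** 2)
    m = (w @ y) / w.sum(axis=1)
    return float(np.max(np.maximum.accumulate(m) - m))  # 0 for a monotone fit

def monotonicity_pvalue(x, y, n_boot=499, seed=0):
    """Residual bootstrap: resample data from a nondecreasing (isotonic) fit
    and see how often the bootstrap statistic exceeds the observed one."""
    rng = np.random.default_rng(seed)
    m0 = IsotonicRegression(increasing=True).fit(x, y).predict(x)
    resid = y - m0
    t_obs = max_decrease(x, y)
    t_boot = [max_decrease(x, m0 + rng.choice(resid, size=len(y)))
              for _ in range(n_boot)]
    return (1 + sum(t >= t_obs for t in t_boot)) / (n_boot + 1)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(size=200))
y = np.sin(3 * x) + rng.normal(scale=0.2, size=200)  # non-monotone truth
print(monotonicity_pvalue(x, y))                      # small p-value expected
```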
43. ON THE DISTRIBUTION OF THE EXCEDENTS OF FUNDS WITH ASSETS AND LIABILITIES IN PRESENCE OF SOLVENCY AND RECOVERY REQUIREMENTS
- Author
-
Bernard Wong, Benjamin Avanzi, and Lars Henriksen
- Subjects
Economics and Econometrics ,Solvency ,Pension ,Actuarial science ,Present value ,Solvency ratio ,media_common.quotation_subject ,010102 general mathematics ,Liability ,Context (language use) ,Global assets under management ,Payment ,01 natural sciences ,010104 statistics & probability ,Accounting ,Economics ,Dividend ,Asset (economics) ,0101 mathematics ,Finance ,media_common - Abstract
In this paper, we consider a profitable, risky setting with two separate, correlated asset and liability processes (first introduced by Gerber and Shiu, 2003). The company under consideration is allowed to distribute excess profits (traditionally referred to as dividends in the literature), but is regulated and subject to particular regulatory (solvency) constraints. Because of the bivariate nature of the surplus formulation, such distributions of excess profits can take two alternative forms: they can originate from a reduction of assets (and hence a payment to owners), but also from an increase of liabilities (when these represent the wealth of owners, such as in pension funds). The latter is particularly relevant if distributions of assets do not make sense in context, such as in regulated pension funds where assets are locked until retirement. We extend the model of Gerber and Shiu (2003) and consider recovery requirements for the distribution of excess funds. Such recovery requirements extend the plain vanilla solvency constraints considered in Paulsen (2003) and require funds to reach a higher level of funding than the solvency level (if and after it is triggered) before excess funds can be distributed again. We obtain closed-form expressions for the expected present value of distributions (asset decrements or liability increments) when a distribution barrier is used. The optimal barrier level, as well as its existence and uniqueness, are discussed.
- Published
- 2018
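The barrier-with-recovery mechanics can be sketched in a discrete-time Monte Carlo toy (this is not the Gerber-Shiu bivariate model and does not reproduce the paper's closed forms; all dynamics and parameters are invented for illustration):

```python
import random

def expected_pv_distributions(barrier=1.3, solvency=1.0, recovery=1.15,
                              horizon=200, n_paths=2000, delta=0.03, seed=0):
    """Expected present value of excess-profit distributions under a barrier
    rule with a solvency floor and a stricter recovery requirement.

    The funding ratio follows a random walk; after it breaches `solvency`,
    distributions stay suspended until it first climbs above `recovery`.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        ratio, locked, pv = 1.2, False, 0.0
        for t in range(1, horizon + 1):
            ratio *= 1.0 + rng.gauss(0.01, 0.05)  # illustrative net return
            if ratio < solvency:
                locked = True                      # breach: no distributions
            elif locked and ratio >= recovery:
                locked = False                     # recovered above the bar
            if not locked and ratio > barrier:
                pv += (ratio - barrier) * (1 + delta) ** (-t)  # pay the excess
                ratio = barrier
        total += pv
    return total / n_paths

print(expected_pv_distributions())
```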
44. SET-VALUED CASH SUB-ADDITIVE RISK MEASURES
- Author
-
Fei Sun and Yijun Hu
- Subjects
Statistics and Probability ,Mathematical optimization ,021103 operations research ,Risk measure ,media_common.quotation_subject ,0211 other engineering and technologies ,Scalar (physics) ,02 engineering and technology ,Dual representation ,Extension (predicate logic) ,Management Science and Operations Research ,Characterization (mathematics) ,01 natural sciences ,Industrial and Manufacturing Engineering ,Set (abstract data type) ,010104 statistics & probability ,Time consistency ,Cash ,0101 mathematics ,Statistics, Probability and Uncertainty ,Mathematics ,media_common - Abstract
In this paper, we introduce a new class of set-valued risk measures satisfying cash sub-additivity, and provide a dual representation for them. Moreover, we investigate dynamic set-valued cash sub-additive risk measures and discuss the corresponding multi-portfolio time consistency; an equivalent characterization of multi-portfolio time consistency is given. Finally, an example is given to illustrate the new class of risk measures. The present paper can be considered a set-valued extension of the scalar cash sub-additive risk measures introduced by El Karoui and Ravanelli [8].
- Published
- 2018
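For orientation, the scalar property that the paper lifts to the set-valued setting is, in the convention of El Karoui and Ravanelli (stated here from general knowledge rather than quoted from the paper),

\[
\rho(X + m) \ge \rho(X) - m \qquad \text{for all } m \ge 0,
\]

a weakening of cash additivity $\rho(X + m) = \rho(X) - m$: adding $m$ units of sure cash reduces the capital requirement by at most $m$.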
45. ANALYSIS OF THE NETWORK WITH MULTIPLE CLASSES OF POSITIVE AND NEGATIVE CUSTOMERS AT A TRANSIENT REGIME
- Author
-
Mikhail Matalytski
- Subjects
Statistics and Probability ,Power series ,Sequence ,021103 operations research ,Recurrence relation ,Series (mathematics) ,Computer science ,0211 other engineering and technologies ,02 engineering and technology ,Management Science and Operations Research ,01 natural sciences ,Industrial and Manufacturing Engineering ,010104 statistics & probability ,Variable (computer science) ,Applied mathematics ,Shaping ,Radius of convergence ,G-network ,0101 mathematics ,Statistics, Probability and Uncertainty - Abstract
This paper is devoted to the investigation of a G-network with multiple classes of positive and negative customers. The purpose is to analyze such a network in a transient regime and to find the time-dependent state probabilities of the network. The first part describes the functioning of G-networks with positive and negative customers, in which a negative customer arriving at a system destroys a positive customer of its own class. The streams of positive and negative customers arriving at each of the network systems are independent. Positive customers of all types are selected for service at random. For the nonstationary probabilities of network states, a system of Kolmogorov difference-differential equations (DDEs) is derived, and a method for solving it is proposed, based on a modified method of successive approximations combined with the method of series. It is proved that the successive approximations converge with time to the stationary probability distribution, whose form is given in the paper, and that the sequence of approximations converges to the unique solution of the DDE system. Every successive approximation is representable as a convergent power series with an infinite radius of convergence, whose coefficients satisfy recurrence relations, which is convenient for computer calculations. A model example illustrating the computation of the time-dependent probabilities of network states using this technique is presented. The results can be applied to modeling the behavior of computer viruses and attacks in information and telecommunication systems and networks, for example as a model of the impact of file viruses on server resources.
- Published
- 2018
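The series-based solution strategy can be mimicked in miniature: for a small continuous-time chain the transient distribution is $p(t) = p(0)e^{tQ}$, computable from the same kind of everywhere-convergent power series. The sketch below uses a made-up three-state generator, not the paper's G-network equations:

```python
import numpy as np

def transient_probs(Q, p0, t, n_terms=60):
    """Transient state probabilities p(t) = p0 exp(tQ) for a small CTMC,
    summing the power series sum_k p0 (tQ)^k / k! term by term."""
    p = np.array(p0, dtype=float)
    term = p.copy()
    for k in range(1, n_terms):
        term = term @ Q * (t / k)    # next term: previous one times tQ / k
        p = p + term
    return p

# Hypothetical 3-state generator (each row sums to zero):
Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.5, -1.5,  1.0],
              [ 0.0,  2.0, -2.0]])
print(transient_probs(Q, [1.0, 0.0, 0.0], t=0.8))  # time-dependent probabilities
```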
46. BOUNDS ON EXTROPY WITH VARIATIONAL DISTANCE CONSTRAINT
- Author
-
Wanwan Xia, Taizhong Hu, and Jianping Yang
- Subjects
Statistics and Probability ,Discrete mathematics ,021103 operations research ,0211 other engineering and technologies ,02 engineering and technology ,Management Science and Operations Research ,Mathematical proof ,01 natural sciences ,Upper and lower bounds ,Industrial and Manufacturing Engineering ,Confidence interval ,010104 statistics & probability ,Entropy (information theory) ,Probability distribution ,0101 mathematics ,Statistics, Probability and Uncertainty ,Majorization ,Mathematics - Abstract
The relation between extropy and variational distance is studied in this paper. We determine the distributions that attain the minimum or maximum extropy among the distributions within a given variational distance from any given probability distribution, obtain the tightest upper bound on the difference of extropies of any two probability distributions subject to the variational distance constraint, and establish an analytic formula for the confidence interval of an extropy. Such a study parallels that of Ho and Yeung [3] concerning entropy; however, the proofs of the main results in this paper are different from those in Ho and Yeung [3]. In fact, our arguments simplify several proofs in Ho and Yeung [3].
- Published
- 2018
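For reference, the discrete extropy of $p = (p_1, \dots, p_n)$ and the variational distance between $p$ and $q$ are commonly defined as (standard definitions from the extropy literature, not quoted from the paper)

\[
J(p) = -\sum_{i=1}^{n} (1 - p_i)\log(1 - p_i), \qquad V(p, q) = \sum_{i=1}^{n} \lvert p_i - q_i \rvert,
\]

and the paper's bounds describe how far $J$ can move when $V$ is constrained.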
47. On age difference in joint lifetime modelling with life insurance annuity applications
- Author
-
Youssouf A. F. Toukourou, Enkelejd Hashorva, Gildas Ratovomirija, and François Dufresne
- Subjects
Statistics and Probability ,Economics and Econometrics ,050208 finance ,Actuarial science ,Estimation theory ,media_common.quotation_subject ,05 social sciences ,Novelty ,01 natural sciences ,Copula (probability theory) ,Social group ,010104 statistics & probability ,Goodness of fit ,Joint probability distribution ,Life insurance ,0502 economics and business ,Economics ,Wife ,0101 mathematics ,Statistics, Probability and Uncertainty ,media_common - Abstract
Insurance and annuity products covering several lives require modelling the joint distribution of future lifetimes. In the interest of simplifying calculations, it is common in practice to assume that the future lifetimes within a group of people are independent; however, extensive research over the past decades suggests otherwise. In this paper, a copula approach is used to model the dependence between the lifetimes within a married couple, using data from a large Canadian insurance company. As a novelty, the age difference and the gender of the elder partner are introduced as arguments of the dependence parameter. Maximum likelihood techniques are implemented for the parameter estimation. Not only do the results make clear that the correlation decreases with the age difference, but the dependence between the lifetimes is also higher when the husband is older than the wife. A goodness-of-fit procedure is applied to assess the validity of the model. Finally, considering several annuity products available on the life insurance market, the paper concludes with practical illustrations.
- Published
- 2018
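A minimal sketch of letting the copula parameter depend on the age difference, assuming a Clayton copula and a hypothetical log-linear link (neither is claimed to be the paper's actual specification, and the data below are synthetic):

```python
import numpy as np
from scipy.optimize import minimize

def clayton_logpdf(u, v, theta):
    """Log-density of the Clayton copula for theta > 0."""
    return (np.log1p(theta) - (theta + 1) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(u ** -theta + v ** -theta - 1))

def neg_loglik(params, u, v, age_diff):
    """theta = exp(a + b * age_diff): the age gap drives the dependence."""
    a, b = params
    theta = np.exp(a + b * age_diff)
    return -np.sum(clayton_logpdf(u, v, theta))

# Synthetic stand-ins: uniformized lifetimes of the two partners and age gaps.
rng = np.random.default_rng(7)
u, v = rng.uniform(size=500), rng.uniform(size=500)
d = rng.integers(-5, 10, size=500)
fit = minimize(neg_loglik, x0=[0.0, 0.0], args=(u, v, d), method="Nelder-Mead")
print(fit.x)  # b < 0 would mean dependence weakens as the age gap grows
```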
48. An NDC approach to helping pensioners cope with the cost of long-term care
- Author
-
Carlos Vidal-Meliá, Javier Pla-Porcel, and Manuel Ventura-Marco
- Subjects
Attractiveness ,Organizational Behavior and Human Resource Management ,Economics and Econometrics ,Strategy and Management ,Pay as you go ,media_common.quotation_subject ,Overlapping generations model ,01 natural sciences ,Industrial and Manufacturing Engineering ,Social insurance ,010104 statistics & probability ,Economics ,050602 political science & public administration ,0101 mathematics ,Long-term care insurance ,Notional amount ,Function (engineering) ,media_common ,Pension ,Actuarial science ,Mechanical Engineering ,05 social sciences ,Metals and Alloys ,0506 political science ,Social security ,Long-term care ,Business ,Finance - Abstract
The aim of this paper is to analyse whether it would be possible to provide retirement and long-term care benefits using the same unfunded notional defined contribution scheme. We extend the multi-state overlapping generations model developed by Pla-Porcel et al. (2016) to include two new features: a long-term care benefit graded according to the annuitant's degree of disability and a minimum pension benefit for both contingencies. This brings the model closer to the reality of social insurance and enhances its political attractiveness. The paper contains a numerical example to show how the model functions and focuses especially on the mortality rates for dependent persons, the inception rates from a healthy state to (any) disability state, and the probabilities of transition between one health status and another. The numerical example proves that the model works reasonably well and makes it clear that it has practical implications that could be of interest to policy makers. It also provides us with some useful values regarding the impact of introducing a minimum pension on the system's financial equilibrium and reinforces the fact that biometric assumptions need to be estimated accurately before any decision is made to put the model into practice.
- Published
- 2018
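The multi-state machinery underlying such a scheme can be illustrated with a toy annual transition matrix among pensioner health states (all numbers invented; the paper's model grades the benefit by degree of disability, whereas this toy has a single dependent state):

```python
import numpy as np

# States: healthy, dependent (receiving long-term care), dead.
P = np.array([[0.90, 0.06, 0.04],   # healthy  -> healthy / dependent / dead
              [0.00, 0.80, 0.20],   # dependent (no recovery in this toy)
              [0.00, 0.00, 1.00]])  # dead is absorbing

state = np.array([1.0, 0.0, 0.0])   # cohort starts healthy at retirement
for year in range(1, 6):
    state = state @ P               # one year of transitions
    print(year, state.round(3))     # shares healthy / dependent / dead
```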
49. THE STRONG LIMIT THEOREM FOR RELATIVE ENTROPY DENSITY RATES BETWEEN TWO ASYMPTOTICALLY CIRCULAR MARKOV CHAINS
- Author
-
Ying Tang, Yue Zhang, and Weiguo Yang
- Subjects
Statistics and Probability ,021103 operations research ,Kullback–Leibler divergence ,Markov chain ,Integrable system ,0211 other engineering and technologies ,02 engineering and technology ,Management Science and Operations Research ,01 natural sciences ,Industrial and Manufacturing Engineering ,010104 statistics & probability ,Law of large numbers ,Asymptotic equipartition property ,Limit (mathematics) ,Statistical physics ,0101 mathematics ,Statistics, Probability and Uncertainty ,Mathematics - Abstract
In this paper, we study the strong limit theorem for the relative entropy density rates between two finite asymptotically circular Markov chains. We first prove some lemmas on which the main result is based. Then, we establish two strong limit theorems for nonhomogeneous Markov chains. Finally, we obtain the main result of this paper. As corollaries, we get the strong limit theorem for the relative entropy density rates between two finite nonhomogeneous Markov chains. We also prove that the relative entropy density rates between two finite nonhomogeneous Markov chains are uniformly integrable under some conditions.
- Published
- 2018
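As a reference point, in the stationary homogeneous baseline the relative entropy density rate between chains with transition matrices $P = (p_{ij})$ and $Q = (q_{ij})$ reduces to the standard identity (stated for orientation, not quoted from the paper)

\[
\lim_{n \to \infty} \frac{1}{n} \log \frac{\mathbb{P}(X_1, \dots, X_n)}{\mathbb{Q}(X_1, \dots, X_n)}
= \sum_{i} \pi_i \sum_{j} p_{ij} \log \frac{p_{ij}}{q_{ij}} \qquad \mathbb{P}\text{-a.s.},
\]

where $\pi$ is the stationary distribution of $P$; the paper extends limits of this kind to asymptotically circular, nonhomogeneous chains.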
50. An analysis of power law distributions and tipping points during the global financial crisis
- Author
-
Yifei Li, Lei Shi, John R. Evans, and Neil L. Allan
- Subjects
Statistics and Probability ,Economics and Econometrics ,050208 finance ,Financial economics ,Financial risk ,05 social sciences ,Tipping point (climatology) ,01 natural sciences ,Operational risk ,010104 statistics & probability ,symbols.namesake ,Market risk ,0502 economics and business ,Financial crisis ,symbols ,Economics ,Default ,Pareto distribution ,0101 mathematics ,Statistics, Probability and Uncertainty ,Credit risk - Abstract
Heavy-tailed distributions have been observed for various financial risks, and several papers have observed that these heavy-tailed distributions are power law distributions. The breakdown of a power law distribution is also seen as an indicator that a tipping point has been reached, after which a system moves from stability through instability to a new equilibrium. In this paper, we analyse the distribution of operational risk losses in US banks, credit defaults in US corporates, and market risk events in the US during the global financial crisis (GFC). We conclude that market risk and credit risk do not follow a power law distribution and that, even though operational risk follows a power law distribution, a better distribution fit exists for operational risk. We also conclude that whilst there is evidence that credit defaults and market risks did reach a tipping point, operational risk losses did not. Finally, we suggest that government intervention in the banking system during the GFC is a possible reason why banks avoided a tipping point.
- Published
- 2018
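Power-law tail fitting of the kind the paper relies on is routinely done with the Pareto maximum-likelihood (Hill) estimator; a minimal sketch on synthetic data (the threshold and data are illustrative, not the paper's):

```python
import numpy as np

def powerlaw_alpha_mle(losses, x_min):
    """MLE tail index for P(X > x) ~ x^{-alpha} above the threshold x_min:
    alpha_hat = n / sum(log(x_i / x_min)), the Hill estimator. A poor fit or
    an unstable alpha_hat across thresholds is the kind of power-law breakdown
    the paper reads as a tipping-point signal."""
    tail = np.asarray([x for x in losses if x > x_min], dtype=float)
    return len(tail) / np.sum(np.log(tail / x_min))

rng = np.random.default_rng(42)
sample = rng.pareto(2.5, size=5000) + 1.0     # classical Pareto, true alpha = 2.5
print(powerlaw_alpha_mle(sample, x_min=1.0))  # estimate should sit near 2.5
```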