280 results
Search Results
2. Statistical models for short- and long-term forecasts of snow depth.
- Author
- Hammer, Hugo Lewi
- Subjects
- STATISTICS, MATHEMATICAL models, STATISTICAL models, SNOW, METEOROLOGY
- Abstract
Forecasting of future snow depths is useful for many applications like road safety, winter sport activities, avalanche risk assessment and hydrology. Motivated by the lack of statistical forecast models for snow depth, in this paper we present a set of models to fill this gap. First, we present a model to do short-term forecasts when we assume that reliable weather forecasts of air temperature and precipitation are available. The covariates are included nonlinearly into the model following basic physical principles of snowfall, snow aging and melting. Due to the large set of observations with snow depth equal to zero, we use a zero-inflated gamma regression model, which is commonly used for similar applications like precipitation. We also do long-term forecasts of snow depth, much further ahead than traditional weather forecasts for temperature and precipitation. The long-term forecasts are based on fitting models to historic time series of precipitation, temperature and snow depth. We fit the models to data from six locations in Norway with different climatic and vegetation properties. Forecasting five days into the future, the results showed that, given reliable weather forecasts of temperature and precipitation, the forecast errors in absolute value were between 3 and 7 cm for different locations in Norway. Forecasting three weeks into the future, the forecast errors were between 7 and 16 cm. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
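The zero-inflated gamma formulation mentioned in the abstract above can be illustrated with a small simulation. This is only a sketch, not the authors' fitted model: the logistic model for the zero probability, the log-link gamma mean and every coefficient below are made-up assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical covariate: air temperature in degrees Celsius
temp = rng.uniform(-15, 10, size=1000)

# Probability of an exact zero snow depth rises with temperature
# (logistic form with made-up coefficients)
p_zero = 1.0 / (1.0 + np.exp(-(0.4 * temp + 1.0)))

# Positive depths follow a gamma distribution with a log-link mean (cm)
shape = 2.0
mean_depth = np.exp(3.0 - 0.1 * temp)      # decreases as temperature rises
scale = mean_depth / shape

is_zero = rng.uniform(size=temp.size) < p_zero
depth = np.where(is_zero, 0.0, rng.gamma(shape, scale))

print(f"fraction of exact zeros: {is_zero.mean():.2f}")
print(f"mean positive depth: {depth[~is_zero].mean():.1f} cm")
```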
3. Partially linear models and their applications to change point detection of chemical process data.
- Author
- Ferreira, Clécio S., Zeller, Camila B., Mimura, Aparecida M. S., and Silva, Júlio C. J.
- Subjects
- DATA analysis, BEER-Lambert law, LIGHT absorption, LIGHT absorbance, LINEAR statistical models, MATHEMATICAL models
- Abstract
In many chemical data sets, the amount of radiation absorbed (absorbance) is related to the concentration of the element in the sample by Lambert–Beer's law. However, this relation changes abruptly when the variable concentration reaches an unknown threshold level, the so-called change point. In the context of analytical chemistry, there are many methods that describe the relationship between absorbance and concentration, but none of them provide inferential procedures to detect change points. In this paper, we propose partially linear models with a change point separating the parametric and nonparametric components. The Schwarz information criterion is used to locate a change point. A back-fitting algorithm is presented to obtain parameter estimates and the penalized Fisher information matrix is obtained to calculate the standard errors of the parameter estimates. To examine the proposed method, we present a simulation study. Finally, we apply the method to data sets from the chemistry area. The partially linear models with a change point developed in this paper are useful supplements to other methods of absorbance–concentration analysis in chemical studies, for example, and in many other practical applications. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
4. Nonparametric predictive inference for stock returns.
- Author
- Baker, Rebecca M., Coolen-Maturi, Tahani, and Coolen, Frank P. A.
- Subjects
- MATHEMATICAL models, RATE of return on stocks, INFERENTIAL statistics, PREDICTION models, DATA distribution, MATHEMATICAL variables
- Abstract
In finance, inferences about future asset returns are typically quantified with the use of parametric distributions and single-valued probabilities. It is attractive to use less restrictive inferential methods, including nonparametric methods which do not require distributional assumptions about variables, and imprecise probability methods which generalize the classical concept of probability to set-valued quantities. Main attractions include the flexibility of the inferences to adapt to the available data and that the level of imprecision in inferences can reflect the amount of data on which these are based. This paper introduces nonparametric predictive inference (NPI) for stock returns. NPI is a statistical approach based on few assumptions, with inferences strongly based on data and with uncertainty quantified via lower and upper probabilities. NPI is presented for inference about future stock returns, as a measure for risk and uncertainty, and for pairwise comparison of two stocks based on their future aggregate returns. The proposed NPI methods are illustrated using historical stock market data. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
5. S-estimator in partially linear regression models.
- Author
- Jiang, Yunlu
- Subjects
- REGRESSION analysis, DATA analysis, MATHEMATICAL models, ERROR analysis in mathematics, MATHEMATICAL optimization
- Abstract
In this paper, a robust estimator is proposed for partially linear regression models. We first estimate the nonparametric component using a penalized regression spline, and then construct an estimator of the parametric component using a robust S-estimator. We propose an iterative algorithm to solve the resulting optimization problem, and introduce a robust generalized cross-validation criterion to select the penalty parameter. Simulation studies and a real data analysis illustrate that the proposed method is robust against outliers in the dataset and against errors with heavy tails. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
6. Local influence diagnostics for generalized linear mixed models with overdispersion.
- Author
- Rakhmawati, Trias Wahyuni, Molenberghs, Geert, Verbeke, Geert, and Faes, Christel
- Subjects
- STATISTICAL models, MATHEMATICAL models, RANDOM effects model, POISSON algebras, POISSON distribution
- Abstract
Since the seminal paper by Cook and Weisberg [9], local influence, next to case deletion, has gained popularity as a tool to detect influential subjects and measurements for a variety of statistical models. For the linear mixed model the approach leads to easily interpretable and computationally convenient expressions, not only highlighting influential subjects, but also which aspect of their profile leads to undue influence on the model's fit [17]. Ouwens et al. [24] applied the method to the Poisson-normal generalized linear mixed model (GLMM). Given the model's nonlinear structure, these authors did not derive interpretable components but rather focused on a graphical depiction of influence. In this paper, we consider GLMMs for binary, count, and time-to-event data, with the additional feature of accommodating overdispersion whenever necessary. For each situation, three approaches are considered, based on: (1) purely numerical derivations; (2) using a closed-form expression of the marginal likelihood function; and (3) using an integral representation of this likelihood. Unlike when case deletion is used, this leads to interpretable components, allowing not only to identify influential subjects, but also to study the cause thereof. The methodology is illustrated in case studies that range over the three data types mentioned. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
7. Critical values improvement for the standard normal homogeneity test by combining Monte Carlo and regression approaches.
- Author
- Rienzner, Michele and Ieva, Francesca
- Subjects
- MONTE Carlo method, HOMOGENEITY, REGRESSION analysis, MATHEMATICAL models, NUMERICAL analysis, PROBABILITY theory
- Abstract
The distribution of the test statistics of homogeneity tests is often unknown, requiring the estimation of the critical values through Monte Carlo (MC) simulations. The computation of the critical values at low α, especially when the distribution of the statistics changes with the series length (sample cardinality), requires a considerable number of simulations to achieve a reasonable precision of the estimates (i.e. 10^6 simulations or more for each series length). If, in addition, the test requires a noteworthy computational effort, the estimation of the critical values may need unacceptably long runtimes. To overcome the problem, the paper proposes a regression-based refinement of an initial MC estimate of the critical values, also allowing an approximation of the achieved improvement. Moreover, the paper presents an application of the method to two tests: SNHT (standard normal homogeneity test, widely used in climatology), and SNH2T (a version of SNHT showing a squared numerical complexity). For both, the paper reports the critical values for α ranging between 0.1 and 0.0001 (useful for p-value estimation), and the series length ranging from 10 (widely adopted size in climatological change-point detection literature) to 70,000 elements (nearly the length of a daily data time series 200 years long), estimated with coefficients of variation within 0.22%. For SNHT, a comparison of our results with approximated, theoretically derived, critical values is also performed; we suggest adopting those values for series exceeding 70,000 elements. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
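The two-stage recipe in the abstract above, a crude Monte Carlo pass followed by a regression refinement across series lengths, can be sketched generically. The SNHT statistic below follows its usual textbook definition; the quadratic-in-log(n) regression, the number of simulations and the significance level are illustrative assumptions rather than the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def snht_stat(x):
    """Standard normal homogeneity test statistic for a single series."""
    n = x.size
    z = (x - x.mean()) / x.std(ddof=1)
    k = np.arange(1, n)                  # candidate change points
    csum = np.cumsum(z)[:-1]
    z1 = csum / k                        # mean of the first k standardized values
    z2 = (z.sum() - csum) / (n - k)      # mean of the remaining values
    return np.max(k * z1 ** 2 + (n - k) * z2 ** 2)

def mc_critical_value(n, alpha=0.05, n_sim=5000):
    """Crude Monte Carlo estimate of the (1 - alpha) critical value."""
    stats = np.array([snht_stat(rng.standard_normal(n)) for _ in range(n_sim)])
    return np.quantile(stats, 1.0 - alpha)

# Refine the noisy MC estimates by regressing them on a smooth function of log(n)
lengths = np.array([20, 50, 100, 200, 500])
raw = np.array([mc_critical_value(n) for n in lengths])
X = np.column_stack([np.ones(lengths.size), np.log(lengths), np.log(lengths) ** 2])
coef, *_ = np.linalg.lstsq(X, raw, rcond=None)
smoothed = X @ coef
for n, r, s in zip(lengths, raw, smoothed):
    print(f"n={n:4d}  raw={r:6.2f}  smoothed={s:6.2f}")
```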
8. Holt-Winters Forecasting: An Alternative Formulation Applied to UK Air Passenger Data.
- Author
- Bermúdez, J.D., Segura, J.V., and Vercher, E.
- Subjects
- FORECASTING, STATISTICAL smoothing, TIME series analysis, MONTE Carlo method, PREDICTION models, MATHEMATICAL models
- Abstract
This paper provides a formulation for the additive Holt-Winters forecasting procedure that simplifies both obtaining maximum likelihood estimates of all unknowns (smoothing parameters and initial conditions) and the computation of point forecasts and reliable predictive intervals. The stochastic component of the model is introduced by means of additive, uncorrelated, homoscedastic and Normal errors, and then the joint distribution of the data vector, a multivariate Normal distribution, is obtained. In the case where a data transformation was used to improve the fit of the model, cumulative forecasts are obtained here using a Monte-Carlo approximation. This paper describes the method by applying it to the series of monthly total UK air passengers collected by the Civil Aviation Authority, a long time series from 1949 to the present day, and compares the resulting forecasts with those obtained in previous studies. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
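As a point of reference for the entry above, the classical additive Holt-Winters recursions (not the paper's maximum-likelihood formulation) fit in a few lines; the smoothing constants and the toy monthly series are assumptions for illustration, not the UK air passenger data.

```python
import numpy as np

def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.3, horizon=12):
    """Classical additive Holt-Winters recursions; m is the seasonal period."""
    y = np.asarray(y, dtype=float)
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = y[:m] - level                        # initial seasonal indices

    for t in range(len(y)):
        s = season[t % m]
        prev_level = level
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * s

    h = np.arange(1, horizon + 1)
    return level + h * trend + season[(len(y) + h - 1) % m]

# Toy monthly series with trend and seasonality
t = np.arange(120)
y = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) \
    + np.random.default_rng(2).normal(0, 2, size=120)
print(holt_winters_additive(y, m=12).round(1))    # 12 point forecasts
```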
9. Modelling uncertainty in insurance Bonus–Malus premium principles by using a Bayesian robustness approach.
- Author
- Gómez-Déniz, Emilio and Vázquez-Polo, Francisco J.
- Subjects
- INSURANCE premiums, MATHEMATICS in insurance, MATHEMATICAL models, UNCERTAINTY, BAYESIAN analysis, ROBUST statistics
- Abstract
When Bayesian models are implemented for a Bonus–Malus System (BMS), a parametric structure, π0(λ), is normally included in the insurer's portfolio. Following Bayesian sensitivity analysis, it is possible to model the structure function by specifying a class Γ of priors instead of a single prior. This paper examines the ranges of the relativities of this form. Standard and robust Bayesian tools are combined to show how the choice of the prior can affect the relative premiums. As an extension of the paper by Gómez et al. (2002b), our model is developed for the variance premium principle and the class of prior densities is extended to ones that are more realistic in an actuarial setting, i.e. classes of generalized moments conditions. The proposed method is illustrated with data from Lemaire (1979). The main aim of the paper is to demonstrate an appropriate methodology to perform a Bayesian sensitivity analysis of the Bonus–Malus loaded premiums. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
10. Multivariate Bayes Wavelet shrinkage and applications.
- Author
- Huerta, Gabriel
- Subjects
- WAVELETS (Mathematics), HARMONIC analysis (Mathematics), MATHEMATICAL analysis, MONTE Carlo method, MATHEMATICAL models, MATHEMATICS
- Abstract
In recent years, wavelet shrinkage has become a very appealing method for data de-noising and density function estimation. In particular, Bayesian modelling via hierarchical priors has introduced novel approaches for wavelet analysis that have become very popular, and are very competitive with standard hard or soft thresholding rules. In this sense, this paper proposes a hierarchical prior that is elicited on the model parameters describing the wavelet coefficients after applying a Discrete Wavelet Transformation (DWT). In contrast to other approaches, the prior proposes a multivariate Normal distribution with a covariance matrix that allows for correlations among wavelet coefficients corresponding to the same level of detail. In addition, an extra scale parameter is incorporated that permits an additional shrinkage level over the coefficients. The posterior distribution for this shrinkage procedure is not available in closed form but it is easily sampled through Markov chain Monte Carlo (MCMC) methods. Applications on a set of test signals and two noisy signals are presented. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
11. A wavelet-based approach to the analysis and modelling of financial time series exhibiting strong long-range dependence: the case of Southeast Europe.
- Author
- Bogdanova, Boryana and Ivanov, Ivan
- Subjects
- FINANCIAL statistics, WAVELETS (Mathematics), TIME series analysis, DEPENDENCE (Statistics), MATHEMATICAL models, RATE of return on stocks
- Abstract
This paper demonstrates the utilization of wavelet-based tools for the analysis and prediction of financial time series exhibiting strong long-range dependence (LRD). Emerging markets' stock returns are commonly characterized by LRD. Therefore, we track the LRD evolvement for the return series of six Southeast European stock indices through the application of a wavelet-based semi-parametric method. We further engage the à trous wavelet transform in order to extract deeper knowledge on the returns term structure and utilize it for prediction purposes. In particular, a multiscale autoregressive (MAR) model is fitted and its out-of-sample forecast performance is benchmarked to that of ARMA. Additionally, a data-driven MAR feature selection procedure is outlined. We find that the wavelet-based method captures LRD dynamics adequately, both in calm and in turmoil periods, detecting the presence of transitional changes. At the same time, the MAR model handles the complicated autocorrelation structure implied by the LRD in a parsimonious way, achieving better performance. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
12. Degree course change and student performance: a mixed-effect model approach.
- Author
- Adelfio, Giada and Boscaino, Giovanni
- Subjects
- ACADEMIC achievement, MATHEMATICAL models, ACADEMIC degrees, MULTILEVEL models, COLLEGE credits, ZERO-inflated probability distribution, LONGITUDINAL method, UNIVERSITIES & colleges
- Abstract
This paper focuses on students' credit-earning speed over time and its determinants, addressing the large percentage of students who do not obtain the degree within the legal duration in the Italian University System. A new indicator for the performance of the student career is proposed on real data, concerning the cohort of students enrolled at a Faculty of the University of Palermo (followed for 7 years). The new indicator highlights a typical zero-inflated distribution and suggests investigating the effect of the degree course (DC) change on the student career. A mixed-effect model for overdispersed data is considered, with the aim of also taking into account the individual variability due to the longitudinal nature of the data. Results show the significant positive effect of the DC change on student performance. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
13. Some notes on robust sure independence screening.
- Author
- Mu, Weiyan and Xiong, Shifeng
- Subjects
- LINEAR statistical models, FAILURE time data analysis, MATHEMATICAL models, NONLINEAR statistical models, SIMULATION methods & models
- Abstract
Sure independence screening (SIS) proposed by Fan and Lv [4] uses marginal correlations to select important variables, and has proven to be an efficient method for ultrahigh-dimensional linear models. This paper provides two robust versions of SIS against outliers. The two methods, respectively, replace the sample correlation in SIS with two robust measures, and screen variables by ranking them. Like SIS, the proposed methods are simple and fast. In addition, they are highly robust against a substantial fraction of outliers in the data. These features make them applicable to large datasets which may contain outliers. Simulation results are presented to show their effectiveness. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
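The abstract above does not spell out the two robust correlation measures, so the sketch below substitutes a generic robust marginal screen: predictors are ranked by the absolute Spearman rank correlation with the response. The data-generating setup, with heavy-tailed errors and two truly active predictors, is invented for illustration.

```python
import numpy as np

def rank_correlation_screen(X, y, keep):
    """Rank predictors by |Spearman correlation| with y and keep the top ones."""
    def ranks(v):
        return np.argsort(np.argsort(v))
    ry = ranks(y)
    scores = np.array([abs(np.corrcoef(ranks(X[:, j]), ry)[0, 1])
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:keep]

rng = np.random.default_rng(3)
n, p = 200, 2000
X = rng.standard_normal((n, p))
y = 3 * X[:, 0] - 2 * X[:, 5] + rng.standard_t(df=2, size=n)  # heavy-tailed noise
print(rank_correlation_screen(X, y, keep=10))                  # should contain 0 and 5
```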
14. Dummy variables vs. category-wise models.
- Author
- Holgersson, H.E.T., Nordström, L., and Öner, Ö.
- Subjects
- EMPIRICAL research, REGRESSION analysis, MONTE Carlo method, MATHEMATICAL variables, MATHEMATICAL models
- Abstract
Empirical research frequently involves regression analysis with binary categorical variables, which are traditionally handled through dummy explanatory variables. This paper argues that separate category-wise models may provide a more logical and comprehensive tool for analysing data with binary categories. Exploring different aspects of both methods, we contrast the two with a Monte Carlo simulation and an empirical example to provide a practical insight. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
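A minimal numerical contrast between the two approaches discussed in the entry above, a single dummy-variable regression versus separate category-wise fits, might look as follows. The data-generating model, with a group-specific slope that the common-slope dummy model cannot capture, is an assumption chosen to make the difference visible.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400
group = rng.integers(0, 2, size=n)             # binary categorical variable
x = rng.standard_normal(n)
y = 1.0 + 0.5 * group + (1.0 + 1.5 * group) * x + rng.normal(0, 0.5, size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Dummy-variable model: shifted intercept, common slope
beta_dummy = ols(np.column_stack([np.ones(n), group, x]), y)

# Category-wise models: a separate fit within each group
g0, g1 = group == 0, group == 1
beta_g0 = ols(np.column_stack([np.ones(g0.sum()), x[g0]]), y[g0])
beta_g1 = ols(np.column_stack([np.ones(g1.sum()), x[g1]]), y[g1])

print("dummy model (intercept, group effect, slope):", beta_dummy.round(2))
print("group 0 fit (intercept, slope):", beta_g0.round(2))
print("group 1 fit (intercept, slope):", beta_g1.round(2))
```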
15. The Rayleigh-Lindley model: properties and applications.
- Author
- Gómez, Yolanda M., Gallardo, Diego I., Iriarte, Yuri, and Bolfarine, Heleno
- Subjects
- PROBABILITY theory, ALGORITHMS, MATHEMATICS, ALGEBRA, MATHEMATICAL models
- Abstract
In this paper, the Rayleigh-Lindley (RL) distribution is introduced, obtained by compounding the Rayleigh and Lindley discrete distributions, where the compounding procedure follows an approach similar to the one previously studied by Adamidis and Loukas in some other contexts. The resulting distribution is a two-parameter model, which is competitive with other parsimonious models such as the gamma and Weibull distributions. We study some properties of this new model such as the moments and the mean residual life. The estimation was approached via the EM algorithm. The behavior of these estimators was studied in finite samples through a simulation study. Finally, we report two real data illustrations in order to show the performance of the proposed model versus other common two-parameter models in the literature. The main conclusion is that the proposed model can be a valid alternative to other competing models well established in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
16. Bayesian variable selection and coefficient estimation in heteroscedastic linear regression model.
- Author
- Alshaybawee, Taha, Alhamzawi, Rahim, Midi, Habshah, and Allyas, Intisar Ibrahim
- Subjects
- BAYESIAN analysis, REGRESSION analysis, ECONOMETRICS, HETEROSCEDASTICITY, LIFE sciences, MATHEMATICAL models
- Abstract
In many real applications, such as econometrics, biological sciences, radio-immunoassay, finance, and medicine, the usual assumption of constant error variance may be unrealistic. Ignoring heteroscedasticity (non-constant error variance), if it is present in the data, may lead to incorrect inferences and inefficient estimation. In this paper, a simple and efficient Gibbs sampling algorithm is proposed, based on a heteroscedastic linear regression model with an
penalty. Then, a Bayesian stochastic search variable selection method is proposed for subset selection. Simulations and real data examples are used to compare the performance of the proposed methods with other existing methods. The results indicate that the proposal performs well in the simulations and real data examples. R code is available upon request. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
17. Skew-mixed effects model for multivariate longitudinal data with categorical outcomes and missingness.
- Author
- Eftekhari Mahabadi, S. and Rahimi Jafari, E.
- Subjects
- PROBABILITY theory, MISSING data (Statistics), SEARCH algorithms, NUMERICAL analysis, MATHEMATICAL models
- Abstract
A longitudinal study commonly follows a set of variables, measured for each individual repeatedly over time, and usually suffers from the incomplete data problem. A common approach for dealing with longitudinal categorical responses is to use the Generalized Linear Mixed Model (GLMM). This model induces the potential relation between response variables over time via a vector of random effects, assumed to be shared parameters in the non-ignorable missing mechanism. Most GLMMs assume that the random-effects parameters follow a normal or symmetric distribution, and this leads to serious problems in real applications. In this paper, we propose GLMMs for the analysis of incomplete multivariate longitudinal categorical responses with a non-ignorable missing mechanism based on a shared parameter framework with the less restrictive assumption of skew-normality for the random effects. These models can accommodate incomplete data with monotone and non-monotone missing patterns. The performance of the model is evaluated using simulation studies, and a well-known longitudinal data set extracted from a fluvoxamine trial is analyzed to determine the profile of fluvoxamine in ambulatory clinical psychiatric practice. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
18. Heteroskedastic linear regression model with compositional response and covariates.
- Author
- Chen, Jiajia, Zhang, Xiaoqin, and Li, Shengjia
- Subjects
- REGRESSION analysis, HETEROSCEDASTICITY, LEAST squares, MATHEMATICAL models, ANALYSIS of covariance
- Abstract
Compositional data are a sort of complex multidimensional data with the feature that they reflect relative rather than absolute information. There are a variety of models for regression analysis with compositional variables. As in traditional regression analysis, heteroskedasticity can still be present in these models. However, the existing heteroskedastic regression analysis methods cannot be applied to these models with a compositional error term. In this paper, we mainly study the heteroskedastic linear regression model with compositional response and covariates. The parameter estimator is obtained through the weighted least squares method. For the hypothesis test of the parameters, the test statistic is based on the original least squares estimator and the corresponding heteroskedasticity-consistent covariance matrix estimator. When the proposed method is applied to both a simulation and a real example, we use the original least squares method as a comparison throughout. The results demonstrate the model's practicality and effectiveness in regression analysis with heteroskedasticity. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
19. Varying-coefficient single-index measurement error model.
- Author
- Zhao, Xin and Huang, Zhensheng
- Subjects
- MATHEMATICAL models, NUMERICAL analysis, STATISTICAL models, SIMULATION methods & models, LEAST squares
- Abstract
This paper proposes a varying-coefficient single-index measurement error model, which allows for measurement error in the index covariates. We combine the simulation-extrapolation technique, local linear regression and the weighted least-squares method to estimate the unknowns of the current model, and develop the asymptotic properties of the resulting estimators under some conditions. A simulation study is conducted to evaluate the proposed methodology, and a real example is also studied to illustrate it. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
20. Two simple measures of variability for categorical data.
- Author
- Allaj, Erindi
- Subjects
- DATA analysis, DIFFERENCES, APPROXIMATION theory, MODEL validation, MATHEMATICAL models
- Abstract
This paper proposes two new variability measures for categorical data. The first variability measure is obtained as one minus the square root of the sum of the squares of the relative frequencies of the different categories. The second measure is obtained by standardizing the first measure. The measures proposed are functions of the variability measure proposed by Gini [Variabilità e Mutabilità: Contributo allo Studio delle Distribuzioni e delle Relazioni Statistiche, C. Cuppini, Bologna, 1912] and approximate the coefficient of nominal variation introduced by Kvålseth [Coefficients of variation for nominal and ordinal categorical data, Percept. Motor Skills 80 (1995), pp. 843-847] when the number of categories increases. Different mathematical properties of the proposed variability measures are studied and analyzed. Several examples illustrate how the variability measures can be interpreted and used in practice. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
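The first measure is stated explicitly in the abstract above (one minus the square root of the sum of squared relative frequencies). The standardization behind the second measure is not given there, so the sketch below assumes division by the maximum value, which is attained when all k categories are equally frequent.

```python
import numpy as np
from collections import Counter

def variability(labels):
    """First measure: 1 - sqrt(sum of squared relative frequencies)."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return 1.0 - np.sqrt(np.sum(p ** 2))

def standardized_variability(labels, k=None):
    """Second measure (assumed standardization): divide by the maximum,
    1 - sqrt(1/k), reached under equal frequencies across k categories."""
    k = k or len(set(labels))
    return variability(labels) / (1.0 - np.sqrt(1.0 / k))

data = ["A"] * 50 + ["B"] * 30 + ["C"] * 20
print(round(variability(data), 3))
print(round(standardized_variability(data), 3))
```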
21. Causality and interdependence analysis in linear econometric models with an application to fertility.
- Author
- Barbieri, Laura
- Subjects
- ECONOMETRIC models, ECONOMETRICS, MATHEMATICAL models, FEEDBACK control systems, LOOPS (Group theory), STOCHASTIC control theory
- Abstract
This paper is an applied analysis of the causal structure of linear multi-equational econometric models. Its aim is to identify the kind of relationships linking the endogenous variables of the model, distinguishing between causal links and feedback loops. The investigation is first carried out within a deterministic framework and then moves on to show how the results may change inside a more realistic stochastic context. The causal analysis is then specifically applied to a linear simultaneous equation model explaining fertility rates. The analysis is carried out by means of a specific RATS programming code designed to show the specific nature of the relationships within the model. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
22. To move or not to move to find a new job: spatial duration time model with dynamic covariate effects.
- Author
- Kauermann, Göran and Westerheide, Nina
- Subjects
- ANALYSIS of covariance, UNEMPLOYMENT, MATHEMATICAL models, RANDOM variables
- Abstract
The aim of this paper is to show the flexibility and capacity of penalized spline smoothing as an estimation routine for modelling duration time data. We analyse the unemployment behaviour in Germany between 2000 and 2004 using a massive database from the German Federal Employment Agency. To investigate dynamic covariate effects and differences between competing job markets depending on the distance between the former and the recent working place, a functional duration time model with competing risks is used. It is built upon a competing hazard function where some of the smooth covariate effects are allowed to vary with unemployment duration. The focus of our analysis is on contrasting the spatial, economic and individual covariate effects of the competing job markets and on analysing their general influence on the unemployed's re-employment probabilities. As a result of our analyses, we reveal differences concerning gender, age and education. We also discover an effect distinguishing the newly formed states from the old West German states. Moreover, the spatial pattern between the considered job markets differs. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
23. Empirical likelihood ratio test for a mean change point model with a linear trend followed by an abrupt change.
- Author
- Ning, Wei
- Subjects
- CHANGE-point problems, MATHEMATICAL statistics, NONPARAMETRIC statistics, MATHEMATICAL models, HEART transplantation
- Abstract
In this paper, a change point model is studied in which the mean is constant up to some unknown point, increases linearly to another unknown point, and then drops back to the original level. A nonparametric method based on the empirical likelihood test is proposed to detect and estimate the locations of the change points. Under some mild conditions, the asymptotic null distribution of an empirical likelihood ratio test statistic is shown to follow an extreme value distribution. The consistency of the test is also proved. Simulations of the powers of the test indicate that it performs well under different assumptions about the data distribution. The test is applied to the aircraft arrival time data set and the Stanford heart transplant data set. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
24. On the bivariate negative binomial regression model.
- Author
- Famoye, Felix
- Subjects
- REGRESSION analysis, MATHEMATICAL models, BINOMIAL distribution, STATISTICAL correlation, POISSON processes
- Abstract
In this paper, a new bivariate negative binomial regression (BNBR) model allowing any type of correlation is defined and studied. The marginal means of the bivariate model are functions of the explanatory variables. The parameters of the bivariate regression model are estimated by using the maximum likelihood method. Some test statistics including goodness-of-fit are discussed. Two numerical data sets are used to illustrate the techniques. The BNBR model tends to perform better than the bivariate Poisson regression model, but compares well with the bivariate Poisson log-normal regression model. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
25. An improved randomized response model: estimation of mean.
- Author
- Gjestvang, Christopher R. and Singh, Sarjinder
- Subjects
- ESTIMATION theory, QUANTITATIVE research, MATHEMATICAL models, NUMERICAL analysis, MATHEMATICAL statistics
- Abstract
In this paper, we suggest a new randomized response model useful for collecting information on quantitative sensitive variables such as drug use and income. The resultant estimator has been found to be better than the usual additive randomized response model. An interesting feature of the proposed model is that it is free from the known parameters of the scrambling variable, unlike the additive model due to Himmelfarb and Edgell [S. Himmelfarb and S.E. Edgell, Additive constant model: a randomized response technique for eliminating evasiveness to quantitative response questions, Psychol. Bull. 87 (1980), 525-530]. The relative efficiency of the proposed model with respect to the corresponding competitors has also been studied. At the end, an application of the proposed model is discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
26. A Universal QQ-Plot for Continuous Non-homogeneous Populations.
- Author
- Luceño, Alberto
- Subjects
- DISTRIBUTION (Probability theory), POISSON processes, LINEAR statistical models, MATHEMATICAL statistics, MATHEMATICAL models, PROBABILITY theory
- Abstract
This article presents a universal quantile-quantile (QQ) plot that may be used to assess the fit of a family of absolutely continuous distribution functions in a possibly non-homogeneous population. This plot is more general than probability plotting papers because it may be used for distributions having more than two parameters. It is also more general than standard quantile-quantile plots because it may be used for families of not-necessarily identical distributions. In particular, the universal QQ plot may be used in the context of non-homogeneous Poisson processes, generalized linear models, and other general models. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
27. Modelling seasonally varying data: A case study for Sudden Infant Death Syndrome (SIDS).
- Author
- Mooney, Jennifer A., Jolliffe, Ian T., and Helms, Peter J.
- Subjects
- TIME series analysis, SUDDEN infant death syndrome, MATHEMATICAL models, MATHEMATICAL statistics, PROBABILITY theory
- Abstract
Many time series are measured monthly, either as averages or totals, and such data often exhibit seasonal variability – the values of the series are consistently larger for some months of the year than for others. A typical series of this type is the number of deaths each month attributed to SIDS (Sudden Infant Death Syndrome). Seasonality can be modelled in a number of ways. This paper describes and discusses various methods for modelling seasonality in SIDS data, though much of the discussion is relevant to other seasonally varying data. There are two main approaches, either fitting a circular probability distribution to the data, or using regression-based techniques to model the mean seasonal behaviour. Both are discussed in this paper. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
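The regression-based route to modelling mean seasonal behaviour mentioned in the entry above can be sketched as a first-order harmonic regression on monthly counts. The simulated Poisson series below merely stands in for the SIDS data, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)
months = np.arange(120)                                  # ten years of monthly data
true_mean = 40 + 12 * np.cos(2 * np.pi * months / 12 - 0.5)
counts = rng.poisson(true_mean)

# Harmonic regression: intercept plus first-order sine/cosine seasonal terms
X = np.column_stack([
    np.ones(months.size),
    np.cos(2 * np.pi * months / 12),
    np.sin(2 * np.pi * months / 12),
])
coef, *_ = np.linalg.lstsq(X, counts, rcond=None)
amplitude = np.hypot(coef[1], coef[2])
peak_month = (np.arctan2(coef[2], coef[1]) * 12 / (2 * np.pi)) % 12
print(f"seasonal amplitude: {amplitude:.1f}, peak near month index {peak_month:.1f}")
```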
28. A measure of output gap for Italy through structural time series models.
- Author
- Zizza, Roberta
- Subjects
- TIME series analysis, MATHEMATICAL statistics, MATHEMATICAL models, MATHEMATICAL decomposition
- Abstract
The aim of this paper is to achieve a reliable estimate of the output gap for Italy through the development of several models within the class of the unobserved component time series models. These formulations imply the decomposition of output into a trend component (the ‘potential output’) and a cycle component (the ‘output gap’). Both univariate and multivariate methods will be explored. In the former, only one measure of aggregate activity, such as GDP, is considered; in the latter, unemployment and industrial production are introduced. A comparison with alternative measures of output gap, mainly those published by international organisations, will conclude. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
29. Identifying Nonlinear Relationships in Regression using the ACE Algorithm.
- Author
- Wang, Duolao and Murphy, Michael
- Subjects
- REGRESSION analysis, STATISTICAL correlation, LINEAR statistical models, STATISTICS, ALGORITHMS, MATHEMATICAL models, MATHEMATICAL statistics
- Abstract
This paper introduces an alternating conditional expectation (ACE) algorithm: a non-parametric approach for estimating the transformations that lead to the maximal multiple correlation of a response and a set of independent variables in regression and correlation analysis. These transformations can give the data analyst insight into the relationships between these variables so that this can be best described and non-linear relationships uncovered. Using the Bayesian information criterion (BIC), we show how to find the best closed-form approximations for the optimal ACE transformations. By means of ACE and BIC, the model fit can be considerably improved compared with the conventional linear model as demonstrated in the two simulated and two real datasets in this paper. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
30. The impact of correlated readings on the estimation of the average area under readers' ROC curves.
- Author
- Hall, Matthew A. and Mayo, Matthew S.
- Subjects
- RADIOLOGY, RADIATION, DIAGNOSTIC examinations, RADIOGRAPHY, MEDICAL technology, MATHEMATICAL models, SIMULATION methods & models, STANDARD deviations, PROBABILITY theory, DATA, PHYSICS
- Abstract
Receiver operating characteristic (ROC) analysis has been used in a variety of settings since it was first declassified by the United States government over 60 years ago. One venue in which it has received particular attention is in the field of radiology. In radiology, as in other areas of application, ROC analysis is used to assess the ability of a diagnostic test to distinguish between two opposing states. One useful descriptor in ROC analysis is the area under the ROC curve. At times, it is useful and insightful to average ROC curves in order to create a single curve that summarizes all of the data from multiple readers. In this paper, we investigate the impact of correlated readings on the average area under two readers' ROC curves using several common averaging strategies, and then apply the results to a radiologic study. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
31. Bayesian modelling of spatial compositional data.
- Author
- Tjelmeland, Håkon and Lund, Kjetill Vassmo
- Subjects
- BAYESIAN analysis, MATHEMATICAL models
- Abstract
Compositional data are vectors of proportions, specifying fractions of a whole. Aitchison (1986) defines logistic normal distributions for compositional data by applying a logistic transformation and assuming the transformed data to be multi-normal distributed. In this paper we generalize this idea to spatially varying logistic data and thereby define logistic Gaussian fields. We consider the model in a Bayesian framework and discuss appropriate prior distributions. We consider both complete observations and observations of subcompositions or individual proportions, and discuss the resulting posterior distributions. In general, the posterior cannot be analytically handled, but the Gaussian base of the model allows us to define efficient Markov chain Monte Carlo algorithms. We use the model to analyse a data set of sediments in an Arctic lake. These data have previously been considered, but then without taking the spatial aspect into account. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
32. Economic design of X̄ charts with variable parameters: the Markov chain approach.
- Author
- Costa, Antonio F. B. and Rahim, M. A.
- Subjects
- COST control, MARKOV processes, MATHEMATICAL models
- Abstract
This paper presents an economic design of X̄ control charts with variable sample sizes, variable sampling intervals, and variable control limits. The sample size n, the sampling interval h, and the control limit coefficient k vary between minimum and maximum values, tightening or relaxing the control. The control is relaxed when an X̄ value falls close to the target and is tightened when an X̄ value falls far from the target. A cost model is constructed that involves the cost of false alarms, the cost of finding and eliminating the assignable cause, the cost associated with production in an out-of-control state, and the cost of sampling and testing. The assumption of an exponential distribution to describe the length of time the process remains in control allows the application of the Markov chain approach for developing the cost function. A comprehensive study is performed to examine the economic advantages of varying the X̄ chart parameters. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
33. Lasso-type estimation for covariate-adjusted linear model.
- Author
- Li, Feng and Lu, Yiqiang
- Subjects
- LINEAR statistical models, ASYMPTOTIC distribution, MATHEMATICAL models, ASYMPTOTIC expansions, REGRESSION analysis
- Abstract
The lasso has been popularly used for variable selection in recent years. In this paper, lasso-type penalty functions, including the lasso and the adaptive lasso, are employed for simultaneous variable selection and parameter estimation in the covariate-adjusted linear model, where the predictors and response cannot be observed directly and are distorted by some observable covariate through unknown multiplicative smooth functions. Estimation procedures are proposed and some asymptotic properties are obtained under mild conditions. It is worth noting that, under appropriate conditions, the adaptive lasso estimator correctly selects covariates with nonzero coefficients with probability converging to one, and the estimators of the nonzero coefficients have the same asymptotic distribution that they would have if the zero coefficients were known in advance, i.e. the adaptive lasso estimator has the oracle property in the sense of Fan and Li [6]. Simulation studies are carried out to examine its performance in finite sample situations and the Boston Housing data is analyzed for illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
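The adaptive lasso step referred to in the abstract above can be sketched with the usual reweighting trick: a plain lasso fitted to columns rescaled by initial coefficient estimates. This ignores the paper's covariate-adjustment of distorted predictors and responses, and the penalty level, initial estimator and simulated data are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(6)
n, p = 200, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[[0, 3, 7]] = [2.0, -1.5, 1.0]
y = X @ beta + rng.normal(0, 1, size=n)

# Step 1: initial estimates (OLS here) define the adaptive weights
ols_coef = LinearRegression().fit(X, y).coef_
w = 1.0 / (np.abs(ols_coef) + 1e-8)     # small constant guards against division by zero

# Step 2: the adaptive lasso is a plain lasso on rescaled columns
X_scaled = X / w                         # column j is multiplied by |ols_coef_j|
fit = Lasso(alpha=0.1).fit(X_scaled, y)
adaptive_coef = fit.coef_ / w            # map back to the original scale

print(np.nonzero(adaptive_coef)[0])      # indices of the selected covariates
print(adaptive_coef.round(2))
```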
34. A shared parameter model of longitudinal measurements and survival time with heterogeneous random-effects distribution.
- Author
- Baghfalaki, Taban, Ganjali, Mojtaba, and Verbeke, Geert
- Subjects
- RANDOM effects model, GAUSSIAN distribution, LONGITUDINAL method, MARKOV processes, BAYESIAN analysis, AIDS patients, DIAGNOSIS, MATHEMATICAL models
- Abstract
Typical joint modeling of longitudinal measurements and time-to-event data assumes that the two models share a common set of random effects under a normality assumption. However, sometimes the underlying population from which the sample is extracted is heterogeneous, and detecting homogeneous subsamples of it is an important scientific question. In this paper, a finite mixture of normal distributions for the shared random effects is proposed to account for heterogeneity in the population. For detecting whether the unobserved heterogeneity exists or not, we use a simple graphical exploratory diagnostic tool proposed by Verbeke and Molenberghs [34] to assess whether the traditional normality assumption for the random effects in the mixed model is adequate. In the joint modeling setting, in the case of evidence against normality (homogeneity), a finite mixture of normals is used for the shared random-effects distribution. A Bayesian MCMC procedure is developed for parameter estimation and inference. The methodology is illustrated using some simulation studies. Also, the proposed approach is used to analyze a real HIV data set; using the heterogeneous joint model for this data set, the individuals are classified into two groups: a group with high risk and a group with moderate risk. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
35. A general location model with zero-inflated counts and skew normal outcomes.
- Author
- Mirkamali, Sayed Jamal and Ganjali, Mojtaba
- Subjects
- DISCRETE systems, ZERO-inflated probability distribution, BINOMIAL distribution, POISSON regression, SKEWNESS (Probability theory), MATHEMATICAL models
- Abstract
This paper proposes an extension of the general location model using a joint model for analyzing inflated counting outcomes and skew continuous outcomes. A zero-inflated binomial with batches of binomials or a zero-inflated Poisson with batches of Poissons is proposed for the counting outcome, and a skew normal distribution is assumed for the continuous outcome. An EM algorithm is developed for estimation of the parameters. The accuracy of the estimates is evaluated using a simulation study. An application of our models to the joint analysis of the number of cigarettes smoked per day and the weights of respondents for the Americans' Changing Lives study is enclosed. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
36. Comparison of asymmetric stochastic volatility models under different correlation structures.
- Author
- Men, Zhongxian, McLeish, Don, Kolkiewicz, Adam W., and Wirjanto, Tony S.
- Subjects
- STOCHASTIC models, MARKET volatility, LOGARITHMS, MATHEMATICAL transformations, SIMULATION methods & models, MATHEMATICAL models
- Abstract
This paper conducts simulation-based comparison of several stochastic volatility models with leverage effects. Two new variants of asymmetric stochastic volatility models, which are subject to a logarithmic transformation on the squared asset returns, are proposed. The leverage effect is introduced into the model through correlation either between the innovations of the observation equation and the latent process, or between the logarithm of squared asset returns and the latent process. Suitable Markov Chain Monte Carlo algorithms are developed for parameter estimation and model comparison. Simulation results show that our proposed formulation of the leverage effect and the accompanying inference methods give rise to reasonable parameter estimates. Applications to two data sets uncover a negative correlation (which can be interpreted as a leverage effect) between the observed returns and volatilities, and a negative correlation between the logarithm of squared returns and volatilities. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
37. The weighted general linear model for longitudinal medical cost data – an application in colorectal cancer.
- Author
- Hwang, Y. T., Huang, C. H., Yeh, W. L., and Shen, Y. D.
- Subjects
- LINEAR statistical models, COST effectiveness, MATHEMATICAL models, MEDICAL care costs, LONGITUDINAL method, PROPORTIONAL hazards models, PROBABILITY theory
- Abstract
Identifying cost-effective decisions that can take into account medical cost and health outcome is an important issue under very limited resources. Analyzing medical costs has been challenging owing to the skewness of cost distributions, heterogeneity across samples and censoring. When censoring is due to administrative reasons, the total cost might be related to the survival time, since longer survivors are likely to be censored and the corresponding total cost will be censored as well. This paper uses the general linear model for longitudinal data to model repeated medical cost data, and the weighted estimating equation is used to find more accurate estimates of the parameters. Furthermore, the asymptotic properties of the proposed model are discussed. Simulations are used to evaluate the performance of the estimators under various scenarios. Finally, the proposed model is applied to data extracted from the National Health Insurance database for patients with colorectal cancer. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
38. Statistical process control and model monitoring.
- Author
- Harrison, P. J. and Lai, I. C. H.
- Subjects
- STATISTICAL process control, MATHEMATICAL models
- Abstract
This paper is concerned with model monitoring and quality control schemes, which are founded on a decision theoretic formulation. After identifying unacceptable weaknesses associated with Wald, sequential probability ratio test (SPRT) and Cuscore monitors, the Bayes decision monitor is developed. In particular, the paper focuses on what is termed a 'popular decision scheme' (PDS) for which the monitoring run loss functions are specified simply in terms of two indifference qualities. For most applications, the PDS results in forward cumulative sum tests of functions of the observations. For many exponential family applications, the PDS is equivalent to well-used SPRTs and Cusums. In particular, a neat interpretation of V-mask cusum chart settings is derived when simultaneously running two symmetric PDSs. However, apart from providing a decision theoretic basis for monitoring, sensible procedures occur in applications for which SPRTs and Cuscores are particularly unsatisfactory. Average run lengths (ARLs) are given for two special cases, and the inadequacy of the Wald and similar ARL approximations is revealed. Generalizations and applications to normal and dynamic linear models are discussed. The paper concludes by deriving conditions under which sequences of forward and backward sequential or Cusum chart tests are equivalent. [ABSTRACT FROM AUTHOR]
- Published
- 1999
- Full Text
- View/download PDF
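The forward cumulative sum tests that the PDS reduces to, as described above, can be illustrated with a generic one-sided CUSUM; the reference value k and decision threshold h below are conventional textbook choices, not values from the paper.

```python
import numpy as np

def cusum_upper(x, target, k=0.5, h=5.0):
    """One-sided upper CUSUM: signal when the cumulative sum exceeds h."""
    s = 0.0
    for t, value in enumerate(x):
        s = max(0.0, s + (value - target - k))
        if s > h:
            return t                     # index of the first out-of-control signal
    return None

rng = np.random.default_rng(7)
in_control = rng.normal(0.0, 1.0, size=100)
shifted = rng.normal(1.0, 1.0, size=100)   # mean shift of one standard deviation
x = np.concatenate([in_control, shifted])
print(cusum_upper(x, target=0.0))          # typically signals shortly after t = 100
```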
39. Modelling the location decisions of manufacturing firms with a spatial point process approach.
- Author
- Bocci, Chiara and Rocco, Emilia
- Subjects
- ECONOMETRICS, SPATIAL analysis (Statistics), ECOLOGICAL heterogeneity, MULTIVARIATE analysis, MANUFACTURING industries, MATHEMATICAL models, STORE location
- Abstract
The paper explores how the increasing availability of spatial micro-data, jointly with the diffusion of GIS software, makes it possible to exploit micro-econometric methods based on stochastic spatial point processes in order to understand the factors that may influence the location decisions of new firms. By using the knowledge of the geographical coordinates of the newborn firms, their spatial distribution is treated as a realization of an inhomogeneous marked point process in continuous space, and the effect of spatially varying factors on the location decisions is evaluated by parametrically modelling the intensity of the process. The study is motivated by the real issue of analysing the birth process of small and medium manufacturing firms in Tuscany, an Italian region, and it shows that the location choices of the new Tuscan firms are influenced on the one hand by the availability of infrastructures and the level of accessibility, and on the other by the presence and the characteristics of the existing firms. Moreover, the effect of these factors varies with the size and the level of technology of the new firms. Besides the specific Tuscan results, the study shows the potential of the described micro-econometric approach for the analysis of the spatial dynamics of firms. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
40. Partitioning gene expression data by data-driven Markov chain Monte Carlo.
- Author
- Saraiva, E.F., Suzuki, A.K., Louzada, F., and Milan, L.A.
- Subjects
- GENE expression, MARKOV chain Monte Carlo, BAYESIAN analysis, GIBBS phenomenon, ALGORITHM research, MATHEMATICAL models
- Abstract
In this paper we introduce a Bayesian mixture model with an unknown number of components for partitioning gene expression data. Inferences about all the unknown parameters involved are made by using the proposed data-driven Markov chain Monte Carlo algorithm. This algorithm is essentially a Metropolis–Hastings within Gibbs sampling scheme. The Metropolis–Hastings step is performed to change the number of partitions k in the neighborhood using a pair of split-merge moves. Our strategy for splitting is based on the data, in which allocation probabilities are calculated from the marginal likelihood function of the previously allocated observations. Conditional on k, the partition labels are updated via Gibbs sampling. The two main advantages of the proposed algorithm are that it is easy to implement and that the acceptance probability for split-merge movements depends only on the observed data. We examine the performance of the proposed algorithm on simulated data and then analyze two publicly available gene expression data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
41. Unified multivariate survival model with a surviving fraction: an application to a Brazilian customer churn data.
- Author
- Cancho, Vicente G., Dey, Dipak K., and Louzada, Francisco
- Subjects
- MULTIVARIATE analysis, UNIVARIATE analysis, ANALYSIS of variance, MAXIMUM likelihood statistics, ESTIMATION theory, MATHEMATICAL models
- Abstract
In this paper we propose a new lifetime model for multivariate survival data in the presence of surviving fractions and examine some of its properties. Its genesis is based on situations in which there are m types of unobservable competing causes, where each cause is related to a time of occurrence of an event of interest. Our model is a multivariate extension of the univariate survival cure rate model proposed by Rodrigues et al. [37]. The inferential approach exploits maximum likelihood tools. We perform a simulation study in order to verify the asymptotic properties of the maximum likelihood estimators. The simulation study also focuses on the size and power of the likelihood ratio test. The methodology is illustrated on a real customer churn data set. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
42. Factor screening in nonregular two-level designs based on projection-based variable selection.
- Author
- Tyssedal, John and Hussain, Shahrukh
- Subjects
- COMPUTER software, NOISE, STATISTICAL reliability, DISTRIBUTION (Probability theory), MEASUREMENT errors, MATHEMATICAL models
- Abstract
In this paper, we focus on the problem of factor screening in nonregular two-level designs through gradually reducing the number of possible sets of active factors. We are particularly concerned with situations when three or four factors are active. Our proposed method works through examining fits of projection models, where variable selection techniques are used to reduce the number of terms. To examine the reliability of the methods in combination with such techniques, a panel of models consisting of three or four active factors, with data generated from the 12-run and the 20-run Plackett–Burman (PB) designs, is used. The dependence of the procedure on the amount of noise, the number of active factors and the number of experimental factors is also investigated. For designs with few runs, such as the 12-run PB design, variable selection should be done with care, and default procedures in computer software may not be reliable, for which we suggest improvements. A real example is included to show how we propose factor screening can be done in practice. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
43. Investigating the determinants of job satisfaction of Italian graduates: a model-based approach.
- Author
- Capecchi, Stefania and Piccolo, Domenico
- Subjects
- MATHEMATICAL models, JOB satisfaction, STATISTICAL models, GRADUATES, CONCEPTUAL models, ESTIMATION theory, EMPLOYMENT
- Abstract
The paper explores the relationship between personal, economic and time-dependent covariates as determinants of the job satisfaction expressed by graduate workers. After discussing the main results of the literature, the work emphasizes a statistical modelling approach able to effectively estimate and visualize those determinants and their interactions with subjects' covariates. Interpretation and visualization of graduates' profiles are shown on the basis of a survey conducted in Italy; more specifically, the determinants of both satisfaction and uncertainty of the respondents are explicitly discussed. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
44. Bayesian spatial prediction of skew and censored data via a hybrid algorithm.
- Author
- Rivaz, Firoozeh and Khaledi, Majid Jafari
- Subjects
- SKEWNESS (Probability theory), BAYESIAN analysis, GIBBS sampling, RANDOM fields, HYBRID computer simulation, POLLUTANTS, MATHEMATICAL models
- Abstract
Correct detection of areas with excess pollution relies first on accurate predictions of pollutant concentrations, a task that is usually complicated by skewed histograms and the presence of censored data. The unified skew-Gaussian (SUG) random field proposed by Zareifard and Jafari Khaledi [19] offers a more flexible class of sampling spatial models to account for skewness. In this paper, we adopt a Bayesian framework to perform prediction for the SUG model in the presence of censored data. Owing to the presence of many latent variables with strongly dependent components in the model, we encounter convergence issues when using Markov chain Monte Carlo algorithms. To overcome this obstacle, we use a computationally efficient inverse Bayes formulas sampling procedure to obtain approximately independent samples from the posterior distribution of the latent variables. These are then used to update parameters in a Gibbs sampler scheme. This hybrid algorithm provides effective samples, resulting in some computational advantages and precise predictions. The proposed approach is illustrated with a simulation study and applied to a spatial data set which contains right-censored data. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
45. Dihedral angles principal geodesic analysis using nonlinear statistics.
- Author
- Nodehi, A., Golalizadeh, M., and Heydari, A.
- Subjects
- PROTEIN structure, MOLECULAR dynamics, PRINCIPAL components analysis, DIHEDRAL angles, GEODESIC equation, NONLINEAR analysis, MATHEMATICAL models
- Abstract
Statistics, as one of the applied sciences, has a great impact on a vast range of other sciences. Prediction of protein structures, with great emphasis on their geometrical features via dihedral angles, has motivated the branch of statistics known as directional statistics. One of the available biological techniques for such prediction is molecular dynamics simulation, which produces high-dimensional molecular structure data. Hence, it is expected that principal component analysis (PCA) can address some of the related statistical problems, particularly reducing the dimension of the involved variables. Since the dihedral angles are variables on a non-Euclidean space (their locus is the torus), direct implementation of PCA is not expected to provide much information in this case. Principal geodesic analysis is one of the recent methods to reduce dimensions in the non-Euclidean case. A procedure to utilize this technique for reducing the dimension of a set of dihedral angles is highlighted in this paper. We further propose an extension of this tool, implemented in such a way that the torus is approximated by the product of two unit circles, and evaluate its application in studying a real data set. A comparison of this technique with some previous methods is also undertaken. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
46. Blind source separation filters-based-fault detection and isolation in a three tank system.
- Author
- Kouadri, Abdelmalek, Baiche, Karim, and Zelmat, Mimoun
- Subjects
- BLIND source separation, BLIND channel identification (Telecommunications), SIGNAL separation, MATHEMATICAL statistics, MATHEMATICAL models, INDEPENDENT component analysis, MULTIVARIATE analysis
- Abstract
Fault detection and isolation takes a strategic position in modern industrial processes, for which various approaches have been proposed. These approaches are usually developed and based on a consistency test between an observed state of the process provided by sensors and an expected behaviour provided by a mathematical model of the system. These methods require a reliable model of the system to be monitored, which is a complex task. Alternatively, we propose in this paper to use blind source separation filters (BSSFs) in order to detect and isolate faults in a three-tank pilot plant. This technique is very beneficial as it uses blind identification without an explicit mathematical model of the system. The independent component analysis (ICA), relying on the assumption of the statistical independence of the extracted sources, is used as a tool for each BSSF to extract signals of the process under consideration. The experimental results show the effectiveness and robustness of this approach in detecting and isolating faults on sensors in the system. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
47. Individual-level modeling of the spread of influenza within households.
- Author
- Malik, Rajat, Deardon, Rob, Kwong, Grace P.S., and Cowling, Benjamin J.
- Subjects
- INFLUENZA, COMMUNICABLE diseases, MATHEMATICAL models, MEDICAL statistics
- Abstract
A class of individual-level models (ILMs) outlined by R. Deardon et al. [Inference for individual level models of infectious diseases in large populations, Statist. Sin. 20 (2010), pp. 239–261] can be used to model the spread of infectious diseases in discrete time. The key feature of these ILMs is that they take into account covariate information on susceptible and infectious individuals as well as shared covariate information such as geography or contact measures. Here, such ILMs are fitted in a Bayesian framework using Markov chain Monte Carlo techniques to data sets from two studies on influenza transmission within households in Hong Kong during 2008 to 2009 and 2009 to 2010. The focus of this paper is to estimate the effect of vaccination on infection risk and choose a model that best fits the infection data. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
48. Image warping using radial basis functions.
- Author
- Chen, Ting-Li and Geman, Stuart
- Subjects
- IMAGE, INTERPOLATION, NUMERICAL analysis, MATHEMATICAL models, MATHEMATICAL analysis
- Abstract
Image warping is the process of deforming an image through a transformation of its domain, which is typically a subset of R2. Given the destination of a collection of points, the problem becomes one of finding a suitable smooth interpolation for the destinations of the remaining points of the domain. A common solution is to use the thin plate spline (TPS). We find that the TPS often introduces unintended distortions of image structures. In this paper, we will analyze interpolation by TPS, experiment with other radial basis functions, and suggest two alternative functions that provide better results. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
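Landmark-driven warping by radial basis function interpolation, the general setting of the entry above, can be sketched as follows. The Gaussian kernel, its bandwidth and the landmark coordinates are assumptions for illustration; they are neither the thin plate spline nor the two alternative functions proposed by the authors.

```python
import numpy as np

def rbf_warp(src, dst, kernel=lambda r: np.exp(-(r / 50.0) ** 2)):
    """Build a 2-D warp that maps the source landmarks exactly onto the
    destination landmarks by RBF interpolation of the displacements."""
    d = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=-1)
    weights = np.linalg.solve(kernel(d), dst - src)   # one weight vector per coordinate

    def warp(points):
        r = np.linalg.norm(points[:, None, :] - src[None, :, :], axis=-1)
        return points + kernel(r) @ weights
    return warp

# Hypothetical landmark correspondences (pixel coordinates)
src = np.array([[10.0, 10.0], [90.0, 10.0], [50.0, 80.0], [50.0, 40.0]])
dst = np.array([[12.0, 14.0], [88.0, 8.0], [55.0, 85.0], [50.0, 45.0]])

warp = rbf_warp(src, dst)
print(warp(src).round(1))                         # reproduces dst at the landmarks
print(warp(np.array([[50.0, 50.0]])).round(1))    # interpolated displacement elsewhere
```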
49. Modelling oil and gas supply disruption risks using extreme-value theory and copula.
- Author
- Gülpınar, Nalan and Katata, Kabir
- Subjects
- EXTREME value theory, COPULA functions, MATHEMATICAL models, VOLUMETRIC analysis, MULTIVARIATE analysis
- Abstract
In this paper, we are concerned with modelling oil and gas supply disruption risks using extreme-value theory and copula. We analyse financial and volumetric losses due to both oil and gas supply disruptions and investigate their dependence structure using real data. In order to illustrate the impact of crude oil and natural gas supply disruptions on an energy-dependent economy, Nigeria is considered as a case study. Computational studies illustrate that the generalized extreme-value distribution anticipates higher financial losses and extreme-value copulas produce the best fit for financial and volumetric losses compared with normal copulas. Moreover, multivariate financial losses exhibit stronger positive dependence than volumetric losses. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
50. Forecasting the industrial production using alternative factor models and business survey data.
- Author
- Costantini, Mauro
- Subjects
- INDUSTRIAL surveys, INDUSTRIAL production index, ECONOMIC indicators, BUSINESS forecasting, MATHEMATICAL statistics, NUMERICAL solutions to functional equations, MATHEMATICAL models
- Abstract
This paper compares the forecasting performance of three alternative factor models based on business survey data for the industrial production in Italy. The first model uses static principal component analysis, while the other two apply dynamic principal component analysis in frequency domain and subspace algorithms for state-space representation, respectively. Once the factors are extracted from the business survey data, then they are included into a single equation to predict the industrial production index. The forecast results show that the three factor models have a better performance than that of a simple autoregressive benchmark model regardless of the specification and estimation methods. Furthermore, the state-space model yields superior forecasts amongst the factor models. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF