73 results
Search Results
2. Designing of Fractional Order Bessel Filter using Optimization Techniques.
- Author
-
Soni, Ashu and Gupta, Maneesha
- Subjects
LEAST squares ,SIMULATION methods & models ,MONTE Carlo method ,MATHEMATICAL models - Abstract
This paper proposes the design of an approximated fractional-order low-pass Bessel filter using various optimisation techniques. Simulated annealing (SA), nonlinear least squares (NLS), and firefly (FF) optimisation methods are applied to the design of (1 + α) order low-pass Bessel filters. Stability analysis has been attempted for the proposed fractional-order Bessel filter in the W-plane for α ranging from 0.1 to 0.9 in steps of 0.1. The best optimisation technique in terms of error in gain, cut-off frequency, and stopband attenuation is validated on a biquad realisation. Further, a KHN biquad has been simulated in SPICE to validate 1.2, 1.5, and 1.8 order Bessel filters. MATLAB and SPICE simulation results for gain, cut-off frequency, and stopband attenuation have been compared for the proposed Bessel filter; the MATLAB-simulated magnitude response agrees closely with the SPICE-simulated response. Monte Carlo analysis has also been performed in the SPICE environment. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
3. Estimation of travel time reliability in large-scale networks.
- Author
-
Babaei, Mohsen, Rajabi-Bahaabadi, Mojtaba, and Shariat-Mohaymany, Afshin
- Subjects
TRANSPORTATION ,TRAVEL time (Traffic engineering) ,RELIABILITY in engineering ,MATHEMATICAL models ,MONTE Carlo method - Abstract
Over the last two decades, travel time reliability has been increasingly used as a key performance indicator of transportation networks. Central to the assessment of travel time reliability is the estimation of the probability mass function (PMF) of total network travel time. This paper aims to present an efficient method to estimate the PMF of total network travel time using the universal generating function (UGF) method. Moreover, the paper proposes two truncation techniques to increase the computational efficiency of the UGF method. In order to assess the applicability of the method in practice, the method is tested on different networks. The results suggest that the method is computationally much more efficient than the standard crude Monte Carlo simulation technique at different confidence levels, and that it can be applied to real-world cases with a reasonable computation time. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
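As a rough illustration of the UGF idea in the abstract above: for independent links in series, the u-function of the total travel time is obtained by discrete convolution of the link PMFs, and truncation keeps the representation small. This is a minimal Python sketch under assumed toy link PMFs (`link_a` and `link_b` are hypothetical), not the paper's algorithm:

```python
def ugf_compose(*pmfs):
    """u-function composition for links in series: the PMF of the total
    travel time is the discrete convolution of the link travel-time PMFs."""
    total = {0: 1.0}
    for pmf in pmfs:
        nxt = {}
        for t1, p1 in total.items():
            for t2, p2 in pmf.items():
                nxt[t1 + t2] = nxt.get(t1 + t2, 0.0) + p1 * p2
        total = nxt
    return total

def truncate(pmf, eps=1e-6):
    """Drop negligible-probability terms and renormalise -- a crude
    truncation in the spirit of the efficiency techniques the paper adds."""
    kept = {t: p for t, p in pmf.items() if p >= eps}
    z = sum(kept.values())
    return {t: p / z for t, p in kept.items()}

# Hypothetical two-link corridor: travel time in minutes -> probability.
link_a = {2: 0.7, 5: 0.3}
link_b = {1: 0.5, 4: 0.5}
total = truncate(ugf_compose(link_a, link_b))
```

For real networks the support of `total` grows quickly, which is exactly why the truncation step matters for efficiency.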
4. Adaptive refined descriptive sampling algorithm for dependent variables using Iman and Conover method in Monte Carlo simulation.
- Author
-
Kebaili, Siham, Ourbih-Tari, Megdouda, Aloui, Abdelouhab, and Guebli, Sofia
- Subjects
MONTE Carlo method ,ALGORITHMS ,DEPENDENT variables ,RANDOM variables ,EXPECTATION-maximization algorithms ,STATISTICAL sampling ,INDEPENDENT variables ,ADAPTIVE sampling (Statistics) ,MATHEMATICAL models - Abstract
This paper deals with Monte Carlo simulation in the case of dependent input random variables. We propose an algorithm that generates refined descriptive samples from dependent random variables, for estimating expectations of functions of the output variables, using the Iman and Conover algorithm to transform the dependent variables into independent ones. Estimates obtained through a chosen mathematical model are compared with those obtained using the simple random sampling method, showing that the former are more efficient. Moreover, building on already published work on independent input variables, we deduce that in the case of dependent input random variables the variance of the refined descriptive sampling (RDS) estimator is asymptotically less than that of the simple random sampling (SRS) estimator for any simulation function with finite second moment. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
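The Iman and Conover transform mentioned above can be sketched as follows: independent samples are permuted column by column so that their rank structure follows a target correlation matrix, while the marginals stay exactly as drawn. A minimal NumPy sketch (the reference scores here are random normal draws rather than the van der Waerden scores often used):

```python
import numpy as np

def iman_conover(samples, target_corr, seed=0):
    """Reorder each column of independent samples so their joint rank
    structure approximates target_corr (Iman & Conover style). Marginals
    are preserved exactly because values are only permuted."""
    n, k = samples.shape
    rng = np.random.default_rng(seed)
    m = rng.standard_normal((n, k))                     # reference scores
    m = (m - m.mean(axis=0)) / m.std(axis=0)
    # Remove the scores' incidental sample correlation...
    e = np.linalg.cholesky(np.corrcoef(m, rowvar=False))
    m = m @ np.linalg.inv(e).T
    # ...then impose the target correlation.
    t = m @ np.linalg.cholesky(target_corr).T
    out = np.empty_like(samples)
    for j in range(k):
        ranks = t[:, j].argsort().argsort()             # rank of each row
        out[:, j] = np.sort(samples[:, j])[ranks]
    return out
```

Because only the ordering changes, the method works for any marginal distributions, which is what makes it attractive for pairing with descriptive or stratified sampling plans.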
5. Critical values improvement for the standard normal homogeneity test by combining Monte Carlo and regression approaches.
- Author
-
Rienzner, Michele and Ieva, Francesca
- Subjects
MONTE Carlo method ,HOMOGENEITY ,REGRESSION analysis ,MATHEMATICAL models ,NUMERICAL analysis ,PROBABILITY theory - Abstract
The distribution of the test statistics of homogeneity tests is often unknown, requiring the estimation of the critical values through Monte Carlo (MC) simulations. The computation of the critical values at low α, especially when the distribution of the statistics changes with the series length (sample cardinality), requires a considerable number of simulations to achieve a reasonable precision of the estimates (i.e. 10^6 simulations or more for each series length). If, in addition, the test requires a noteworthy computational effort, the estimation of the critical values may need unacceptably long runtimes. To overcome the problem, the paper proposes a regression-based refinement of an initial MC estimate of the critical values, also allowing an approximation of the achieved improvement. Moreover, the paper presents an application of the method to two tests: SNHT (standard normal homogeneity test, widely used in climatology), and SNH2T (a version of SNHT showing a squared numerical complexity). For both, the paper reports the critical values for α ranging between 0.1 and 0.0001 (useful for the p-value estimation), and the series length ranging from 10 (widely adopted size in climatological change-point detection literature) to 70,000 elements (nearly the length of a daily data time series 200 years long), estimated with coefficients of variation within 0.22%. For SNHT, a comparison of our results with approximated, theoretically derived, critical values is also performed; we suggest adopting those values for series exceeding 70,000 elements. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
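The crude MC step that the paper refines can be sketched for the SNHT: simulate the test statistic under the null many times and take an empirical quantile as the critical value. Below is a hedged Python sketch using a common formulation of the SNHT statistic; the regression-based refinement itself is not reproduced:

```python
import numpy as np

def snht_stat(x):
    """SNHT statistic: max over candidate change points k of
    T(k) = k*z1^2 + (n-k)*z2^2, where z1 and z2 are the means of the
    standardized series before and after k (a common formulation)."""
    n = x.size
    z = (x - x.mean()) / x.std(ddof=1)
    t = np.empty(n - 1)
    for k in range(1, n):
        z1 = z[:k].mean()
        z2 = z[k:].mean()
        t[k - 1] = k * z1**2 + (n - k) * z2**2
    return t.max()

def mc_critical_value(n, alpha, reps=2000, seed=0):
    """Crude Monte Carlo estimate of the (1 - alpha) critical value
    under the null of an i.i.d. standard normal series of length n."""
    rng = np.random.default_rng(seed)
    stats = np.array([snht_stat(rng.standard_normal(n))
                      for _ in range(reps)])
    return np.quantile(stats, 1 - alpha)
```

The quantile estimate degrades rapidly at small α (the tail is sampled by only `alpha * reps` replicates), which is the precision problem the paper's regression refinement addresses.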
6. Bayesian inference approach to inverse problems in a financial mathematical model.
- Author
-
Ota, Yasushi, Jiang, Yu, Nakamura, Gen, and Uesaka, Masaaki
- Subjects
MONTE Carlo method ,MATHEMATICAL models ,MARKOV chain Monte Carlo ,PROBABILITY density function ,BLACK-Scholes model ,INVERSE problems - Abstract
This paper investigates an inverse problem of option pricing in the extended Black–Scholes model. We identify the model coefficients from the measured data and attempt to find arbitrage opportunities in financial markets using a Bayesian inference approach. The posterior probability density function of the parameters is computed from the measured data. The statistics of the unknown parameters are estimated by a Markov Chain Monte Carlo (MCMC), which explores the posterior state space. The efficient sampling strategy of an MCMC enables us to solve inverse problems by the Bayesian inference technique. Our numerical results indicate that the Bayesian inference approach can simultaneously estimate the unknown drift and volatility coefficients from the measured data. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
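The MCMC machinery described above can be illustrated with a random-walk Metropolis sampler. This sketch infers a drift and volatility from simulated log-returns under flat priors, purely to show the accept/reject loop; it is not the paper's option-price-based inverse problem:

```python
import numpy as np

def log_post(params, r, dt=1/252):
    """Log-posterior (flat priors) for (mu, sigma) given log-returns r,
    assuming r_i ~ N((mu - sigma^2/2) dt, sigma^2 dt) as in Black-Scholes."""
    mu, sigma = params
    if sigma <= 0:
        return -np.inf
    m = (mu - 0.5 * sigma**2) * dt
    v = sigma**2 * dt
    return -0.5 * np.sum((r - m)**2 / v + np.log(2 * np.pi * v))

def metropolis(r, n_iter=5000, step=0.02, seed=0):
    """Random-walk Metropolis over (mu, sigma); step size is illustrative."""
    rng = np.random.default_rng(seed)
    x = np.array([0.1, 0.2])                  # starting point (assumed)
    lp = log_post(x, r)
    chain = np.empty((n_iter, 2))
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(2)
        lp_prop = log_post(prop, r)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain
```

Posterior means and credible intervals are then read off the chain after discarding a burn-in, which is the sense in which MCMC "explores the posterior state space".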
7. Hedging of contingent claims written on non traded assets under Markov-modulated models.
- Author
-
Wang, Wei, Qian, Linyi, and Wang, Wensheng
- Subjects
MARKOV processes ,STOCK prices ,HEDGING (Finance) ,ASSETS (Accounting) ,INVESTORS ,MONTE Carlo method ,MATHEMATICAL models - Abstract
This paper studies the hedging problem of European contingent claims when the underlying asset is non-traded. We assume that the share prices of the assets are governed by Markov-modulated processes; that is, the market parameters switch over time according to a finite-state continuous-time Markov chain. Due to the presence of the Markov chain and the non-traded asset, the market we consider is incomplete, so we use the local risk minimization method to obtain an optimal hedging strategy in closed form for an investor. Finally, numerical illustrations of an optimal hedging strategy are given by Monte Carlo simulation. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
8. Weighted adaptive multivariate CUSUM charts with variable sampling intervals.
- Author
-
Haq, Abdul
- Subjects
MONTE Carlo method ,MATHEMATICAL models ,STATISTICAL sampling ,PROBABILITY theory ,MULTIVARIATE analysis ,BIVARIATE analysis - Abstract
The adaptive multivariate CUSUM (AMCUSUM) chart has received considerable attention because of its superior sensitivity across a range of mean shift sizes compared with the conventional non-adaptive multivariate CUSUM (MCUSUM) chart. Recently, weighted AMCUSUM (WAMCUSUM) charts with a fixed sampling interval (FSI) have been proposed, called the WAMCUSUM-FSI charts, which provide more sensitivity than the AMCUSUM-FSI charts. In this paper, we extend this work and propose WAMCUSUM charts with variable sampling intervals (VSI), named the WAMCUSUM-VSI charts, for efficiently monitoring the mean of a multivariate normally distributed process. The Monte Carlo simulation method is used to compute the average time to signal (ATS) and the adjusted ATS (AATS) profiles of the existing and proposed charts. It is found that the WAMCUSUM-VSI charts perform substantially and nearly uniformly better than the WAMCUSUM-FSI charts in terms of the ATS and AATS performance criteria. An example is given to explain the implementation of the WAMCUSUM charts with fixed and variable sampling intervals. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
9. A new adaptive EWMA control chart using auxiliary information for monitoring the process mean.
- Author
-
Haq, Abdul
- Subjects
QUALITY control charts ,INFORMATION processing ,MONTE Carlo method ,STATISTICAL process control ,MATHEMATICAL models - Abstract
The adaptive memory-type control charts, including the adaptive exponentially weighted moving average (EWMA) and cumulative sum (CUSUM) charts, have gained considerable attention because of their excellent speed in providing overall good detection over a range of mean shift sizes. In this paper, we propose a new adaptive EWMA (AEWMA) chart using the auxiliary information for efficiently monitoring the infrequent changes in the process mean. The idea is to first estimate the unknown process mean shift using an auxiliary information based mean estimator, and then adaptively update the smoothing constant of the EWMA chart. Using extensive Monte Carlo simulations, the run length profiles of the AEWMA chart are computed and explored. The AEWMA chart is compared with the existing control charts, including the classical EWMA, CUSUM, synthetic EWMA and synthetic CUSUM charts, in terms of the run length characteristics. It turns out that the AEWMA chart performs uniformly better than these control charts when detecting a range of mean shift sizes. An illustrative example is also presented to demonstrate the working and implementation of the proposed and existing control charts. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
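The adaptive idea in the abstract above, first estimate the current shift, then adapt the smoothing constant, can be sketched as follows. The shift-to-lambda mapping below is an illustrative choice, not the paper's exact function, and no auxiliary variable is used here:

```python
import numpy as np

def aewma(x, lam_min=0.05, lam_max=0.5, gamma=0.1):
    """Sketch of an adaptive EWMA statistic: an auxiliary EWMA `delta`
    tracks the apparent mean shift, and the smoothing constant is
    interpolated between lam_min (sensitive to small shifts) and
    lam_max (fast reaction to large shifts)."""
    z, delta = 0.0, 0.0
    out = []
    for xi in x:
        delta = (1 - gamma) * delta + gamma * xi   # shift estimator
        w = 1 - np.exp(-abs(delta))                # maps |shift| into [0, 1)
        lam = lam_min + (lam_max - lam_min) * w
        z = (1 - lam) * z + lam * xi               # adaptive EWMA update
        out.append(z)
    return np.array(out)
```

In a full chart the statistic `z` would be compared against control limits calibrated (e.g. by Monte Carlo) to a target in-control run length.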
10. Bias correction for time series factor models.
- Author
-
Alonso, Andrés M., Bastos, Guadalupe, and García-Martos, Carolina
- Subjects
BIAS correction (Topology) ,BASES (Linear topological spaces) ,TIME series analysis ,MATHEMATICAL models ,MONTE Carlo method - Abstract
In this paper we work with multivariate time series that follow a Factor Model. In particular, we consider the setting where factors are dominated by highly persistent AutoRegressive (AR) processes and samples that are rather small. Therefore, the factors' AR models are estimated using small sample bias correction techniques. A Monte Carlo study reveals that bias-correcting the AR coefficients of the factors allows to obtain better results in terms of prediction interval coverage. As expected, the simulation shows that bias-correction is more successful for smaller samples. We present the results assuming the AR order and number of factors are known as well as unknown. We also study the advantages of this technique for a set of Industrial Production Indexes of several European countries. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
11. Monitoring means and covariances of multivariate non linear time series with heavy tails.
- Author
-
Garthoff, Robert and Schmid, Wolfgang
- Subjects
STATISTICAL process control ,MULTIVARIATE analysis ,COVARIANCE matrices ,MONTE Carlo method ,STATISTICAL correlation ,MATHEMATICAL models - Abstract
In this paper, the focus is on sequential analysis of multivariate financial time series with heavy tails. The mean vector and the covariance matrix of multivariate nonlinear models are simultaneously monitored by modifying conventional control charts to identify structural changes in the data. The considered target process is a constant conditional correlation model (cf. Bollerslev, 1990), an extended constant conditional correlation model (cf. He and Teräsvirta, 2004), a dynamic conditional correlation model (cf. Engle, 2002), or a generalized dynamic conditional correlation model (cf. Capiello et al., 2006). For statistical surveillance we use control charts based on residuals. Further, the procedures are constructed for the t-distribution. The detection speed of these charts is compared via Monte Carlo simulation. In the empirical study, the procedure with the best performance is applied to log-returns of the stock market indices FTSE and CAC. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
12. Using Inverse Probability Weighting Estimators to Evaluate Various Propensity Scores When Treatment Switching Exists.
- Author
-
Tu, Chunhao and Koh, Woon Yuen
- Subjects
STATISTICAL weighting ,MONTE Carlo method ,MEAN square algorithms ,STATISTICAL sampling ,MATHEMATICAL models - Abstract
In this paper, we conduct a Monte Carlo simulation study to evaluate three propensity score (PS) scenarios for estimating an average treatment effect (ATE) in observational studies when treatment switching exists: (a) ignoring treatment switching in subjects (UPS), (b) removing subjects with treatment switching (RPS), and (c) adjusting for treatment switching effect (APS) with two inverse probability weighting estimators, IPW1 and IPW2. We evaluate these six estimators in terms of bias, mean squared error (MSE), empirical standard error (ESE), and coverage probability (CP) under various simulation scenarios. Simulation results show that the IPW2 estimator with RPS has relatively good performance. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
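The two inverse probability weighting estimators contrasted above are commonly written as the Horvitz–Thompson form (IPW1) and the normalized, or Hájek, form (IPW2). A minimal sketch on simulated confounded data (the data-generating process is hypothetical and not the paper's switching scenarios):

```python
import numpy as np

def ipw_estimators(y, t, ps):
    """Two IPW estimators of the average treatment effect:
    IPW1 is the Horvitz-Thompson form; IPW2 normalizes the weights,
    which is often more stable when some propensity scores are extreme."""
    w1, w0 = t / ps, (1 - t) / (1 - ps)
    ipw1 = np.mean(w1 * y) - np.mean(w0 * y)
    ipw2 = np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0)
    return ipw1, ipw2
```

With a confounder that drives both treatment and outcome, the naive treated-minus-control mean is biased, while both IPW forms recover the true effect when the propensity model is correct.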
13. The Performance of Lag Selection and Detrending Methods for HEGY Seasonal Unit Root Tests.
- Author
-
del Barrio Castro, Tomás, Osborn, Denise R., and Taylor, A.M. Robert
- Subjects
LEAST squares ,MONTE Carlo method ,APPROXIMATION theory ,PERFORMANCE standards ,MATHEMATICAL models - Abstract
This paper analyzes two key issues for the empirical implementation of parametric seasonal unit root tests, namely generalized least squares (GLS) versus ordinary least squares (OLS) detrending and the selection of the lag augmentation polynomial. Through an extensive Monte Carlo analysis, the performance of a battery of lag selection techniques is analyzed, including a new extension of modified information criteria for the seasonal unit root context. All procedures are applied for both OLS and GLS detrending for a range of data generating processes, also including an examination of hybrid OLS-GLS detrending in conjunction with (seasonal) modified AIC lag selection. An application to quarterly U.S. industrial production indices illustrates the practical implications of choices made. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
14. Fitted finite volume method for pricing CO2 futures option based on the underlying with non-log-normal distribution.
- Author
-
Chang, Shuhua and Li, Jinghuan
- Subjects
FINITE volume method ,DYNAMICAL systems ,MATHEMATICAL models ,MONTE Carlo method ,STOCHASTIC convergence - Abstract
By analysing spot price characteristics at compliance time from an equilibrium perspective, CO2 price dynamical models are obtained for both bankable and non-bankable regulations, with the prices of spots and futures being non-log-normal, and the futures option pricing model is established. Then, a fitted finite volume method (FVM) is proposed to solve the pricing model, and its monotonicity, stability, and convergence are studied. Furthermore, the convergence rates of the fitted FVM and the finite difference method are compared. Moreover, Δ-hedge experiments are performed, in which the hedge cost and the hedge error are estimated. Finally, using Monte Carlo pricing results as a benchmark, we compare the computational results from the fitted FVM and the Black formula approximation, which shows that the former fits the Monte Carlo benchmark better than the latter, especially for futures options with larger volatility or maturity near the compliance time. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
15. A control chart using an auxiliary variable and repetitive sampling for monitoring process mean.
- Author
-
Lee, Hyeseon, Aslam, Muhammad, Shakeel, Qurat-ul-ain, Lee, Wonji, and Jun, Chi-Hyuck
- Subjects
STATISTICAL sampling ,MONTE Carlo method ,QUALITY control charts ,MATHEMATICAL variables ,MATHEMATICAL models - Abstract
In this paper, a new control chart is proposed by using an auxiliary variable and repetitive sampling in order to enhance the performance of detecting a shift in process mean. The product-difference type estimator of the mean is plotted on the proposed control chart, which utilizes the information of an auxiliary variable correlated with the main quality variable. The proposed control chart is based on the outer and inner control limits so that repetitive sampling is allowed when the plotted statistic falls between the two limits. The average run length (ARL) of the proposed control chart is evaluated using the Monte Carlo simulation. The proposed control chart is compared with the Riaz M control chart and the results show the outperformance of the proposed control chart in terms of the ARL. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
16. Application of shrinkage estimation in linear regression models with autoregressive errors.
- Author
-
Thomson, T., Hossain, S., and Ghahramani, M.
- Subjects
REGRESSION analysis ,AUTOREGRESSION (Statistics) ,MONTE Carlo method ,ESTIMATION theory ,ASYMPTOTIC distribution ,MATHEMATICAL models - Abstract
In this paper, we consider the shrinkage and penalty estimation procedures in the linear regression model with autoregressive errors of order p when it is conjectured that some of the regression parameters are inactive. We develop the statistical properties of the shrinkage estimation method including asymptotic distributional biases and risks. We show that the shrinkage estimators have a significantly higher relative efficiency than the classical estimator. Furthermore, we consider the two penalty estimators: least absolute shrinkage and selection operator (LASSO) and adaptive LASSO estimators, and numerically compare their relative performance with that of the shrinkage estimators. A Monte Carlo simulation experiment is conducted for different combinations of inactive predictors and the performance of each estimator is evaluated in terms of the simulated mean-squared error. This study shows that the shrinkage estimators are comparable to the penalty estimators when the number of inactive predictors in the model is relatively large. The shrinkage and penalty methods are applied to a real data set to illustrate the usefulness of the procedures in practice. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
17. Application of a Monte Carlo model to predict space heating energy use of Belgrade's housing stock.
- Author
-
Kavgic, Miroslava, Summerfield, Alex, Mumovic, Dejan, and Stevanovic, Zarko
- Subjects
SPACE heaters ,ENERGY consumption ,MATHEMATICAL models ,MONTE Carlo method ,MATHEMATICAL models of uncertainty ,HOUSING - Abstract
Detailed domestic stock energy models can be used to help formulate optimum energy reduction strategies. However, there will always be considerable uncertainty related to their predictions due to the complexity of the housing stock and the many assumptions required to implement the models. This paper presents a simple Monte Carlo (MC) model that can be easily extended and/or transformed in relation to data available for investigating and quantifying uncertainties in both the housing stock model's predictions and scenario assumptions. While 90% of the MC model predictions fell within a range which is ±19% the mean value, 50% of them were within ±8% of the mean. The findings suggest that the uncertainties associated with the model predictions and scenario assumptions need to be acknowledged fully and – where possible – quantified as even fairly small variability in the influential variables may result in rather large uncertainty in the aggregated model's prediction. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
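A stock-level MC model of the kind described propagates input uncertainty through a building energy balance by repeated sampling. The sketch below uses a toy steady-state formula and purely illustrative input distributions (the coefficients and ranges are assumptions, not Belgrade stock data):

```python
import numpy as np

def mc_heating_demand(n=10000, seed=0):
    """Toy Monte Carlo propagation through a steady-state space-heating
    balance Q = U * A * HDD * 24 / eta, returning 5th/50th/95th
    percentiles of annual demand in kWh. All inputs are illustrative."""
    rng = np.random.default_rng(seed)
    u   = rng.normal(1.5, 0.15, n)       # W/m2K, mean heat-loss coefficient
    a   = rng.normal(80.0, 8.0, n)       # m2, heated-area proxy
    hdd = rng.normal(2500.0, 250.0, n)   # heating degree days
    eta = rng.uniform(0.7, 0.9, n)       # heating-system efficiency
    q = u * a * hdd * 24 / eta / 1000.0  # kWh per year
    lo, mid, hi = np.percentile(q, [5, 50, 95])
    return lo, mid, hi
```

Reporting the spread of percentiles around the median, as the abstract does with its ±19% and ±8% bands, makes the aggregated uncertainty explicit rather than hiding it in a point estimate.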
18. Profile Hellinger distance estimation.
- Author
-
Wu, Jingjing and Karunamuni, Rohana J.
- Subjects
PARAMETER estimation ,ROBUST statistics ,MAXIMUM likelihood statistics ,MONTE Carlo method ,ASYMPTOTIC efficiencies ,MATHEMATICAL models - Abstract
The successful application of the Hellinger distance approach to fully parametric models is well known. The corresponding optimal estimators, known as minimum Hellinger distance (MHD) estimators, are efficient and have excellent robustness properties [Beran R. Minimum Hellinger distance estimators for parametric models. Ann Statist. 1977;5:445–463]. This combination of efficiency and robustness makes MHD estimators appealing in practice. However, their application to semiparametric statistical models, which have a nuisance parameter (typically of infinite dimension), has not been fully studied. In this paper, we investigate a methodology to extend the MHD approach to general semiparametric models. We introduce the profile Hellinger distance and use it to construct a minimum profile Hellinger distance estimator of the finite-dimensional parameter of interest. This approach is analogous in some sense to the profile likelihood approach. We investigate the asymptotic properties such as the asymptotic normality, efficiency, and adaptivity of the proposed estimator. We also investigate its robustness properties. We present its small-sample properties using a Monte Carlo study. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
19. Bootstrap order selection for SETAR models.
- Author
-
Fenga, Livio and Politis, Dimitris N.
- Subjects
STATISTICAL bootstrapping ,MONTE Carlo method ,INFORMATION theory ,AUTOREGRESSIVE models ,MATHEMATICAL models ,ARTIFICIAL neural networks - Abstract
In this paper, we investigate the selecting performances of a bootstrapped version of the Akaike information criterion for nonlinear self-exciting threshold autoregressive-type data generating processes. Empirical results will be obtained via Monte Carlo simulations. The quality of our method is assessed by comparison with its non-bootstrap counterpart and through a novel procedure based on artificial neural networks. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
20. On periodic autoregressive stochastic volatility models: structure and estimation.
- Author
-
Boussaha, Nadia and Hamdi, Fayçal
- Subjects
MARKET volatility ,MONTE Carlo method ,KALMAN filtering ,ESTIMATION theory ,TIME series analysis ,MATHEMATICAL models - Abstract
To capture both the volatility evolution and the periodicity feature in the autocorrelation structure exhibited by many nonlinear time series, a Periodic AutoRegressive Stochastic Volatility (PAR-SV) model is proposed. Some probabilistic properties, namely the strict and second-order periodic stationarity, are provided. Furthermore, conditions for the existence of higher-order moments are established. The autocovariance structure of the squares and higher-order powers of the PAR-SV process is studied. Its dynamic properties are shown to be consistent with financial time series empirical findings. Ways in which the model may be estimated are discussed. Finally, a simulation study of the performance of the proposed estimation methods is provided and the PAR-SV is applied to model the spot rates of the euro and US dollar both against the Algerian dinar. The empirical analysis shows that the proposed PAR-SV model can be considered as a viable alternative to the periodic generalized autoregressive conditionally heteroscedastic (PGARCH) model. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
21. Acceptance Criteria of Bearings-only Passive Target Tracking Solution.
- Author
-
Koteswara Rao, S., Kavitha Lakshmi, M., Jahan, Kausar, Naga Divya, G., and Omkar Lakshmi Jagan, B.
- Subjects
MONTE Carlo method ,KALMAN filtering ,SUBMARINES (Ships) ,COVARIANCE matrices ,STANDARD deviations ,MATHEMATICAL models - Abstract
In passive target tracking, target motion parameters (TMP) i.e. range, course, and speed are estimated using bearing measurements. In the simulation mode, the accuracy of the estimated solution can be calculated, as true values are available. In the actual scenario, the true values of TMP are not readily obtainable. In this research, the focus is laid on the evolution of the acceptance criterion of the TMP in the actual scenario. Detailed mathematical modeling is carried out and an unscented Kalman filter (UKF) is used to estimate TMP. The state vector covariance matrix of UKF is utilized to express the errors in estimated TMP and an acceptance criterion is derived in terms of the standard deviation of errors in estimated TMP. Submarine to submarine tracking is carried out and scenarios are chosen accordingly. Several low angle on target bow (ATB) scenarios are used in the Monte-Carlo simulation and the results are presented. A similar procedure can be carried out for medium and high ATB scenarios and a submarine to ship or vice-versa passive target tracking. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
22. Methods for probability distributions estimation of indoor environmental parameters and long-term IEQ assessment.
- Author
-
Wu, Hangzi, Sun, Xiaoying, and Wu, Yue
- Subjects
MONTE Carlo method ,PROBABILITY density function ,TIME series analysis ,MATHEMATICAL models ,ENVIRONMENTAL quality ,DISTRIBUTION (Probability theory) - Abstract
To overcome the difficulties in comparing enormous time series for indoor environmental parameters and make use of technological developments in previous research, this study considers environmental parameters from a new perspective, their probability density functions (PDFs). PDFs are combined with existing indoor environmental quality (IEQ) mathematical models to assess the long-term IEQ. Two methodologies were developed: probability distribution estimation of indoor environmental parameters and long-term IEQ distribution assessment based on the Monte Carlo method. The effectiveness of the developed methodologies was illustrated in a three-month IEQ assessment of an office. PDFs were obtained with specific mathematical expressions for the three-month air temperature, sound level and illuminance. The three-month distributions for thermal, visual, acoustic and overall environmental quality were presented using eight previous IEQ mathematical models. PDFs have the advantage of using only a few main parameters instead of an enormous time series to explain the behaviour and characteristics of environmental parameters. PDFs can also potentially determine the commonalities of environmental distributions for different buildings. The obtained IEQ distributions present a straightforward and comprehensive impression of the long-term IEQ, rather than a simple index. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
23. Secrecy Analysis with MRC/SC-Based Eavesdropper over Heterogeneous Channels.
- Author
-
Tran, Duc-Dung, Ha, Dac-Binh, Tran-Ha, Vu, and Hong, Een-Kee
- Subjects
INFORMATION theory ,RADIO transmitter fading ,SIGNAL-to-noise ratio ,HETEROGENEOUS computing ,MONTE Carlo method ,MATHEMATICAL models - Abstract
In the perspective of information-theoretic security, inspired by the heterogeneity between legal and illegal channels, we consider the physical layer secrecy performance of a novel system model with a single antenna employed at both the transmitter and the legal receiver, in the presence of a multiple-antenna passive eavesdropper over dissimilar fading channels, where the legal/illegal channels are subject to Rayleigh/Nakagami-m and Nakagami-m/Rayleigh fading, respectively. We assume that the eavesdropper employs either maximal-ratio combining or selection combining reception in order to benefit from space diversity. Specifically, exact closed-form expressions for the probability of existence of secrecy capacity and the secrecy outage probability are derived by utilizing statistical characteristics of the signal-to-noise ratio. Moreover, the results reveal that the secrecy performance improves when Nakagami-m fading is assumed on the main link, due to the line-of-sight component. Finally, the analytical results are verified by Monte Carlo simulation. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
24. Correlation intensity index: mathematical modeling of cytotoxicity of metal oxide nanoparticles.
- Author
-
Ahmadi, Shahin, Toropova, Alla P., and Toropov, Andrey A.
- Subjects
MONTE Carlo method ,METAL nanoparticles ,METALLIC oxides ,QSAR models ,MATHEMATICAL models ,STRUCTURE-activity relationships - Abstract
Metal oxide nanoparticles (MO-NPs) have unique structural characteristics, exceptionally high surface area, strong mechanical stability, and catalytic activities, and are biocompatible. Consequently, MO-NPs have recently attracted considerable interest in the field of imaging-guided therapeutic and biosensing applications. This study aims to develop Quantitative Structure–Activity Relationships (QSAR) for the prediction of cell viability of MO-NPs. The QSAR model is based on so-called optimal descriptors, which are calculated with the simplified molecular input-line entry system (SMILES). The Monte Carlo technique is applied to calculate correlation weights for SMILES fragments; the optimal descriptor for a SMILES string is the summation of these correlation weights. The model of cytotoxicity is a one-variable correlation between cytotoxicity and the above optimal descriptor. The Correlation Intensity Index (CII) is a possible criterion of the predictive potential of the model. Applying the CII as a component of the target function in the Monte Carlo optimization routine employed by the CORAL program, which is designed to find a predictive relationship between the optimal descriptor and cytotoxicity of MO-NPs, improves the statistical quality of the model. The significance of different eclectic features, in terms of whether they increase or decrease cell viability (i.e. decrease or increase cytotoxicity), is also discussed. Numerical data on 83 experimental samples of MO-NP activity under different conditions, taken from the literature, are applied for the "nano-QSAR" analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
25. New two-stage sampling designs based on neoteric ranked set sampling.
- Author
-
Taconeli, Cesar Augusto and Cabral, Angelo da Silva
- Subjects
SIMULATION methods & models ,MONTE Carlo method ,PINE ,MEAN square algorithms ,MATHEMATICAL models - Abstract
Neoteric ranked set sampling (NRSS) is a recently developed sampling plan, derived from the well-known ranked set sampling (RSS) scheme. It has already been proved that NRSS provides more efficient estimators for population mean and variance compared to RSS and other sampling designs based on ranked sets. In this work, we propose and evaluate the performance of some two-stage sampling designs based on NRSS. Five different sampling schemes are proposed. Through an extensive Monte Carlo simulation study, we verified that all proposed sampling designs outperform RSS, NRSS, and the original double RSS design, producing estimators for the population mean with a lower mean square error. Furthermore, as with NRSS, two-stage NRSS estimators present some bias for asymmetric distributions. We complement the study with a discussion on the relative performance of the proposed estimators. Moreover, an additional simulation based on data of the diameter and height of pine trees is presented. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
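The efficiency gain from ranked-set-style designs can be checked with a small simulation. The sketch below implements an NRSS-style draw for odd set size m (rank a single set of m*m units jointly, then measure the units at ranks (i-1)m + (m+1)/2), assuming perfect ranking; the paper's two-stage designs are not reproduced here.

```python
import random

def nrss_sample(population, m, rng):
    """Neoteric RSS draw for odd m, with perfect ranking assumed."""
    units = rng.sample(population, m * m)   # one set of m^2 units
    units.sort()                            # joint ranking of all m^2 units
    half = (m + 1) // 2
    return [units[(i - 1) * m + half - 1] for i in range(1, m + 1)]

rng = random.Random(42)
population = [rng.gauss(10.0, 2.0) for _ in range(10_000)]

# Monte Carlo comparison of the NRSS mean estimator against simple random sampling.
m, reps = 3, 2000
nrss_means = [sum(nrss_sample(population, m, rng)) / m for _ in range(reps)]
srs_means = [sum(rng.sample(population, m)) / m for _ in range(reps)]

def var(xs):
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / (len(xs) - 1)

nrss_var, srs_var = var(nrss_means), var(srs_means)
```

The NRSS mean should show a visibly smaller Monte Carlo variance than the SRS mean for the same number of measured units.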
26. The Importance of Scale in Spatially Varying Coefficient Modeling.
- Author
-
Murakami, Daisuke, Lu, Binbin, Harris, Paul, Brunsdon, Chris, Charlton, Martin, Nakaya, Tomoki, and Griffith, Daniel A.
- Subjects
STATISTICS ,REGRESSION analysis ,MULTIVARIATE analysis ,MONTE Carlo method ,MATHEMATICAL models ,NUMERICAL analysis - Abstract
Copyright of Annals of the American Association of Geographers is the property of Taylor & Francis Ltd and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2019
- Full Text
- View/download PDF
27. Classical density functional theory meets Monte-Carlo simulations.
- Author
-
Roth, Roland and Hansen-Goos, Hendrik
- Subjects
DENSITY functional theory ,MONTE Carlo method ,COMPUTER simulation ,DISTRIBUTION (Probability theory) ,MATHEMATICAL models ,PROBABILITY theory - Abstract
Typically the quality of an approximate density functional is evaluated by a direct comparison of its predictions in a given test case to exact data obtained by computer simulations. An important example of such an approach is the test of the equilibrium structure of a simple fluid as measured by the pair distribution function g(r) or the cavity correlation function y(r). However, the combination of exact density profiles and the analytical structure of density functional theory allows one to determine and potentially improve the quality of a functional in a more sophisticated way. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
28. Sparsity identification for high-dimensional partially linear model with measurement error.
- Author
-
Li, Rui and Zhao, Haibing
- Subjects
LINEAR statistical models ,MATHEMATICAL models ,MATHEMATICAL statistics ,ESTIMATION theory ,MONTE Carlo method - Abstract
In this article, we studied the identification of significant predictors in a partially linear model in which some regressors are contaminated with random errors. Moreover, the dimension of the parametric component is divergent and the regression coefficients are sparse. We applied a difference technique to remove the nonparametric component, circumventing the selection of bandwidth, and constructed a bias-corrected shrinkage estimator for the coefficients using the smoothly clipped absolute deviation (SCAD) penalty. We then derived estimation and selection consistency and established the asymptotic distribution of the identified significant estimators. Finally, Monte Carlo studies illustrate the performance of our approach. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
29. Estimating ‘gamma’ for tail-hedge discount rates when project returns are cointegrated with GDP.
- Author
-
Mantalos, Panagiotis and Hultkrantz, Lars
- Subjects
INVESTMENTS ,GROSS domestic product ,MONTE Carlo method ,GROSS national product ,MATHEMATICAL models - Abstract
Martin Weitzman has suggested a method for calculating social discount rates for long-term investments when project returns are covariant with consumption or other macroeconomic variables, so-called 'tail-hedge discounting'. This method relies on a parameter called 'real project gamma' that measures the proportion of project returns that is covariant with the macroeconomic variable. We compare two approaches for estimating this gamma when the project returns and the macroeconomic variable are cointegrated: first, Weitzman's own approach, and second, a simple data transformation that keeps gamma within the zero-to-one interval. In a Monte Carlo experiment, we show that the method using a standardized series is better and robust under different data-generating processes. Both approaches are applied to annual Swedish time-series data from 1950-2011 on rail freight (a measure of returns from rail investments) and GDP. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
30. A statistical approach to estimate state variables in flow-accelerated corrosion problems.
- Author
-
de Moura, Bruno Furtado, da Silva, Wellington Betencurte, de Macêdo, Marcelo Camargo Severo, and Martins, Márcio Ferreira
- Subjects
CORROSION & anti-corrosives ,CORROSION resistance ,BAYESIAN analysis ,MONTE Carlo method ,SEQUENTIAL analysis ,ACCURACY ,MATHEMATICAL models - Abstract
Sequential Monte Carlo, or particle filter, methods have been widely used to deal with sequential Bayesian inference problems in several fields of knowledge. The technique approximates the sequence of probability distributions of interest by means of a large set of random samples (particles) that are propagated over time using a sampling importance (SI) distribution; a re-sampling step is also used to improve the predictive probability. In this study, a methodology is proposed that applies Bayesian filters to a state-estimation problem involving the amount of corrosion over time in a contraction-expansion geometry, with the aid of computational fluid dynamics to improve the accuracy of the results. The following filters were applied and compared: the Sampling Importance Re-sampling (SIR) filter and the Auxiliary Sampling Importance Re-sampling (ASIR) filter. The corrosion model adopted is based on a double resistance due to oxygen diffusion towards the wall through the hydrodynamic boundary layer and the oxide layer. Mass-loss data over time are obtained from the literature to compare corrosion rates, and the influence of corrosion products on the rates of corrosion is discussed. The best results in corrosion damage estimation were obtained using the ASIR filter. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
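A minimal SIR particle filter of the kind compared in the entry above can be sketched for a toy scalar state-space model (random-walk state with Gaussian noise; the corrosion model itself is not reproduced here):

```python
import math
import random

random.seed(1)
N = 500                                  # number of particles
q, r = 0.1, 0.5                          # process / observation noise std

# Simulate a ground-truth random-walk trajectory and noisy observations.
truth, obs = [], []
x = 0.0
for _ in range(50):
    x += random.gauss(0.0, q)
    truth.append(x)
    obs.append(x + random.gauss(0.0, r))

particles = [0.0] * N
estimates = []
for y in obs:
    # 1. Propagate each particle through the state model (importance sampling).
    particles = [p + random.gauss(0.0, q) for p in particles]
    # 2. Weight each particle by the observation likelihood.
    weights = [math.exp(-0.5 * ((y - p) / r) ** 2) for p in particles]
    s = sum(weights)
    weights = [w / s for w in weights]
    # 3. State estimate = weighted posterior mean.
    estimates.append(sum(w * p for w, p in zip(weights, particles)))
    # 4. Re-sample particles in proportion to their weights.
    particles = random.choices(particles, weights=weights, k=N)

rmse = math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truth)) / len(truth))
```

The filtered RMSE should come out well below the raw observation noise of 0.5.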
31. Testing homogeneity of difference of two proportions for stratified correlated paired binary data.
- Author
-
Shen, Xi and Ma, Chang-Xing
- Subjects
STATISTICS ,MONTE Carlo method ,DATA analysis ,MATHEMATICAL models ,SIMULATION methods & models - Abstract
In ophthalmologic or otolaryngologic studies, each subject may contribute paired organ measurements to the analysis, and a number of statistical methods have been proposed for such bilateral correlated data. In practice, it is important to detect a confounder-by-treatment interaction, since ignoring a confounding effect may lead to unreliable conclusions; stratified data analysis can therefore be used to adjust for the effect of a confounder on statistical inference. In this article, we investigate and derive three test procedures for testing the homogeneity of the difference of two proportions for stratified correlated paired binary data on the basis of an equal-correlation model assumption. The performance of the proposed test procedures is examined through Monte Carlo simulation. The simulation results show that the Score test is usually robust in type I error control with high power, and it is therefore recommended among the three methods. One example from an otolaryngologic study is given to illustrate the three test procedures. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
32. Extropy estimators with applications in testing uniformity.
- Author
-
Qiu, Guoxin and Jia, Kai
- Subjects
MONTE Carlo method ,SPACING (Orthography) ,ORTHOGRAPHY & spelling ,MATHEMATICAL models ,DATA analysis - Abstract
Two estimators for the extropy of an absolutely continuous random variable with known support were introduced using spacings. It is shown that the proposed estimators are consistent and that their mean square errors are shift invariant. Their behaviour was also studied by means of real data and Monte Carlo simulation. The better-performing estimator in the Monte Carlo experiment was then used to develop a goodness-of-fit test for the standard uniform distribution. Comparing its power with that of other tests for uniformity shows that the proposed extropy-based test performs well. [ABSTRACT FROM PUBLISHER]
- Published
- 2018
- Full Text
- View/download PDF
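A Vasicek-style spacing estimator of extropy, J(X) = -(1/2) times the integral of f(x)^2, can be sketched as below; the window-size choice and boundary handling are simple placeholders and may differ from the paper's two estimators. For Uniform(0,1) the true extropy is -1/2.

```python
import random

def extropy_spacing(sample, m=None):
    """Spacing-based extropy estimate: J ≈ -(1/2n) * sum of density estimates."""
    x = sorted(sample)
    n = len(x)
    if m is None:
        m = max(1, round(n ** 0.5))              # window-size heuristic
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]
        hi = x[min(i + m, n - 1)]
        c = (min(i + m, n - 1) - max(i - m, 0)) / n  # probability mass in window
        total += c / (hi - lo)                   # density estimate f(x_i)
    return -total / (2 * n)

random.seed(7)
sample = [random.random() for _ in range(5000)]  # Uniform(0,1): J = -1/2
j_hat = extropy_spacing(sample)
```

On a large uniform sample the estimate should land near -0.5.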
33. Two and three-particles spatial correlation in weak coupling plasma and applications.
- Author
-
Ababsa, H., Meftah, M. T., and Chohra, T.
- Subjects
STATISTICAL correlation ,MONTE Carlo method ,ALGORITHMS ,COMPUTER simulation ,MATHEMATICAL models - Abstract
In this work, we studied the pair and triplet correlation functions in a plasma composed of particles interacting via the Debye screening potential under weak coupling (Γ ≤ 1). The pair correlation functions are analysed numerically and compared to hypernetted-chain data, to Monte Carlo data, and to the results of Abramo and Tosi. The triplet correlation functions are also analysed numerically and compared to the Kirkwood superposition approximation. Finally, we give some applications to the static structure factor and the dielectric constant. The results are quite satisfactory. [ABSTRACT FROM PUBLISHER]
- Published
- 2018
- Full Text
- View/download PDF
34. Role of the surface anchoring energy on the spontaneous modulated pattern formation of hybrid aligned cholesteric liquid crystals.
- Author
-
Biagio, R. L., de Souza, R. Teixeira, Evangelista, L. R., and Zola, R. S.
- Subjects
CHOLESTERIC liquid crystals ,LIQUID crystals ,MONTE Carlo method ,ENERGY density ,DIFFRACTION gratings ,OPTICAL devices ,MATHEMATICAL models - Abstract
This work focuses on the striped pattern formation of cholesteric liquid crystals confined in a hybrid aligned cell, studied by means of Monte Carlo simulations with a chiral molecular-additive pair potential proposed for cholesterics by discretizing the Frank free energy density. Experimentally, the slab-thickness-to-cholesteric-pitch ratio (d/p0) is fundamental in describing how the stripes behave, but no deep studies on the role of the anchoring energy have been reported to date. According to the simulation results, it is possible to observe a textural transition that depends both on the material's physical parameters and on the anchoring strength. This transition, from planar to a spontaneous striped pattern structure, occurs when the anchoring energy at the homeotropic surface reaches a critical value Jcrit. Moreover, the oscillation amplitudes of the directors immediately below the top surface change with the anchoring energy and with the physical parameters of the host liquid crystal. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
35. Testing exponentiality based on Kullback-Leibler information for progressively Type II censored data.
- Author
-
Noughabi, Hadi Alizadeh
- Subjects
ENTROPY ,MONTE Carlo method ,ARTIFICIAL neural networks ,TIME series analysis ,NONLINEAR theories ,MATHEMATICAL models - Abstract
In many life-testing and reliability experiments, data are censored in order to reduce the cost and time associated with testing, and since the conventional Type-I and Type-II censoring schemes are not flexible enough, progressive censoring has been developed. In this article, we develop a general goodness-of-fit test using a new estimate of Kullback–Leibler information based on progressively Type-II censored data. Consistency and other properties of the proposed test are shown. We then use the proposed test statistic to test for exponentiality based on progressively Type-II censored data. The power values of the proposed test under different progressively Type-II censoring schemes are computed through Monte Carlo simulations. The proposed test is observed to be quite powerful compared with the test proposed by Balakrishnan et al. (2007). Two real datasets from the progressive censoring literature are finally presented for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
36. Ties in one block comparison experiments: a generalization of the Mallows-Bradley-Terry ranking model.
- Author
-
Sawadogo, Amadou, Dossou-Gbété, Simplice, and Lafon, Dominique
- Subjects
RANKING ,MAXIMUM likelihood statistics ,MONTE Carlo method ,MARKOV chain Monte Carlo ,MARKOV processes ,ASYMPTOTIC normality ,MATHEMATICAL models - Abstract
This study is concerned with the extension of the Mallows-Bradley-Terry ranking model for one-block comparisons, consisting of all the items of interest, to situations that allow an expression of no preference. We consider a modification of the Mallows-Bradley-Terry ranking model that introduces an additional parameter, called an index of discrimination, which permits ties in the model. The maximum likelihood estimates of the parameters are found using a maximization-minimization algorithm, where the mathematical expectations involved in the log-likelihood equation are evaluated by generating Markov chain Monte Carlo samples from the stationary distribution. In addition, a simulation study assessing asymptotic properties has been carried out. The proposed method is applied to analyze election data. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
37. Procedure to identify outliers through cumulative distribution of extremes in a Gamma response model.
- Author
-
Resende, Mariana, Brighenti, Carla Regina Guimarães, and Cirillo, Marcelo Ângelo
- Subjects
CUMULATIVE distribution function ,DISTRIBUTION (Probability theory) ,MONTE Carlo method ,MATHEMATICAL models ,PROBABILITY theory - Abstract
This work proposes a procedure based on the cumulative distribution of maximums and minimums to identify outliers in generalized Gamma-response models. To validate the method, we used simulation scenarios defined by combinations of different sample sizes, contamination rates, and distributions with different degrees of asymmetry. In this context, probabilities related to classification errors and accuracy were obtained by carrying out Monte Carlo simulations. Using the cumulative distribution of extremes to identify outliers in a Gamma-response model is recommended, since it is unlikely to produce classification errors and was highly accurate in all assessed scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
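The extreme-value idea above can be sketched directly: under a fitted Gamma model, P(max ≤ x) = F(x)^n, so an observation far beyond a high quantile of the null distribution of the sample maximum is flagged as an outlier. The shape, scale, and cut-off below are illustrative assumptions, not the paper's settings.

```python
import random

random.seed(3)
shape, scale, n = 2.0, 1.0, 30

# Monte Carlo approximation of the null distribution of the maximum of a
# Gamma(shape, scale) sample of size n.
sim_max = sorted(
    max(random.gammavariate(shape, scale) for _ in range(n))
    for _ in range(5000)
)
threshold = sim_max[int(0.99 * len(sim_max))]   # 99th percentile of the max

clean = [random.gammavariate(shape, scale) for _ in range(n)]
contaminated = clean[:-1] + [threshold * 3.0]   # plant one gross outlier

flag_clean = max(clean) > threshold             # rarely flagged (1% by design)
flag_contaminated = max(contaminated) > threshold
```

The planted value, being three times the 99th percentile of the simulated maxima, is always flagged.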
38. Merging data from multiple sources: pretest and shrinkage perspectives.
- Author
-
Shah, Muhammad Kashif Ali, Lisawadi, Supranee, and Ejaz Ahmed, S.
- Subjects
FINITE element method ,HYPOTHESIS ,MONTE Carlo method ,MATHEMATICAL models ,STATISTICS ,DISTRIBUTION (Probability theory) - Abstract
In this article, we have developed asymptotic theory for the simultaneous estimation of the k means of arbitrary populations under the common mean hypothesis, further assuming that the corresponding population variances are unknown and unequal. The unrestricted estimator, the Graybill-Deal-type restricted estimator, the preliminary test estimator, and Stein-type shrinkage estimators are suggested. A large-sample test statistic is also proposed as a pretest for the common mean hypothesis. Under a sequence of local alternatives and squared error loss, we compare the asymptotic properties of the estimators by means of asymptotic distributional quadratic bias and risk. Comprehensive Monte Carlo simulation experiments were conducted to study the relative risk performance of the estimators, with reference to the unrestricted estimator, in finite samples. Two real-data examples are also furnished to illustrate the application of the suggested estimation strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
39. Heteroskedasticity-robust semi-parametric GMM estimation of a spatial model with space-varying coefficients.
- Author
-
Wei, Hongjie and Sun, Yan
- Subjects
HETEROSCEDASTICITY ,PARAMETRIC modeling ,SPATIAL variation ,AUTOREGRESSION (Statistics) ,MONTE Carlo method ,MATHEMATICAL models - Abstract
Copyright of Spatial Economic Analysis is the property of Routledge and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2017
- Full Text
- View/download PDF
40. Notes on estimation of the proportion ratio in the presence of carryover effects under the AB/BA crossover trial.
- Author
-
Lui, Kung-Jong
- Subjects
ESTIMATION theory ,ESTIMATION bias ,MONTE Carlo method ,MATHEMATICAL models ,STATISTICAL sampling ,STOCHASTIC processes - Abstract
The authors focus on estimation of the proportion ratio (PR) in the presence of residual (carryover) effects under the AB/BA design. Under a random-effects multiplicative risk model, we develop three point estimators and three interval estimators for the PR that account for residual effects. Using Monte Carlo simulations, we compare the performance of these point (or interval) estimators with point (or interval) estimators assuming no residual effects with respect to bias and mean squared error (or coverage probability and average length). The authors use data taken from an AB/BA trial comparing two new inhalation devices to illustrate the use of these estimators. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
41. A fractionally integrated Wishart stochastic volatility model.
- Author
-
Asai, Manabu and McAleer, Michael
- Subjects
MARKET volatility ,DIFFUSION processes ,BROWNIAN motion ,GENERALIZED method of moments ,ASYMPTOTIC theory in econometrics ,ASYMPTOTIC distribution ,LAPLACE transformation ,MONTE Carlo method ,MATHEMATICAL models - Abstract
There has recently been growing interest in modeling and estimating alternative continuous time multivariate stochastic volatility models. We propose a continuous time fractionally integrated Wishart stochastic volatility (FIWSV) process, and derive the conditional Laplace transform of the FIWSV model in order to obtain a closed form expression of moments. A two-step procedure is used, namely estimating the parameter of fractional integration via the local Whittle estimator in the first step, and estimating the remaining parameters via the generalized method of moments in the second step. Monte Carlo results for the procedure show a reasonable performance in finite samples. The empirical results for the S&P 500 and FTSE 100 indexes show that the data favor the new FIWSV process rather than the one-factor and two-factor models of the Wishart autoregressive process for the covariance structure. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
42. US stock returns: are there seasons of excesses?
- Author
-
Bee, Marco, Dupuis, Debbie J., and Trapin, Luca
- Subjects
STOCKS (Finance) ,MONTE Carlo method ,ALGORITHM software ,CAPITAL market ,MATHEMATICAL models ,FINANCIAL markets - Abstract
This article explores the existence of seasonality in the tails of stock returns. We use a parametric model to describe the returns, and obtain a proxy of the innovation distribution via a pre-processing model. Then, we develop a change-point algorithm capturing changes in the tails of the innovations. We confirm the good performance of the procedure through extensive Monte Carlo experiments. An empirical investigation using US stocks data shows that while the lower tail of the innovations is approximately constant over the year, the upper tail is larger in Winter than in Summer, in 9 out of 12 industries. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
43. Estimation using hybrid censored data from a generalized inverted exponential distribution.
- Author
-
Tripathi, Yogesh Mani and Rastogi, Manoj Kumar
- Subjects
MONTE Carlo method ,DATA analysis ,ENTROPY ,GAMES of chance ,MATHEMATICAL models - Abstract
We consider point and interval estimation of the unknown parameters of a generalized inverted exponential distribution in the presence of hybrid censoring. The maximum likelihood estimates are obtained using the EM algorithm, and the Fisher information matrix is computed using the missing value principle. Bayes estimates are derived under squared error and general entropy loss functions; approximate Bayes estimates are obtained using the Tierney and Kadane method as well as an importance sampling approach. Asymptotic and highest posterior density intervals are also constructed. The proposed estimates are compared numerically using Monte Carlo simulations, and a real data set is analyzed for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
44. Statistical inference for Sobol pick-freeze Monte Carlo method.
- Author
-
Gamboa, F., Janon, A., Klein, T., Lagnoux, A., and Prieur, C.
- Subjects
INFERENTIAL statistics ,MONTE Carlo method ,MATHEMATICAL models ,SENSITIVITY analysis ,ESTIMATION theory - Abstract
Many mathematical models involve input parameters, which are not precisely known. Global sensitivity analysis aims to identify the parameters whose uncertainty has the largest impact on the variability of a quantity of interest (output of the model). One of the statistical tools used to quantify the influence of each input variable on the output is the Sobol sensitivity index. We consider the statistical estimation of this index from a finite sample of model outputs. We study asymptotic and non-asymptotic properties of two estimators of Sobol indices. These properties are applied to significance tests and estimation by confidence intervals. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
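The pick-freeze construction estimates a first-order Sobol index as S_i = Cov(Y, Y^i)/Var(Y), where Y^i reuses ("freezes") X_i and redraws the other inputs. A sketch on the toy model Y = X1 + 0.5*X2 with independent U(0,1) inputs, for which S_1 = 0.8 exactly (model and sample size are illustrative choices):

```python
import random

random.seed(5)
N = 20000
f = lambda x1, x2: x1 + 0.5 * x2          # toy model; Var(Y) = 1.25/12

y, y_pf = [], []
for _ in range(N):
    x1, x2 = random.random(), random.random()
    y.append(f(x1, x2))
    y_pf.append(f(x1, random.random()))   # freeze X1, redraw X2

# Pick-freeze estimator: S1 = Cov(Y, Y^1) / Var(Y).
my = sum(y) / N
mpf = sum(y_pf) / N
cov = sum(a * b for a, b in zip(y, y_pf)) / N - my * mpf
var = sum(a * a for a in y) / N - my * my
s1_hat = cov / var
```

With 20,000 pick-freeze pairs the estimate should sit close to the exact value 0.8.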
45. A stochastic optimization framework for the restoration of an over-exploited aquifer.
- Author
-
Mylopoulos, N. and Sidiropoulos, P.
- Subjects
AQUIFERS ,MATHEMATICAL models ,HYDRAULIC conductivity ,STOCHASTIC analysis ,MONTE Carlo method ,GROUNDWATER flow ,MATHEMATICAL optimization - Abstract
This study investigates the impact of hydraulic conductivity uncertainty on the sustainable management of the aquifer of Lake Karla, Greece, using the stochastic optimization approach. The lack of surface water resources in combination with the sharp increase in irrigation needs in the basin over the last 30 years have led to an unprecedented degradation of the aquifer. In addition, the lack of data regarding hydraulic conductivity in a heterogeneous aquifer leads to hydrogeologic uncertainty. This uncertainty has to be taken into consideration when developing the optimization procedure in order to achieve the aquifer's sustainable management. Multiple Monte Carlo realizations of this spatially-distributed parameter are generated and groundwater flow is simulated for each one of them. The main goal of the sustainable management of the 'depleted' aquifer of Lake Karla is two-fold: to determine the optimum volume of renewable groundwater that can be extracted, while, at the same time, restoring its water table to a historic high level. A stochastic optimization problem is therefore formulated, based on the application of the optimization method for each of the aquifer's multiple stochastic realizations in a future period. In order to carry out this stochastic optimization procedure, a modelling system consisting of a series of interlinked models was developed. The results show that the proposed stochastic optimization framework can be a very useful tool for estimating the impact of hydraulic conductivity uncertainty on management strategies for restoring a depleted aquifer. They also show that the optimization process is affected more by hydraulic conductivity uncertainty than the simulation process. Editor Z.W. Kundzewicz; Guest editor S. Weijs. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
46. An empirical Bayes' procedure for ranking players in Ryder Cup golf.
- Author
-
Baker, Rose D. and McHale, Ian G.
- Subjects
GOLFERS ,RYDER Cup (Golf tournament) ,GOLF tournaments ,MONTE Carlo method ,RANK correlation (Statistics) ,RANKING ,MATHEMATICAL models - Abstract
We describe a model to obtain strengths and rankings of players appearing in golf's Ryder Cup. Obtaining rankings is complicated for two reasons: first, competitors do not compete on an equal number of occasions, with some appearing too infrequently for their ranking to be estimated with any degree of certainty; and second, different competitors experience different levels of volatility in results. Our approach is to assume the competitor strengths are drawn from some common distribution. For small numbers of competitors, as is the case here, we fit the model using Monte Carlo integration. Results suggest there is very little difference between the top-performing players, though Scotland's Colin Montgomerie is estimated to be the strongest Ryder Cup player. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
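The "common distribution plus Monte Carlo integration" approach can be sketched in a toy two-player Bradley-Terry-style version: draw strength differences from a standard normal prior and weight them by the match likelihood. The head-to-head record and the prior are invented assumptions for illustration, not the paper's model.

```python
import math
import random

random.seed(11)
wins_a, losses_a = 7, 3         # hypothetical head-to-head record: A vs B

def likelihood(da):
    """Bradley-Terry-style likelihood of the record given strength gap da."""
    p = 1.0 / (1.0 + math.exp(-da))
    return p ** wins_a * (1 - p) ** losses_a

# Monte Carlo integration: draw from the N(0,1) prior on the strength
# difference, weight by the likelihood, and form the posterior mean.
draws = [random.gauss(0.0, 1.0) for _ in range(50_000)]
weights = [likelihood(d) for d in draws]
posterior_mean_diff = sum(d * w for d, w in zip(draws, weights)) / sum(weights)
```

Since A won 7 of 10 matches, the posterior mean strength difference comes out positive but shrunk toward the prior mean of zero.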
47. SUR Approach for IV Estimation of Canonical Contagion Models.
- Author
-
Shin, Dong Wan, Kim, Hyo Jin, and Seo, Jinwook
- Subjects
REGRESSION analysis ,ESTIMATION theory ,STATISTICAL correlation ,HETEROSCEDASTICITY ,MONTE Carlo method ,MATHEMATICAL models - Abstract
The seemingly unrelated regression (SUR) method is applied to instrumental variable (IV) estimation of canonical contagion models. A finite-sample Monte Carlo experiment shows that the resulting estimator, the IV-SUR estimator, is substantially better than the existing IV estimator in terms of both bias and mean squared error under diverse configurations of instruments, conditional heteroscedasticity, and cross-sectional correlation. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
48. Is spurious behaviour an issue for two independent stationary spatial autoregressive SAR(1) processes?
- Author
-
Agiakloglou, Christos, Tsimbos, Cleon, and Tsimpanos, Apostolos
- Subjects
ECONOMETRICS ,SPATIAL analysis (Statistics) ,MONTE Carlo method ,AUTOCORRELATION (Statistics) ,ESTIMATION theory ,REGRESSION analysis ,MATHEMATICAL models - Abstract
Spurious regression occurs when two independent stationary or nonstationary time series are found to be correlated, and spurious behaviour has also been detected in spatial data. Using a Monte Carlo analysis, this study examines the spurious phenomenon for two independent stationary spatial autoregressive processes of order one, SAR(1), and finds that when spatial econometric models are estimated, as suggested by the LM specification tests, neither spurious behaviour nor the presence of spatially autocorrelated errors is detected. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
49. Rates of False-Positive Classification Resulting From the Analysis of Additional Embedded Performance Validity Measures.
- Author
-
Silk-Eglit, Graham M., Stenclik, Jessica H., Miele, Andrea S., Lynch, Julie K., and McCaffrey, Robert J.
- Subjects
NEUROPSYCHOLOGY ,FALSE positive error ,MONTE Carlo method ,COGNITIVE ability ,MATHEMATICAL models - Abstract
Several studies have documented improvements in the classification accuracy of performance validity tests (PVTs) when they are combined to form aggregated models. Fewer studies have evaluated the impact of aggregating additional PVTs and changing the classification threshold within these models. A recent Monte Carlo simulation demonstrated that to maintain a false-positive rate (FPR) of ≤.10, only 1, 4, 8, 10, and 15 PVTs should be analyzed at classification thresholds of failing at least 1, at least 2, at least 3, at least 4, and at least 5 PVTs, respectively. The current study sought to evaluate these findings with embedded PVTs in a sample of real-life litigants and to highlight a potential danger in analytic flexibility with embedded PVTs. Results demonstrated that to maintain an FPR of ≤.10, only 3, 7, 10, 14, and 15 PVTs should be analyzed at classification thresholds of failing at least 1, at least 2, at least 3, at least 4, and at least 5 PVTs, respectively. Analyzing more than these numbers of PVTs resulted in a dramatic increase in the FPR. In addition, in the most extreme case, flexibility in analyzing and reporting embedded PVTs increased the FPR by 67%. Given these findings, a more objective approach to analyzing and reporting embedded PVTs should be introduced. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
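The effect described above is easy to reproduce by simulation: if each PVT is failed by a genuine responder independently with probability 0.10 (an illustrative 90%-specificity assumption, not the study's data), the false-positive rate of "fail at least k of J tests" grows quickly with J at fixed k.

```python
import random

random.seed(9)

def fpr(num_tests, threshold, sims=100_000, p_fail=0.10):
    """Monte Carlo FPR of classifying 'invalid' on failing >= threshold PVTs."""
    hits = 0
    for _ in range(sims):
        fails = sum(random.random() < p_fail for _ in range(num_tests))
        if fails >= threshold:
            hits += 1
    return hits / sims

fpr_few = fpr(num_tests=4, threshold=2)    # 4 PVTs, fail >= 2: FPR ~ 0.05
fpr_many = fpr(num_tests=15, threshold=2)  # 15 PVTs, same threshold: FPR balloons
```

At a fixed threshold of two failures, moving from 4 to 15 analyzed PVTs pushes the FPR well past the conventional .10 ceiling.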
50. Crystal nuclei in melts: a Monte Carlo simulation of a model for attractive colloids.
- Author
-
Statt, Antonia, Virnau, Peter, and Binder, Kurt
- Subjects
COLLOIDAL suspensions ,PARTICLES (Nuclear physics) ,MONTE Carlo method ,NUCLEATION ,MATHEMATICAL models ,THERMODYNAMICS - Abstract
As a model for a suspension of hard-sphere-like colloidal particles where small non-adsorbing dissolved polymers create a depletion attraction, we introduce an effective colloid–colloid potential closely related to the Asakura–Oosawa model, but without any discontinuities. In simulations, this model straightforwardly allows the calculation of the pressure from the virial formula, and the phase transition in the bulk from the liquid to the crystalline solid can be accurately located from a study in which a stable coexistence of a crystalline slab with a surrounding liquid phase occurs. For this model, crystalline nuclei surrounded by fluid are studied both by identifying the crystal–fluid interface on the particle level (using suitable bond orientational order parameters to distinguish the phases) and by 'thermodynamic' means; the latter method amounts to computing the enhancement of the chemical potential and pressure relative to their coexistence values. We show that the chemical potential can be obtained from simulating thick films, where one wall with a rather long-range repulsion is present, since near this wall the Widom particle insertion method works, exploiting the fact that the chemical potential in the system is homogeneous. Finally, the surface excess free energy of the nucleus is obtained for a wide range of nucleus volumes. From this method, it is established that classical nucleation theory works, showing that for the present model the anisotropy of the interface excess free energy of crystals, and their resulting non-spherical shape, has only a very small effect on the barrier. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF