1,275 results for "Non-linear least squares"
Search Results
2. Accounting for Skewed or One-Sided Measurement Error in the Dependent Variable
- Author
-
Christopher F. Parmeter and Daniel L. Millimet
- Subjects
Heteroscedasticity, Observational error, Variables, Sociology and Political Science, Inference, Power (physics), Non-linear least squares, Political Science and International Relations, Linear regression, Statistics, Feature (machine learning) - Abstract
While classical measurement error in the dependent variable in a linear regression framework results only in a loss of precision, nonclassical measurement error can lead to estimates that are biased and inference that lacks power. Here, we consider a particular type of nonclassical measurement error: skewed errors. Unfortunately, skewed measurement error is likely to be a relatively common feature of many outcomes of interest in political science research. This study highlights the bias that can result even from relatively “small” amounts of skewed measurement error, particularly if the measurement error is heteroskedastic. We also assess potential solutions to this problem, focusing on the stochastic frontier model and nonlinear least squares. Simulations and three replications highlight the importance of thinking carefully about skewed measurement error as well as appropriate solutions.
- Published
- 2021
3. Common correlated effect cross‐sectional dependence corrections for nonlinear conditional mean panel models
- Author
-
Sinem Hacioglu Hoke and George Kapetanios
- Subjects
Economics and Econometrics, Monte Carlo method, Asymptotic distribution, Estimator, Conditional expectation, Nonlinear system, Consistency (statistics), Non-linear least squares, Statistics, Econometrics, Social Sciences (miscellaneous), Panel data, Mathematics - Abstract
This paper provides an approach to estimation and inference for nonlinear conditional mean panel data models, in the presence of cross‐sectional dependence. We modify Pesaran's (Econometrica, 2006, 74(4), 967–1012) common correlated effects correction to filter out the interactive unobserved multifactor structure. The estimation can be carried out using nonlinear least squares, by augmenting the set of explanatory variables with cross‐sectional averages of both linear and nonlinear terms. We propose pooled and mean group estimators, derive their asymptotic distributions, and show the consistency and asymptotic normality of the coefficients of the model. The features of the proposed estimators are investigated through extensive Monte Carlo experiments. We also present two empirical exercises. The first explores the nonlinear relationship between banks' capital ratios and riskiness. The second estimates the nonlinear effect of national savings on national investment in OECD countries depending on countries' openness.
- Published
- 2020
4. Predicting the End of the COVID-19 Pandemic in Indonesia with a Simulation Based on Parametric Growth Models
- Author
-
Fransiscus Rian Pratikto
- Subjects
Model parameter, Coronavirus disease 2019 (COVID-19), Non-linear least squares, Statistics, Gompertz function, Population data, Logistic function, Confidence interval, Parametric statistics, Mathematics - Abstract
This research aims to predict the end of the COVID-19 pandemic in Indonesia based on parametric growth models. The models are chosen by considering their fit to the data of Taiwan, which is believed to have passed the peak of the pandemic and to have gone through all phases of the growth curve. The models are parameterized using the nonlinear least squares method. The deviation and confidence interval of each parameter are estimated using k-fold cross-validation and bootstrap techniques. Using total cases per million population from March 2 to June 18, 2020, it was found that two growth models fit the data, i.e., the logistic and modified Gompertz models, with the latter performing better. Using the information about the deviation of each model parameter, a simulation model is developed to predict the time at which the total cases curve starts to flatten, which is an indication of the end of the pandemic. It was found with a 95% confidence level that, based on the modified Gompertz model, the pandemic will end somewhere between March 9 and September 7, 2021 with total cases per million of 206–555. Meanwhile, based on the logistic growth model, the end of the pandemic is between August 28 and September 23, 2020 with total cases per million of 180–375. This model can be extended by building comparative scenarios with Taiwan based on measures that represent the quality of the pandemic mitigation, such as the test ratio and the intensity of social restrictions.
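As a rough illustration of the kind of growth-curve fitting described above (not the author's code or data), the sketch below fits logistic and Gompertz curves to a synthetic cumulative-case series with SciPy's nonlinear least squares and reads off a crude curve-flattening date; the parameterizations, starting values, and data are assumptions for demonstration only.

```python
# Hedged sketch: nonlinear least squares fits of logistic and Gompertz growth
# curves to a synthetic cumulative-case series (scipy.optimize.curve_fit).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: K / (1 + exp(-r (t - t0)))."""
    return K / (1.0 + np.exp(-r * (t - t0)))

def gompertz(t, K, b, c):
    """Gompertz growth: K * exp(-b * exp(-c t))."""
    return K * np.exp(-b * np.exp(-c * t))

# Synthetic "total cases per million" series (days since the first case).
t = np.arange(0, 110)
rng = np.random.default_rng(0)
y = logistic(t, 300.0, 0.08, 60.0) + rng.normal(0.0, 3.0, t.size)

# Fit both models; p0 gives rough starting values.
p_log, _ = curve_fit(logistic, t, y, p0=[y.max(), 0.1, t.mean()], maxfev=10000)
p_gom, _ = curve_fit(gompertz, t, y, p0=[y.max(), 5.0, 0.05], maxfev=10000)

for name, f, p in [("logistic", logistic, p_log), ("gompertz", gompertz, p_gom)]:
    rmse = np.sqrt(np.mean((y - f(t, *p)) ** 2))
    print(f"{name}: params={np.round(p, 3)}, RMSE={rmse:.2f}")

# A crude "end of pandemic" indicator: first day the fitted logistic curve
# reaches 99% of its estimated asymptote K (the first fitted parameter).
t_future = np.arange(0, 365)
curve = logistic(t_future, *p_log)
day_end = t_future[np.argmax(curve >= 0.99 * p_log[0])]
print("logistic curve flattens (99% of K) around day", day_end)
```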
- Published
- 2020
5. Comparison of three different statistical approaches (non-linear least-squares regression, survival analysis and Bayesian inference) in their usefulness for estimating hydrothermal time models of seed germination
- Author
-
Shirin Sharifiamina, Elena Moltchanova, Derrick J. Moot, Mark Bloomberg, and Ali Shayanfar
- Subjects
Generalized linear model, Bayesian probability, Population, Plant Science, Bayesian inference, Censoring (statistics), Regression, Non-linear least squares, Linear regression, Statistics, Mathematics - Abstract
Hydrothermal time (HTT) models describe the time course of seed germination for a population of seeds under specific temperature and water potential conditions. The parameters of the HTT model are usually estimated using either a linear regression, non-linear least squares estimation or a generalized linear regression model. There are problems with these approaches, including loss of information, and censoring and lack of independence in the germination data. Model estimation may require optimization, and this can have a heavy computational burden. Here, we compare non-linear regression with survival and Bayesian methods, to estimate HTT models for germination of two clover species. All three methods estimated similar HTT model parameters with similar root mean squared errors. However, the Bayesian approach allowed (1) efficient estimation of model parameters without the need for computation-intensive methods and (2) easy comparison of HTT parameters for the two clover species. HTT models that accounted for a species effect were superior to those that did not. Inspection of credibility intervals and estimated posterior distributions for the Bayesian HTT model shows that it is credible that most HTT model parameters were different for the two clover species, and these differences were consistent with known biological differences between species in their germination behaviour.
- Published
- 2020
6. Estimation of semiparametric varying-coefficient spatial autoregressive models with missing in the dependent variable
- Author
-
Mixia Wu, Zhen Pang, and Guowang Luo
- Subjects
Statistics and Probability, Variables, Asymptotic distribution, Estimator, Bayesian inference, Missing data, Autoregressive model, Non-linear least squares, Statistics, Imputation (statistics), Mathematics - Abstract
This paper investigates estimation of semiparametric varying-coefficient spatial autoregressive models in which the dependent variable is missing at random. An inverse propensity score weighted sieve two-stage least squares (S-2SLS) estimation with imputation is proposed. The proposed estimators are shown to be consistent whether the initial value is taken as the naive S-2SLS estimate or the naive nonlinear least squares estimate, and the asymptotic distribution of the latter is also derived. Simulation studies are carried out to investigate the performance of the proposed estimator. The method is finally illustrated with a real data set on Boston housing prices.
- Published
- 2020
7. Predicting crown width and length using nonlinear mixed-effects models: a test of competition measures using Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.)
- Author
-
Zhengyang Hou, Jinghui Meng, Fangxing Ge, and Wenwen Wang
- Subjects
Ecology, Biology, Crown (botany), Sampling (statistics), Forestry, Random effects model, Competition (biology), Nonlinear system, Non-linear least squares, Statistics, Cunninghamia, Linear least squares, Mathematics - Abstract
Including individual-tree competition indices as predictor variables could significantly improve the performance of crown width and length models for Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.). Moreover, distance-dependent competition indices are superior to distance-independent ones when modeling crown width and length. Compared with the crown width and length basic models with optimum competition indices, the performance of the two-level nonlinear mixed-effects models improved. Crown width (CW) and crown length (CL) are two important variables widely included as predictors in growth and yield models that contribute to forest management strategies. Individual-tree crown width and length models were developed with data from 1498 Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) trees in 16 sample plots located in Jiangle County, Fujian Province, southeastern China. Two hypotheses were proposed: (1) including individual-tree competition indices as predictor variables could significantly improve the performance of both the CW–DBH and CL–DBH models; and (2) the distance-dependent competition indices would perform better than distance-independent ones. The models were fitted using generalized linear least squares or generalized nonlinear least squares methods. In addition, to account for correlations between observations from the same sampling unit, we introduced age classes and sample plots as random effects to develop the two-level nonlinear mixed-effects models. We found that the introduction of competition indices could significantly improve the performance of the CW–DBH and CL–DBH models. The distance-dependent competition index (i.e., competitor-to-subject-tree distance) performed best in the crown width and length models. Compared with the crown width and length basic models with optimum competition indices, the performance of the two-level nonlinear mixed-effects models was significantly better. The two hypotheses were accepted. We hope these models will contribute to the scientific management of Chinese fir plantations.
- Published
- 2021
8. Gibbs Sampling for Bayesian Prediction of SARMA Processes
- Author
-
Ayman Amin
- Subjects
Statistics and Probability, Statistics, Bayesian probability, Inverse, Multivariate normal distribution, Management Science and Operations Research, Conjugate prior, Multiplicative SARMA models, Posterior analysis, Predictive analysis, MCMC methods, Gibbs sampler, Modeling and Simulation, Non-linear least squares, Applied mathematics, Autoregressive–moving-average model, Statistics, Probability and Uncertainty, Likelihood function, Gibbs sampling, Mathematics - Abstract
In this article we present a Bayesian prediction of multiplicative seasonal autoregressive moving average (SARMA) processes using the Gibbs sampling algorithm. First, we estimate the unobserved errors using the nonlinear least squares (NLS) method to approximate the likelihood function. Second, we employ conjugate priors on the model parameters and initial values and assume the model errors are normally distributed to derive the conditional posterior and predictive distributions. In particular, we show that the conditional posterior distributions of the model parameters and the variance are multivariate normal and inverse gamma, respectively, and the conditional predictive distribution of the future observations is multivariate normal. Finally, we use these closed-form conditional posterior and predictive distributions to apply the Gibbs sampling algorithm to approximate empirically the marginal posterior and predictive distributions, enabling us to easily carry out multiple-step-ahead predictions. We evaluate our proposed Bayesian method using a simulation study and real-world time series datasets.
- Published
- 2019
9. Density management diagram for mixed-species forests in the El Salto region, Durango, Mexico
- Author
-
Gerónimo Quiñonez-Barraza, Juan Abel Nájera-Luna, Francisco Cruz-Cobos, Víctor H. Calderón-Leal, Reyna S. Cabrera-Pérez, and Sacramento Corral-Rivas
- Subjects
Coefficient of determination, Ecology, Thinning, Mean squared error, Diagram, Sampling (statistics), Forestry, Regression, Non-linear least squares, Statistics, Maximum density, Mathematics - Abstract
Introduction: Density management diagrams (DMDs) are useful tools for characterizing and managing stand density. Objective: To develop a DMD to schedule thinnings in the natural mixed-species forests of the El Salto region, Durango. Materials and methods: The data were collected in 441 temporary sampling plots in 263 mixed-species stands composed mainly of species of the Pinus and Quercus genera. The DMD was based on the Hart-Becking index and a relationship of two allometric equations: 1) the quadratic mean diameter (dg, cm) with the density (N, trees·ha⁻¹) and dominant height (Hd, m), and 2) the volume (V, m³·ha⁻¹) with dg, Hd and N. In fitting the equations, the ordinary Nonlinear Least Squares (NLS) method was used simultaneously. The maximum density limit was estimated by potential quantile regression relating N to Hd. Results and discussion: Efficient goodness-of-fit statistics were reported for the fitted models, in terms of Root Mean Square Error (2.29) and coefficient of determination (0.86). The DMD suggests applying thinnings below the maximum density line to avoid mortality. Through the DMD it is possible to evaluate different silvicultural alternatives, schedule thinnings, maximize growth space, promote tree growth and improve forest products. Conclusion: The DMD developed is useful for thinning scheduling to obtain saw timber at rotation age.
- Published
- 2018
10. Statistical modeling of the novel COVID-19 epidemic in Iraq
- Author
-
Al-Ani Ban Ghanim
- Subjects
Mean squared error, Epidemiology, Applied Mathematics, Gompertz function, Normal distribution, Non-linear least squares, Statistics, Range (statistics), Simple linear regression, Nonlinear regression, Mathematics, Weibull distribution - Abstract
Objectives: This study aimed to apply three of the most important nonlinear growth models (Gompertz, Richards, and Weibull) to the daily cumulative number of COVID-19 cases in Iraq during the period from the 13th of March, 2020 to the 22nd of July, 2020. Methods: Using the nonlinear least squares method, the three growth models were estimated and some related measures were calculated using the “nonlinear regression” tool available in Minitab-17; the initial values of the parameters were deduced from the transformation to the simple linear regression equation. The models were compared using several statistics (F-test, AIC, BIC, AICc and WIC). Results: The results indicate that the Weibull model is the most adequate model for describing the cumulative daily number of COVID-19 cases in Iraq, according to criteria such as having the highest F and the lowest values for RMSE, bias, MAE, AIC, BIC, AICc and WIC, with no violations of the assumptions on the model’s residuals (independence, normality and homogeneity of variance). The overall model test and tests of the estimated parameters showed that the Weibull model was statistically significant for describing the study data. Conclusions: From the Weibull model predictions, the number of cumulative confirmed cases of the novel coronavirus in Iraq will increase by a range of 101,396 (95% PI: 99,989 to 102,923) to 114,907 (95% PI: 112,251 to 117,566) in the next 24 days (23rd of July to 15th of August, 2020). From the inflection point of the Weibull curve, the peak date, when the growth rate is maximum, is the 7th of July, 2020, at which time the daily cumulative cases reach 67,338.
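A hedged sketch of the model comparison described above: Gompertz, Richards, and Weibull-type growth curves are fitted by nonlinear least squares to a synthetic cumulative series and ranked by a Gaussian-likelihood AIC. The data, starting values, and parameterizations are illustrative assumptions, not those of the study.

```python
# Hedged sketch: comparing Gompertz, Richards, and Weibull growth curves for a
# cumulative epidemic series via nonlinear least squares and AIC.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    return a * np.exp(-b * np.exp(-c * t))

def richards(t, a, k, q, nu):
    return a / (1.0 + q * np.exp(-k * t)) ** (1.0 / nu)

def weibull_growth(t, a, b, c):
    return a * (1.0 - np.exp(-(t / b) ** c))

t = np.arange(1, 130, dtype=float)
rng = np.random.default_rng(1)
y = weibull_growth(t, 100000.0, 80.0, 3.0) + rng.normal(0, 500.0, t.size)

models = {
    "Gompertz": (gompertz, [1e5, 10.0, 0.05]),
    "Richards": (richards, [1e5, 0.08, 50.0, 1.0]),
    "Weibull":  (weibull_growth, [1e5, 70.0, 2.0]),
}

def aic(rss, n, k):
    # Gaussian-likelihood AIC; k counts the curve parameters plus the error variance.
    return n * np.log(rss / n) + 2 * (k + 1)

for name, (f, p0) in models.items():
    try:
        p, _ = curve_fit(f, t, y, p0=p0, maxfev=20000)
    except RuntimeError:
        print(f"{name:8s} did not converge from this starting point")
        continue
    rss = np.sum((y - f(t, *p)) ** 2)
    print(f"{name:8s} AIC = {aic(rss, t.size, len(p)):10.1f}")
```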
- Published
- 2021
11. Prediction of Individual Tree Diameter and Height to Crown Base Using Nonlinear Simultaneous Regression and Airborne LiDAR Data
- Author
-
Guangxing Wang, Ram P. Sharma, Huiru Zhang, Liyong Fu, Zhaohui Yang, Qiaolin Ye, Qingwang Liu, Peng Luo, and Guangshuang Duan
- Subjects
Observational error, leave-one-out cross-validation, Diameter at breast height, error-in-variable modeling, Seemingly unrelated regressions, Regression, Cross-validation, Simultaneous equations, Picea crassifolia Kom, Non-linear least squares, Statistics, compatible equation, General Earth and Planetary Sciences, nonlinear seemingly unrelated regression, Predictive modelling, Mathematics - Abstract
The forest growth and yield models that are used as important decision-support tools in forest management are commonly based on individual tree characteristics, such as diameter at breast height (DBH), crown ratio, and height to crown base (HCB). Taking direct measurements of DBH and HCB through ground-based methods is cumbersome and costly. Such information can be obtained indirectly from remote sensing databases, which can be used to build DBH and HCB prediction models. The DBH and HCB of the same trees are significantly correlated, and so their inherent correlations need to be appropriately accounted for in the DBH and HCB models. However, all the existing DBH and HCB models, including models based on light detection and ranging (LiDAR), have ignored such correlations and thus failed to account for the compatibility of DBH and HCB estimates, in addition to disregarding measurement errors. To address these problems, we developed a compatible simultaneous equation system of DBH and HCB error-in-variable (EIV) models using LiDAR-derived data and ground measurements for 510 Picea crassifolia Kom trees in northwest China. Four versatile algorithms, namely nonlinear seemingly unrelated regression (NSUR), two-stage least squares (2SLS) regression, three-stage least squares (3SLS) regression, and full information maximum likelihood (FIML), were evaluated for their estimating efficiencies and precisions for a simultaneous equation system of DBH and HCB EIV models. In addition, two other model structures, namely nonlinear least squares with HCB estimation not based on the DBH (NLS and NBD) and nonlinear least squares with HCB estimation based on the DBH (NLS and BD), were also developed, and their fitting precisions were compared with those of the simultaneous equation system. The leave-one-out cross-validation method was applied to evaluate all estimating algorithms and their resulting models. We found that only the simultaneous equation system could illustrate the effect of errors associated with the regressors on the response variables (DBH and HCB) and guarantee the compatibility between the DBH and HCB models at an individual level. In addition, such an established system also effectively accounted for the inherent correlations between DBH and HCB. However, neither the NLS and BD model nor the NLS and NBD model showed these properties. The precision of the simultaneous equation system developed using NSUR appeared to be the best among all the evaluated algorithms. Our equation system does not require stand-level information as input, but it does require information on tree height, crown width, and crown projection area, all of which can be readily derived from LiDAR imagery using delineation algorithms together with ground-based DBH measurements. Our results indicate that NSUR is a more reliable and quicker algorithm for developing DBH and HCB models using large-scale LiDAR-based datasets. The novelty of this study is that the compatibility problem of the DBH model and the HCB EIV model was properly addressed, and the potential algorithms were compared to choose the most suitable one (NSUR). The presented method and algorithm will be useful for establishing similar compatible equation systems of tree DBH and HCB EIV models for other tree species.
- Published
- 2020
12. Gilliland's Correlation: A Case Study in Regression Analysis
- Author
-
Richard A. Davis
- Subjects
Correlation, Fractionating column, General Chemical Engineering, Non-linear least squares, Statistics, Regression analysis, General Chemistry, Studentized residual, Regression, Mathematics - Abstract
A case study of regression analysis based on modeling Gilliland's correlation was described for use in a computational methods course. The case study uses a familiar example to train students in nonlinear least squares regression and in using standardized residual plots for model assessment. Previously published equations for Gilliland's correlation were reviewed and improved by refitting Gilliland's data using nonlinear least squares regression. A new two-parameter rational equation was found to be superior to previously reported Gilliland equations as the only model that meets all the theoretical end conditions without compromise. The new and improved equation for Gilliland's correlation is recommended for preliminary shortcut methods of distillation column design and analysis.
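The following sketch illustrates the general workflow described above (a nonlinear least squares fit of a two-parameter rational form, followed by standardized residuals). The functional form and the (X, Y) points are placeholders chosen only to satisfy the end conditions Y(0) = 1 and Y(1) = 0; they are not Gilliland's published data or the paper's recommended equation.

```python
# Hedged sketch: refitting a Gilliland-type correlation Y = f(X) with a
# two-parameter rational function and inspecting standardized residuals.
import numpy as np
from scipy.optimize import curve_fit

def rational(x, a, b):
    # Satisfies the end conditions Y(0) = 1 and Y(1) = 0 by construction.
    return (1.0 - x) / (1.0 + a * x + b * x * x)

# Placeholder correlation points in the unit square (X, Y).
X = np.array([0.01, 0.05, 0.10, 0.20, 0.30, 0.40, 0.50, 0.60, 0.70, 0.80, 0.90])
Y = np.array([0.93, 0.82, 0.72, 0.58, 0.48, 0.40, 0.32, 0.25, 0.18, 0.11, 0.05])

params, cov = curve_fit(rational, X, Y, p0=[1.0, 1.0])
resid = Y - rational(X, *params)

# Standardized residuals: residual divided by the estimated residual std. dev.
dof = len(X) - len(params)
s = np.sqrt(np.sum(resid**2) / dof)
std_resid = resid / s

print("fitted a, b:", np.round(params, 4))
print("standardized residuals:", np.round(std_resid, 2))
```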
- Published
- 2020
13. Uncertainty in multi-scale fatigue life modeling and a new approach to estimating frequency of in-service inspection of aging components
- Author
-
Jeffrey T. Fong, James J. Filliben, Stephen W. Freiman, and N. Alan Heckert
- Subjects
Computer science, Mechanical Engineering, Prediction interval, Condensed Matter Physics, Fatigue limit, Plot (graphics), Mechanics of Materials, Component (UML), Non-linear least squares, Statistics, Tolerance interval, Statistical theory, Uncertainty quantification - Abstract
Uncertainty in modeling the fatigue life of a full-scale component using experimental data at microscopic (Level 1), specimen (Level 2), and full-size (Level 3) scales is addressed by applying the statistical theory of prediction intervals, and that of tolerance intervals based on the concept of coverage, p. Using a nonlinear least squares fit algorithm and the physical assumption that the one-sided Lower Tolerance Limit (LTL), at the 95% confidence level, of the fatigue life, i.e., the minimum cycles-to-failure, minNf, of a full-scale component, cannot be negative as the lack, or “Failure”, of coverage (Fp), defined as 1 − p, approaches zero, we develop a new fatigue life model in which the minimum cycles-to-failure, minNf, at extremely low “Failure” of coverage, Fp, can be estimated. Since the concept of coverage is closely related to that of an inspection strategy, and if one assumes that the predominant cause of failure of a full-size component is the “Failure” of inspection or coverage, it is reasonable to equate the quantity Fp to a Failure Probability, FP, thereby leading to a new approach to estimating the frequency of in-service inspection of a full-size component. To illustrate this approach, we include a numerical example using the published data on the fatigue of an AISI 4340 steel (N.E. Dowling, Journal of Testing and Evaluation, ASTM, Vol. 1(4) (1973), 271–287) and a linear least squares fit to generate the necessary uncertainties for performing a dynamic risk analysis, where a graphical plot of an estimate of risk with uncertainty vs. a predicted most likely date of a high-consequence failure event becomes available. In addition, a nonlinear least squares logistic function fit of the fatigue data yields a prediction of the statistical distribution of both the ultimate strength and the endurance limit.
- Published
- 2018
14. Segmented concave least squares: A nonparametric piecewise linear regression
- Author
-
Abolfazl Keshvari
- Subjects
Concave least squares, Information Systems and Management, General Computer Science, Hedonic pricing, Generalized least squares, Management Science and Operations Research, Decision analysis, Least squares, Industrial and Manufacturing Engineering, Piecewise linear function, Statistics, Econometrics, Segmented regression, Total least squares, Finland, Mathematics, Ordinary least squares, Modeling and Simulation, Non-linear least squares, Simple linear regression - Abstract
In this paper, segmented concave least squares (SCLS) is introduced. SCLS is a nonparametric piecewise linear regression problem in which the estimated function is (monotonic) concave and the number of linear segments (k) is pre-specified. Ordinary least squares (k = 1) and concave least squares (k = n, the number of observations) are two extreme cases of this problem. An application of SCLS is to estimate a hedonic function. Using this method, observations are categorized into k groups and a piecewise linear hedonic function is estimated such that there is one linear segment for every group. The estimated hedonic function holds the principle of diminishing marginal utility. In this paper, SCLS is used to categorize hotels in Finland into three groups. A trade-off between the number of groups and the goodness of fit measure is used to determine the number of groups. Based on the similarities of the pricing methods, hotels in the sample are endogenously classified and the shadow prices for each group are calculated. The results reveal that the hotels do not value hotel attributes similarly and there are significant differences among groups. Hedonic pricing model via SCLS provides a novel categorization of hotels that cannot be obtained by using ordinary least squares.
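For the k = n extreme case mentioned above (concave least squares), a minimal sketch of the usual quadratic-programming formulation with Afriat-type concavity constraints is given below, assuming the cvxpy package and synthetic data; the fixed-k SCLS problem itself requires additional combinatorial structure that is not reproduced here.

```python
# Hedged sketch: concave least squares (the k = n case) as a quadratic program.
# Each observation gets its own supporting line; the Afriat-type constraints
# force the fitted values to lie on a concave, piecewise linear envelope.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n = 40
x = np.sort(rng.uniform(0.5, 5.0, n))
y = 3.0 * np.log(x) + rng.normal(0.0, 0.3, n)   # concave signal plus noise

theta = cp.Variable(n)   # fitted values
beta = cp.Variable(n)    # slopes of the supporting lines

constraints = []
for i in range(n):
    for j in range(n):
        # Every supporting line j must lie on or above the fitted value at x_i.
        constraints.append(theta[j] + beta[j] * (x[i] - x[j]) >= theta[i])
constraints.append(beta >= 0)   # optional: monotonically increasing concave fit

prob = cp.Problem(cp.Minimize(cp.sum_squares(y - theta)), constraints)
prob.solve()

print("objective (SSE):", round(prob.value, 3))
print("first few fitted values:", np.round(theta.value[:5], 3))
```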
- Published
- 2018
15. Height prediction in Brazilian plantations of Khaya ivorensis
- Author
-
Antonio Carlos Ferraz-Filho, José Roberto Soares-Scolforo, and Andressa Ribeiro
- Subjects
Heteroscedasticity, Forest inventory, Data collection, Forest management, Forestry, Plot (graphics), African mahogany, Tree (data structure), Variable (computer science), Non-linear least squares, Statistics, statistical modelling, Mathematics - Abstract
Tree height measurement is one of the most difficult activities in forest inventory data gathering, although height is a fundamental variable to support forest management, since it is an input for modelling growth and yield. To overcome this obstacle and ensure that the heights of trees are estimated accurately, hypsometric relationships are used. Therefore, the objective of this study was to compare different fitting strategies (i.e., nonlinear least squares and mixed-effects) to predict tree height in African mahogany (Khaya ivorensis) Brazilian plantations using well-known local (using only tree height and diameter) and generalized (using height, diameter and plot-level variables) models. Data were gathered on 149 permanent plots sampled in different Brazilian regions and at different ages, totaling 4,201 height-diameter pairs. Different models were evaluated and the best method to estimate the height-diameter relationship was selected based on statistical and graphical criteria. A local model using mixed effects with correction for heteroscedasticity was efficient and superior to the other models evaluated. However, when using an independent database, the generalized model fitted by nonlinear least squares generates adequate results that are scaled to the plots' productivity, since the inclusion of dominant height in the model helps to predict height locally.
- Published
- 2018
16. Evaluation of four regression techniques for stem taper modeling of Dahurian larch (Larix gmelinii) in Northeastern China
- Author
-
Amna Hussain, Fengri Li, Pei He, Lichun Jiang, and Muhammad Khurram Shahzad
- Subjects
Larix gmelinii, Box plot, Biology, Calibration (statistics), Generalized additive model, Forestry, Management, Monitoring, Policy and Law, Regression, Non-linear least squares, Forest ecology, Statistics, Larch, Nature and Landscape Conservation, Mathematics - Abstract
Estimating stem volume and biomass in forests is fundamental to both economic and ecological assessments of forest ecosystem structure and function. Stem taper models were widely used to calculate stem volume and biomass, but it can be challenging to get an accurate and convenient technique for taper models in practice. This study evaluated ordinary nonlinear least squares (ONLS), fixed-effects model (FIXED), quantile regression (QR), and generalized additive model (GAM) to predict tree diameter, volume, and merchantable height of Dahurian larch (Larix gmelinii) in Northeastern China. As far as we know, a comprehensive analysis of these four techniques is limited for taper data. Therefore, our main objectives were to compare these four techniques at an equitable level without calibration and select a single and widely applied technique for the taper model. The dataset comprising 1372 felled-trees from Dahurian larch natural forest were used to evaluate the techniques with a leave-one-out cross-validation approach. Evaluation statistics and box plots showed that the GAM performed better than other techniques for stem profile description and volume estimation. Results also revealed that all techniques had a bias in estimating merchantable height. However, this limitation does not significantly affect the overall performance and applied use of the GAM for diameter and volume prediction. When intuitive interpretations are not needed, the GAM can serve as an accurate and convenient technique for prediction.
- Published
- 2021
17. Implementation of reduced-order physics-based model and multi-parameters identification strategy for lithium-ion battery
- Author
-
Lin Yang, Xiaowei Zhao, Hao Deng, Yishan Cai, and Zhongwei Deng
- Subjects
Physics, Battery (electricity), Computational complexity theory, Mechanical Engineering, Building and Construction, Pollution, Industrial and Manufacturing Engineering, General Energy, Non-linear least squares, Statistics, Equivalent circuit, Electrical and Electronic Engineering, Fisher information, Algorithm, Condition number, Civil and Structural Engineering, Confidence region, Voltage - Abstract
Physics-based models for lithium-ion batteries have been regarded as a promising alternative to equivalent circuit models due to their ability to describe the internal electrochemical states of the battery. However, the huge computational burden and numerous parameters of these models impede their application in embedded battery management systems. To deal with this problem, a reduced-order physics-based model for lithium-ion batteries with a better tradeoff between model fidelity and computational complexity is developed. A strategy is proposed to extend the operation from a fixed point to the full state-of-charge range. As the model consists of constant, varying, identifiable and unidentifiable parameters, it is impractical to identify the full set of parameters using only the current-voltage data. To sort out the identifiable parameters, a criterion based on calculating the determinant and condition number of the Fisher information matrix (FIM) is employed. A subset of at most nine identifiable parameters is obtained and then identified by a nonlinear least squares regression algorithm, with the confidence region calculated from the FIM. Compared with the outputs from commercial software, the effectiveness of the battery model and the extension strategy is verified. The estimated parameters deviate from the true values only slightly and produce small voltage errors at different current profiles.
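The identifiability screen described above can be illustrated generically: the sketch below builds a Fisher information matrix from finite-difference sensitivities of a toy two-exponential output model and reports its determinant and condition number. The model, noise level, and parameter values are assumptions for demonstration, not the paper's reduced-order battery model.

```python
# Hedged sketch: screening parameter identifiability with the Fisher
# information matrix (FIM). For i.i.d. Gaussian noise, FIM = J.T @ J / sigma^2,
# where J holds the output sensitivities with respect to the parameters.
import numpy as np

def model_output(params, t):
    # Toy "voltage" response with four parameters (illustrative only).
    a, b, c, d = params
    return a * np.exp(-b * t) + c * np.exp(-d * t)

def sensitivity_jacobian(params, t, rel_step=1e-6):
    base = model_output(params, t)
    J = np.zeros((t.size, len(params)))
    for k, p in enumerate(params):
        dp = rel_step * max(abs(p), 1e-12)
        shifted = np.array(params, dtype=float)
        shifted[k] += dp
        J[:, k] = (model_output(shifted, t) - base) / dp
    return J

t = np.linspace(0.0, 50.0, 200)
params = np.array([1.0, 0.30, 0.50, 0.28])   # b and d nearly equal -> weak identifiability
sigma = 0.01                                  # assumed measurement noise std

J = sensitivity_jacobian(params, t)
fim = J.T @ J / sigma**2

print("det(FIM) :", np.linalg.det(fim))
print("cond(FIM):", np.linalg.cond(fim))
# A huge condition number (or near-zero determinant) flags parameter subsets
# that cannot be reliably identified from current-voltage-style data alone.
```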
- Published
- 2017
18. A comparison of Bayesian and classical methods for parameter estimation in greenhouse crop models
- Author
-
R. Salazar-Moreno, E. Fitz-Rodríguez, Agustín Ruiz-García, Irineo L. López-Cruz, and Abraham Rojano-Aguilar
- Subjects
Metropolis–Hastings algorithm, Estimation theory, Non-linear least squares, Bayesian probability, Statistics, Greenhouse, Horticulture, Greenhouse crops, Least squares, Importance sampling, Mathematics - Published
- 2017
19. MULTIVARIATE THREE-STAGE LEAST SQUARES FIXED EFFECT PANEL SIMULTANEOUS MODELS AND ESTIMATION OF THEIR PARAMETERS
- Author
-
I Wayan Mangku, Abuzar Asra, Hermanto Siregar, I Made Sumertajaya, and Timbang Sirait
- Subjects
Estimation, Multivariate statistics, General Mathematics, Fixed effects model, Generalized least squares, Least squares, Simultaneous equations model, Non-linear least squares, Statistics, Total least squares, Mathematics - Published
- 2017
20. Prediction from Uncertain Inputs for Partial Least Squares Regression
- Author
-
Jianghong Ren, Zehui Chen, Weiguo Zhang, and Shaowei Gong
- Subjects
Residual sum of squares, Control and Systems Engineering, Non-linear least squares, Statistics, Partial least squares regression, Econometrics, Explained sum of squares, Least trimmed squares, Generalized least squares, Simple linear regression, Total least squares, Mathematics - Published
- 2017
21. Optical Flow Estimation Using Total Least Squares Variants
- Author
-
Maria De Jesus and Vania V. Estrela
- Subjects
Total sum of squares, Computer science, Least trimmed squares, Generalized least squares, Iteratively reweighted least squares, Optical flow estimation, Residual sum of squares, Non-linear least squares, Statistics, Applied mathematics, Total least squares - Abstract
The problem of recursively approximating motion resulting from the Optical Flow (OF) in video through Total Least Squares (TLS) techniques is addressed. The TLS method solves an inconsistent system Gu = z, with both G and z in error due to temporal/spatial derivatives and nonlinearity, while the Ordinary Least Squares (OLS) model has noise only in z. Sources of difficulty involve the non-stationarity of the field, the ill-posedness, and the existence of noise in the data. Three ways of applying TLS, with different noise assumptions, to the end problem are examined. First, the classical TLS (cTLS) is introduced, where the entries of the error matrices of each row of the augmented matrix [G;z] have zero mean and the same standard deviation. Next, the Generalized Total Least Squares (GTLS) is defined to provide a more stable solution, but it still has some problems. The Generalized Scaled TLS (GSTLS) has G and z tainted by different sources of additive zero-mean Gaussian noise, and scaling [G;z] by nonsingular D and E, that is, D[G;z]E, makes the errors iid with zero mean and a diagonal covariance matrix. The scaling is computed from some knowledge of the error distribution to improve the GTLS estimate. For moderate levels of additive noise, GSTLS outperforms the OLS and GTLS approaches. Although any TLS variant requires more computations than OLS, it is still applicable with proper scaling of the data matrix.
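A minimal sketch of the classical TLS step described above, under assumed synthetic data: the solution of Gu ≈ z is read off the right singular vector of the augmented matrix [G, z] and compared with OLS. The GTLS/GSTLS scaling variants are not reproduced here.

```python
# Hedged sketch: classical TLS for an overdetermined system G u ~ z via the SVD
# of the augmented matrix [G, z], compared with the OLS solution.
import numpy as np

rng = np.random.default_rng(2)
m, n = 200, 2
u_true = np.array([0.7, -1.3])                # "true" optical-flow-like vector

G_clean = rng.normal(size=(m, n))
z_clean = G_clean @ u_true
G = G_clean + 0.05 * rng.normal(size=(m, n))  # errors in the data matrix
z = z_clean + 0.05 * rng.normal(size=m)       # and in the observations

# OLS: noise assumed only in z.
u_ols, *_ = np.linalg.lstsq(G, z, rcond=None)

# Classical TLS: take the right singular vector of [G, z] associated with the
# smallest singular value and rescale so its last component becomes -1.
A = np.column_stack([G, z])
_, _, Vt = np.linalg.svd(A)
v = Vt[-1]
u_tls = -v[:n] / v[n]

print("true:", u_true)
print("OLS :", np.round(u_ols, 3))
print("TLS :", np.round(u_tls, 3))
```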
- Published
- 2017
22. Comparing cross-country estimates of Lorenz curves using a Dirichlet distribution across estimators and datasets
- Author
-
Andrew C. Chang, Phillip Li, and Shawn M. Martin
- Subjects
Economics and Econometrics, Gini coefficient, Estimator, Economic statistics, Dirichlet distribution, Income distribution, Generalized Dirichlet distribution, Non-linear least squares, Statistics, Econometrics, Lorenz curve, Social Sciences (miscellaneous), Mathematics - Abstract
Summary Chotikapanich and Griffiths (Journal of Business and Economic Statistics, 2002, 20(2), 290–295) introduced the Dirichlet distribution to the estimation of Lorenz curves. This distribution naturally accommodates the proportional nature of income share data and the dependence structure between the shares. Chotikapanich and Griffiths fit a family of five Lorenz curves to one year of Swedish and Brazilian income share data using unconstrained maximum likelihood and unconstrained nonlinear least squares. We attempt to replicate the authors' results and extend their analyses using both constrained estimation techniques and five additional years of data. We successfully replicate a majority of the authors' results and find that some of their main qualitative conclusions also hold using our constrained estimators and additional data.
- Published
- 2017
23. Allometric equations for estimating aboveground biomass for common shrubs in northeastern California
- Author
-
Steve Huff, Hailemariam Temesgen, and Martin W. Ritchie
- Subjects
Biomass (ecology), Ecology, Crown (botany), Tree allometry, Forestry, Management, Monitoring, Policy and Law, Seemingly unrelated regressions, Shrub, Regression, Non-linear least squares, Covariate, Statistics, Nature and Landscape Conservation, Mathematics - Abstract
Selected allometric equations and fitting strategies were evaluated for their predictive abilities for estimating above ground biomass for seven species of shrubs common to northeastern California. Size classes for woody biomass were categorized as 1-h fuels (0.1–0.6 cm), 10-h fuels (0.6–2.5 cm), 100-h fuels (2.5–7.6 cm), and 1000-h fuels (greater than 7.7 cm in diameter). Three fitting strategies were evaluated - weighted nonlinear least squares regression (WNLS), seemingly unrelated regression (SUR), and multinomial log-linear regression (MLR) - to estimate individual shrub biomass as a function of crown area. The inclusion of the shrub height as a covariate did not increase the accuracy of prediction for all species. When MLR was used, on the average, RMSE values were reduced by 23.1% for the 1-h component, by 23.9% for the 10-h component, and by 45.6% for the leaf component for serviceberry when compared to SUR. Based on the residual plots and cross-validation fit statistics, MLR is recommended for estimating AGB for seven major shrub species in California. The equation coefficients are documented for future use.
- Published
- 2017
24. Linear-representation Based Estimation of Stochastic Volatility Models.
- Author
-
Christian Francq and Jean-Michel Zakoïan
- Subjects
Stochastic processes, Monte Carlo method, Probability theory, Statistical sampling, Linear statistical models, Statistics - Abstract
A new way of estimating stochastic volatility models is developed. The method is based on the existence of autoregressive moving average (ARMA) representations for powers of the log-squared observations. These representations allow one to build a criterion obtained by weighting the sums of squared innovations corresponding to the different ARMA models. The estimator obtained by minimizing the criterion with respect to the parameters of interest is shown to be consistent and asymptotically normal. Monte Carlo experiments illustrate the finite-sample properties of the estimator. The method has potential applications to other non-linear time-series models.
- Published
- 2006
25. The least squares estimator of random variables under sublinear expectations
- Author
-
Shaolin Ji and Chuanfeng Sun
- Subjects
Applied Mathematics, James–Stein estimator, Explained sum of squares, Generalized least squares, Newey–West estimator, Least squares, Non-linear least squares, Statistics, Ordinary least squares, Total least squares, Analysis, Mathematics - Abstract
In this paper, we study the least squares estimator for sublinear expectations. Under some mild assumptions, we prove the existence and uniqueness of the least squares estimator. The relationship between the least squares estimator and the conditional coherent risk measure (resp. the conditional g-expectation) is also explored. Then some characterizations of the least squares estimator are given.
- Published
- 2017
26. Power quality prediction based on least squares method
- Author
-
Mingxuan Lu
- Subjects
Recursive least squares filter, Computer science, Materials Science (miscellaneous), Explained sum of squares, Least trimmed squares, Generalized least squares, Industrial and Manufacturing Engineering, Iteratively reweighted least squares, Residual sum of squares, Non-linear least squares, Statistics, Business and International Management, Total least squares - Abstract
With electric power products now in widespread use, power quality abnormalities in production occur more and more often, so predicting power quality is of great significance. Prediction generally amounts to finding a fitting function, and least squares fitting is the method most commonly used to obtain one. In this paper, the least squares method is used to fit power grid data. The factors influencing power quality were analyzed from four aspects, and conclusions were obtained by fitting real data points, with the aim of further improving power quality prediction.
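As a generic illustration of least squares fitting used for prediction (not the paper's data, model order, or the four analyzed factors), a short sketch:

```python
# Hedged sketch: ordinary least squares curve fitting used for prediction.
# The synthetic "power quality index" series and the quadratic trend are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(3)
days = np.arange(1, 31, dtype=float)
quality = 0.95 - 0.002 * days + 0.00004 * days**2 + rng.normal(0, 0.004, days.size)

# Fit a quadratic trend by least squares and predict the next five days.
coeffs = np.polyfit(days, quality, deg=2)
trend = np.poly1d(coeffs)

future = np.arange(31, 36, dtype=float)
print("fitted coefficients:", np.round(coeffs, 6))
print("predicted quality  :", np.round(trend(future), 4))
```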
- Published
- 2017
27. The asymptotic behaviour of the residual sum of squares in models with multiple break points
- Author
-
Denise R. Osborn, Alastair R. Hall, and Nikolaos Sakkas
- Subjects
Economics and Econometrics, Total sum of squares, Explained sum of squares, Generalized least squares, Residual sum of squares, Non-linear least squares, Statistics, Applied mathematics, Lack-of-fit sum of squares, Total least squares, Partition of sums of squares, Mathematics - Abstract
Models with multiple discrete breaks in parameters are usually estimated via least squares. This paper, first, derives the asymptotic expectation of the residual sum of squares and shows that the number of estimated break points and the number of regression parameters affect the expectation differently. Second, we propose a statistic for testing the joint hypothesis that the breaks occur at specified points in the sample. Our analytical results cover models estimated by the ordinary, nonlinear, and two-stage least squares. An application to U.S. monetary policy rejects the assumption that breaks are associated with changes in the chair of the Fed.
- Published
- 2017
28. A least squares algorithm for fitting data points to a circular arc cam
- Author
-
Shan Lin, Frank Härtig, Otto Jusko, and Jörg Seewig
- Subjects
Engineering, Applied Mathematics, Camshaft, Monte Carlo method, Condensed Matter Physics, Arc (geometry), Discontinuity, Data point, Position (vector), Non-linear least squares, Statistics, Electrical and Electronic Engineering, Instrumentation, Rotation (mathematics), Algorithm - Abstract
Precise evaluation of form error is important for quality control in the manufacture of camshafts. For circular arc cams, a conventional method is to fit each arc segment of the cam individually. In such a case, at the connecting points of two fitted segments, there may be discontinuity or non-smoothness. In this paper, a global cam fitting algorithm based on the nonlinear least squares method is proposed. A circular arc cam is represented by a mathematical function in terms of form, rotation and position parameters. By imposing parameter constraints, a closed and smooth profile can be obtained as the result of fitting. In order to evaluate the performance of the proposed algorithm, the uncertainties of the fitted parameters are estimated by the GUM uncertainty framework and Monte Carlo simulations. Compared with the conventional cam fit, the uncertainties obtained by the proposed algorithm are lower. Additionally, the factors which significantly affect the fitting results are specified.
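A hedged sketch of least squares circular-arc fitting: an algebraic (Kåsa) fit provides starting values, and a geometric nonlinear least squares fit refines the center and radius of a single arc segment. The paper's global cam model, which constrains several arcs into a closed, smooth profile, is more involved and is not reproduced; the data here are synthetic.

```python
# Hedged sketch: fitting one circular arc segment (center, radius) to noisy
# measured points with scipy.optimize.least_squares.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(4)
theta = np.linspace(0.2, 1.8, 60)                      # arc spanning ~92 degrees
xc_true, yc_true, r_true = 10.0, -5.0, 25.0
x = xc_true + r_true * np.cos(theta) + rng.normal(0, 0.02, theta.size)
y = yc_true + r_true * np.sin(theta) + rng.normal(0, 0.02, theta.size)

def residuals(p):
    xc, yc, r = p
    # Distance of each measured point to the candidate circle.
    return np.hypot(x - xc, y - yc) - r

# Algebraic (Kasa) fit by linear least squares for starting values:
# x^2 + y^2 = a*x + b*y + c  =>  xc = a/2, yc = b/2, r = sqrt(c + xc^2 + yc^2).
A = np.column_stack([x, y, np.ones_like(x)])
a, b, c = np.linalg.lstsq(A, x**2 + y**2, rcond=None)[0]
x0 = np.array([a / 2.0, b / 2.0, np.sqrt(c + (a / 2.0)**2 + (b / 2.0)**2)])

fit = least_squares(residuals, x0)

print("estimated center/radius:", np.round(fit.x, 3))
print("form error (max |residual|):", round(np.abs(fit.fun).max(), 4))
```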
- Published
- 2017
29. A comparative simulation study of bayesian fitting approaches to intravoxel incoherent motion modeling in diffusion-weighted MRI
- Author
-
Peter T. While
- Subjects
Estimation theory, Bayesian probability, Estimator, Bayesian inference, Least squares, Approximation error, Non-linear least squares, Statistics, Radiology, Nuclear Medicine and Imaging, Algorithm, Intravoxel incoherent motion, Mathematics - Abstract
Purpose To assess the performance of various least squares and Bayesian modeling approaches to parameter estimation in intravoxel incoherent motion (IVIM) modeling of diffusion-weighted MRI data. Methods Simulated tissue models of different type (breast/liver) and morphology (discrete/continuous) were used to generate noisy data according to the IVIM model at several signal-to-noise ratios. IVIM parameter maps were generated using six different approaches, including full nonlinear least squares (LSQ), segmented least squares (SEG), Bayesian modeling with a Gaussian shrinkage prior (BSP) and Bayesian modeling with a spatial homogeneity prior (FBM), plus two modified approaches. Estimators were compared by calculating the median absolute percentage error and deviation, and median percentage bias. Results The Bayesian modeling approaches consistently outperformed the least squares approaches, with lower relative error and deviation, and provided cleaner parameter maps with reduced erroneous heterogeneity. However, a weakness of the Bayesian approaches was exposed, whereby certain tissue features disappeared completely in regions of high parameter uncertainty. Lower error and deviation were generally afforded by FBM compared with BSP, at the cost of higher bias. Conclusions Bayesian modeling is capable of producing more visually pleasing IVIM parameter maps than least squares approaches, but their potential to mask certain tissue features demands caution during implementation. Magn Reson Med 78:2373–2387, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
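For orientation, a minimal sketch of the two least squares baselines compared in the study, applied to one synthetic voxel of the bi-exponential IVIM signal model: a full four-parameter fit (LSQ) and the common segmented fit (SEG). The b-values, noise level, parameter values, and bounds are illustrative assumptions, and the Bayesian estimators are not reproduced.

```python
# Hedged sketch: IVIM bi-exponential signal model fitted per voxel by full
# nonlinear least squares and by a segmented approach.
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, s0, f, d_star, d):
    return s0 * (f * np.exp(-b * d_star) + (1.0 - f) * np.exp(-b * d))

b = np.array([0, 10, 20, 40, 80, 150, 300, 500, 800], dtype=float)  # s/mm^2
true = dict(s0=1.0, f=0.15, d_star=0.02, d=0.0012)                  # mm^2/s
rng = np.random.default_rng(5)
signal = ivim(b, **true) + rng.normal(0, 0.01, b.size)

# Full NLS fit of all four parameters at once.
p_full, _ = curve_fit(ivim, b, signal, p0=[1.0, 0.1, 0.01, 0.001],
                      bounds=([0, 0, 0.003, 0.0001], [2, 0.5, 0.1, 0.003]))

# Segmented fit: estimate D (and the perfusion fraction) from high b-values
# where the pseudo-diffusion term has decayed, then fit D* with D held fixed.
high = b >= 200
slope, intercept = np.polyfit(b[high], np.log(signal[high]), 1)
d_seg = -slope
f_seg = 1.0 - np.exp(intercept) / signal[0]
p_pseudo, _ = curve_fit(lambda bb, d_star: ivim(bb, signal[0], f_seg, d_star, d_seg),
                        b, signal, p0=[0.01])

print("true      :", true)
print("full NLS  : s0=%.3f f=%.3f D*=%.4f D=%.5f" % tuple(p_full))
print("segmented : f=%.3f D*=%.4f D=%.5f" % (f_seg, p_pseudo[0], d_seg))
```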
- Published
- 2017
30. Expected predictive least squares for model selection in covariance structures
- Author
-
Haruhiko Ogasawara
- Subjects
Statistics and Probability, Numerical Analysis, Generalized least squares, Least squares, Iteratively reweighted least squares, Residual sum of squares, Non-linear least squares, Partial least squares regression, Statistics, Lack-of-fit sum of squares, Statistics, Probability and Uncertainty, Total least squares, Mathematics - Abstract
Predictive least squares (PLS) using future data to be predicted by current data are defined in covariance structure analysis. The expected predictive least squares (EPLS) obtained by two-fold expectation of PLS are unknown fit indexes. Using the asymptotic biases of weighted least squares given by current data for estimation of EPLS in covariance structures, corrected least square criteria derived similarly to the Takeuchi information criterion are shown to be asymptotically unbiased under arbitrary distributions. Simulations for model selection in exploratory factor analysis show improvements over typical current fit indexes as RMSEA and AIC.
- Published
- 2017
31. Bias Compensation for Rational Function Model Based on Total Least Squares
- Author
-
Ting Jiang, Anzhu Yu, Gangwu Jiang, Xiangpo Wei, Wenyue Guo, and Yi Zhang
- Subjects
Explained sum of squares, Generalized least squares, Rational function, Least squares, Computer Science Applications, Polynomial and rational function modeling, Non-linear least squares, Statistics, Earth and Planetary Sciences (miscellaneous), Errors-in-variables models, Computers in Earth Sciences, Total least squares, Engineering (miscellaneous), Mathematics - Published
- 2017
32. Simultaneous estimation of number of signals and signal parameters of superimposed sinusoidal model: A robust sequential bivariate M-periodogram approach
- Author
-
Sanket Bose, Sharmishtha Mitra, and Amit Mitra
- Subjects
Statistics and Probability, Sequential estimation, Sinusoidal model, Bivariate analysis, M-estimator, Signal, Modeling and Simulation, Non-linear least squares, Statistics, Time series, Digital signal processing, Mathematics - Abstract
Accurate estimation of the parameters of superimposed sinusoidal signals is an important problem in digital signal processing and time series analysis. In this article, we propose a simultaneous estimation procedure for estimation of the number of signals and signal parameters. The proposed sequential method is based on a robust bivariate M-periodogram and uses the orthogonal structure of the superimposed sinusoidal model for sequential estimation. Extensive simulations and data analysis show that the proposed method has a high degree of frequency resolution capability and can provide robust and efficient estimates of the number of signals and signal parameters.
- Published
- 2016
33. Estimation and inference for additive partially nonlinear models
- Author
-
Xiaoshuang Zhou, Zehui Liu, and Peixin Zhao
- Subjects
Statistics and Probability, Linear model, Nonparametric statistics, Estimator, Nonlinear system, Empirical likelihood, Non-linear least squares, Statistics, Applied mathematics, Parametric statistics, Confidence region, Mathematics - Abstract
In this paper, we extend the additive partially linear model to the additive partially nonlinear model, in which the linear part of the additive partially linear model is replaced by a nonlinear function of the covariates. A profile nonlinear least squares estimation procedure for the parameter vector in the nonlinear function and the nonparametric functions of the additive partially nonlinear model is proposed, and the asymptotic properties of the resulting estimators are established. Furthermore, we apply the empirical likelihood method to the additive partially nonlinear model. An empirical likelihood ratio for the parameter vector and a residual-adjusted empirical likelihood ratio for the nonparametric functions are proposed. The Wilks phenomenon is proved and confidence regions for the parametric vector and the nonparametric functions are constructed. Some simulations have been conducted to assess the performance of the proposed estimating procedures. The results demonstrate that both procedures perform well in finite samples. Comparing the results from the empirical likelihood method with those from the profile nonlinear least squares method, the empirical likelihood method performs better in terms of coverage probabilities and average widths of confidence bands.
- Published
- 2016
34. Model-Selection Tests for Complex Survey Samples
- Author
-
Iraj Rahmani and Jeffrey M. Wooldridge
- Subjects
Non-linear least squares, Model selection, Statistics, Test statistic, Survey sampling, Estimator, Cluster sampling, Poisson regression, Statistic, Mathematics - Abstract
We extend Vuong’s (1989) model-selection statistic to allow for complex survey samples. As a further extension, we use an M-estimation setting so that the tests apply to general estimation problems – such as linear and nonlinear least squares, Poisson regression and fractional response models, to name just a few – and not only to maximum likelihood settings. With stratified sampling, we show how the difference in objective functions should be weighted in order to obtain a suitable test statistic. Interestingly, the weights are needed in computing the model-selection statistic even in cases where stratification is appropriately exogenous, in which case the usual unweighted estimators for the parameters are consistent. With cluster samples and panel data, we show how to combine the weighted objective function with a cluster-robust variance estimator in order to expand the scope of the model-selection tests. A small simulation study shows that the weighted test is promising.
- Published
- 2019
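A minimal sketch of a weighted Vuong-type comparison, assuming one already has per-observation objective values (for example, pointwise log likelihoods) from two competing models and sampling weights. The variance below treats observations as independent; the stratification and cluster-robust corrections that are the point of the entry above are not reproduced, and all names and data are hypothetical.

```python
# Weighted Vuong-style statistic: weighted mean difference of per-observation
# objective values, scaled by a weight-robust standard error.
import numpy as np
from scipy.stats import norm

def weighted_vuong(q1, q2, w):
    d = np.asarray(q1) - np.asarray(q2)          # per-observation contrast
    w = np.asarray(w, dtype=float)
    dbar = np.sum(w * d) / np.sum(w)             # weighted mean difference
    # Variance of the weighted mean under independence (no clustering):
    var = np.sum((w * (d - dbar)) ** 2) / np.sum(w) ** 2
    z = dbar / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))                # statistic and two-sided p-value

# Hypothetical use: q1, q2 could be pointwise log likelihoods of two non-nested
# models fitted to the same survey sample; w are the design weights.
rng = np.random.default_rng(1)
q1 = rng.normal(-1.0, 1.0, 300)
q2 = q1 + rng.normal(0.05, 0.5, 300)
w = rng.uniform(0.5, 2.0, 300)
print(weighted_vuong(q1, q2, w))
```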
35. A Parametric Factor Model of the Term Structure of Mortality
- Author
-
Carsten P. T. Rosenskjold and Niels Haldrup
- Subjects
Economics and Econometrics ,Physics::Instrumentation and Detectors ,term structure of mortality ,01 natural sciences ,GeneralLiterature_MISCELLANEOUS ,010104 statistics & probability ,C1 ,0502 economics and business ,Statistics ,ddc:330 ,Range (statistics) ,050207 economics ,0101 mathematics ,GeneralLiterature_REFERENCE(e.g.,dictionaries,encyclopedias,glossaries) ,Parametric statistics ,Mathematics ,cointegration ,State-space representation ,Cointegration ,J10 ,lcsh:HB71-74 ,J11 ,05 social sciences ,mortality forecasting ,lcsh:Economics as a science ,Kalman filter ,factor modelling ,Term (time) ,Non-linear least squares ,G22 ,C22 ,Affine term structure model - Abstract
The prototypical Lee-Carter mortality model is characterized by a single common time factor that loads differently across age groups. In this paper, we propose a parametric factor model for the term structure of mortality in which multiple factors are designed to influence the age groups differently via parametric loading functions. We identify four different factors: a factor common to all age groups, factors for infant and adult mortality, and a factor for the "accident hump" that primarily affects the mortality of relatively young adults and late teenagers. Since the factors are identified via restrictions on the loading functions, they are not designed to be orthogonal but can be dependent and can possibly cointegrate when the factors have unit roots. We suggest two estimation procedures similar to the estimation of the dynamic Nelson-Siegel term structure model. The first is a two-step nonlinear least squares procedure based on cross-section regressions together with a separate model to estimate the dynamics of the factors. The second is a fully specified model estimated by maximum likelihood via the Kalman filter recursions after the model is put in state space form. We demonstrate the methodology for US and French mortality data. We find that the model provides a good fit of the relevant factors and, in a forecast comparison with a range of benchmark models, variants of the parametric factor model show excellent forecast performance, especially at longer horizons.
- Published
- 2019
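The first estimation step described above (cross-section nonlinear least squares with parametric loading functions, repeated over time to build factor series) can be sketched roughly as follows. The three loadings used here, a level, an infant-mortality decay, and a Gaussian accident hump, are illustrative stand-ins rather than the paper's four-factor specification, and the data are synthetic.

```python
# First-step sketch: for each year, regress log mortality on parametric loading
# functions of age by nonlinear least squares and collect the factor values as
# time series. The second step (modelling the factor dynamics) is omitted.
import numpy as np
from scipy.optimize import curve_fit

ages = np.arange(0, 91)

def log_mx(x, level, infant, hump, lam, mu, sigma):
    return (level
            + infant * np.exp(-x / lam)                        # steep decline after infancy
            + hump * np.exp(-0.5 * ((x - mu) / sigma) ** 2))   # young-adult accident hump

# Hypothetical data: log death rates by age for a few years, with slow improvement.
rng = np.random.default_rng(2)
true = dict(level=-9.5, infant=4.0, hump=1.2, lam=2.0, mu=22.0, sigma=6.0)
n_years = 5
data = np.array([log_mx(ages, **true) - 0.02 * yr + rng.normal(0, 0.05, ages.size)
                 for yr in range(n_years)])

factors, p0 = [], [-9.0, 3.0, 1.0, 2.0, 20.0, 5.0]
for year in range(n_years):
    popt, _ = curve_fit(log_mx, ages, data[year], p0=p0, maxfev=20000)
    factors.append(popt[:3])        # keep the level, infant, and hump factor values
    p0 = popt                       # warm-start the next year's cross-section fit
print(np.round(np.array(factors), 3))
```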
36. Mathematical Models for Tumor Growth and the Reduction of Overtreatment
- Author
-
Berit M. Verbist, Jeroen C. Jansen, Jean-Pierre Bayley, Peter Paul G. van Benthem, John-Melle Bokhorst, Lisa M. H. de Pont, Eleonora P M Corssmit, Andel G. L. van der Mey, Berdine L. Heesterman, Frederik J. Hes, and Medical Genetics
- Subjects
Coefficient of determination ,Mathematical model ,business.industry ,mathematics ,growth ,Gompertz function ,Clinical Neurology ,carotid body tumors ,Regression ,paragangliomas ,Exponential function ,03 medical and health sciences ,vagal body ,models ,0302 clinical medicine ,Goodness of fit ,Non-linear least squares ,Statistics ,Medicine ,Neurology (clinical) ,Analysis of variance ,business ,030217 neurology & neurosurgery - Abstract
Background: To improve our understanding of the natural course of head and neck paragangliomas (HNPGL) and ultimately differentiate between cases that benefit from early treatment and those that are best left untreated, we studied the growth dynamics of 77 HNPGL managed with primary observation. Methods: Using digitally available magnetic resonance images, tumor volume was estimated at three time points. Subsequently, nonlinear least squares regression was used to fit seven mathematical models to the observed growth data. Goodness of fit was assessed with the coefficient of determination (R2) and root-mean-squared error. The models were compared with Kruskal-Wallis one-way analysis of variance and subsequent post-hoc tests. In addition, the credibility of predictions (age at onset of neoplastic growth and estimated volume at age 90) was evaluated. Results: Equations generating sigmoidal-shaped growth curves (Gompertz, logistic, Spratt and Bertalanffy) provided a good fit (median R2: 0.996-1.00) and described the observed data better than the linear, exponential, and Mendelsohn equations. Conclusions: Growth of HNPGL is best described by decelerating tumor growth laws, with a preference for the Bertalanffy model. To the best of our knowledge, this is the first time that this often-neglected model has been successfully fitted to clinically obtained growth data.
- Published
- 2019
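A minimal sketch of the growth-law comparison: fit a Gompertz and an exponential curve to a volume series by nonlinear least squares and compare the coefficient of determination. The data are synthetic, not the HNPGL measurements, and only two of the seven models are shown.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, v0, k, alpha):
    # V(t) = K * exp(log(V0/K) * exp(-alpha * t)): decelerating growth towards K
    return k * np.exp(np.log(v0 / k) * np.exp(-alpha * t))

def exponential(t, v0, beta):
    return v0 * np.exp(beta * t)

def r_squared(y, yhat):
    return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - np.mean(y)) ** 2)

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 12)                             # years since first scan (synthetic)
v = gompertz(t, 0.8, 12.0, 0.25) + rng.normal(0, 0.15, t.size)

for name, f, p0 in [("Gompertz", gompertz, [1.0, 10.0, 0.2]),
                    ("exponential", exponential, [1.0, 0.1])]:
    popt, _ = curve_fit(f, t, v, p0=p0, maxfev=20000)
    print(name, "R^2 =", round(r_squared(v, f(t, *popt)), 4))
```

A decelerating law such as the Gompertz curve will typically fit a saturating volume series much better than the exponential, which is the comparison the study formalizes.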
37. Mathematical model of dengue transmission based on daily data in Bandung
- Author
-
Nuning Nuraini, Sapto Wahyu Indratno, and Muhammad Fakhruddin
- Subjects
Transmission (mechanics) ,law ,Computer science ,Stochastic modelling ,Non-linear least squares ,Statistics ,medicine ,Dengue transmission ,medicine.disease ,law.invention ,Dengue fever - Abstract
Dengue is one of the public health issues frequently reported in Bandung. Transmission of the virus is influenced by many factors, such as climate, which affects the life cycle of mosquitoes, and human mobility, since humans carry the virus. An infected individual can die if not treated properly. Here, we study dengue transmission based on daily data from a reputable hospital (Santo Borromeus) in Bandung, collected from January 1, 2015, to December 31, 2016. We used an SIR-SI model to simulate the interaction between humans and mosquitoes. The unobserved parameters and initial conditions in the model are estimated using a nonlinear least squares method. Furthermore, a stochastic model is presented to cover the effect of uncertainties on dengue transmission. In the final part, we present simulations of the deterministic and stochastic models compared with the reported data. The stochastic model captures the dengue data better.
- Published
- 2019
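A compact sketch of fitting a host-vector model by nonlinear least squares: a deterministic SIR (host) / SI (vector) system is solved with an ODE integrator and its transmission parameters are estimated against an observed infected-host series. The model form, parameter names, population sizes, and data are all assumptions for illustration, not the Santo Borromeus records.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

Nh, Nv = 2.5e6, 5.0e6            # host and vector population sizes (assumed)

def rhs(t, y, beta_h, beta_v, gamma, mu_v):
    Sh, Ih, Rh, Sv, Iv = y
    new_h = beta_h * Sh * Iv / Nh          # vector-to-host infections
    new_v = beta_v * Sv * Ih / Nh          # host-to-vector infections
    return [-new_h,
            new_h - gamma * Ih,
            gamma * Ih,
            mu_v * Nv - new_v - mu_v * Sv,
            new_v - mu_v * Iv]

def simulate(params, t_obs, y0):
    sol = solve_ivp(rhs, (t_obs[0], t_obs[-1]), y0, t_eval=t_obs,
                    args=tuple(params), rtol=1e-6)
    return sol.y[1]                         # infected hosts Ih(t)

t_obs = np.arange(0, 120.0)                 # 120 days of synthetic observations
y0 = [Nh - 50, 50, 0, Nv - 100, 100]
true = [0.30, 0.25, 0.10, 0.07]
rng = np.random.default_rng(4)
data = simulate(true, t_obs, y0) * rng.lognormal(0.0, 0.05, t_obs.size)

fit = least_squares(lambda p: simulate(p, t_obs, y0) - data,
                    x0=[0.2, 0.2, 0.12, 0.05], bounds=(1e-4, 2.0))
print("estimated (beta_h, beta_v, gamma, mu_v):", np.round(fit.x, 3))
```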
38. Parameter Estimation and Sensitivity Analysis of Dysentery Diarrhea Epidemic Model
- Author
-
Hailay Weldegiorgis Berhe, Oluwole Daniel Makinde, and David Mwangi Theuri
- Subjects
0303 health sciences ,Effective transmission rate ,Article Subject ,Estimation theory ,Applied Mathematics ,lcsh:Mathematics ,Dysentery ,010103 numerical & computational mathematics ,medicine.disease ,lcsh:QA1-939 ,01 natural sciences ,Stability (probability) ,03 medical and health sciences ,Stability theory ,Non-linear least squares ,Statistics ,medicine ,Sensitivity (control systems) ,0101 mathematics ,Epidemic model ,030304 developmental biology ,Mathematics - Abstract
In this paper, a deterministic compartmental model of dysentery diarrhea is proposed. The local and global stability of the disease-free equilibrium is obtained using the stability theory of differential equations. Numerical simulation of the system shows that the backward bifurcation of the endemic equilibrium exists for R0 > 1. The system is formulated as a standard nonlinear least squares problem to estimate the parameters. The estimated reproduction number, based on the dysentery diarrhea disease data for Ethiopia in 2017, is R0 = 1.1208. This suggests that elimination of the dysentery disease from Ethiopia is not practical. A graphical method is used to validate the model. Sensitivity analysis is carried out to determine the importance of the model parameters in the disease dynamics. It is found that the reproduction number is most sensitive to the effective transmission rate of dysentery diarrhea (βh). It is also demonstrated that control of the effective transmission rate is essential to stop the spread of the disease.
- Published
- 2019
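The sensitivity analysis mentioned above is commonly done with the normalised forward sensitivity index, Upsilon_p = (dR0/dp) * (p / R0). The sketch below computes it by central differences for a generic SIR-type reproduction number; the formula and parameter values are placeholders, not the dysentery model of the paper.

```python
import numpy as np

def R0(params):
    # Generic SIR-type reproduction number, used only to illustrate the index.
    beta, gamma, mu = params["beta"], params["gamma"], params["mu"]
    return beta / (gamma + mu)

def sensitivity_indices(f, params, h=1e-6):
    base = f(params)
    out = {}
    for name, value in params.items():
        up, down = dict(params), dict(params)
        up[name] = value * (1 + h)
        down[name] = value * (1 - h)
        dfdp = (f(up) - f(down)) / (2 * h * value)      # central difference
        out[name] = dfdp * value / base                 # normalised index
    return out

params = {"beta": 0.35, "gamma": 0.20, "mu": 0.05}      # assumed parameter values
print("R0 =", round(R0(params), 3))
for name, idx in sensitivity_indices(R0, params).items():
    print(f"Upsilon_{name} = {idx:+.3f}")
```

A positive index means R0 rises with the parameter (here Upsilon_beta = +1, so a 10% increase in the transmission rate raises R0 by 10%), which is why the transmission rate dominates the ranking.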
39. Height–diameter relationship of trees in Omo strict nature forest reserve, Nigeria
- Author
-
Iveren B. Chenge
- Subjects
Tree (data structure) ,Standard error ,Calibration (statistics) ,Non-linear least squares ,Economics, Econometrics and Finance (miscellaneous) ,Statistics ,Forestry ,Management, Monitoring, Policy and Law ,Akaike information criterion ,Residual ,Power function ,Plot (graphics) ,Mathematics - Abstract
The height and diameter of trees are important variables in estimating aboveground biomass. Tree height measurements are often difficult in tropical forests, so height-diameter models are often used as an alternative for predicting tree heights. This study was carried out to develop height-diameter models for tree species in the Omo strict nature forest reserve in Nigeria. The height and diameter data used for the study comprised 100 tree species, which were classified into three groups using cluster analysis. Eight commonly used nonlinear height-diameter functions were tested for predicting the heights of the species groups using the nonlinear least squares (NLS) method. The power function was selected based on its performance and fitted using a mixed-effects modeling approach to account for between-plot height variability. The mixed-effects models were evaluated using the Akaike information criterion and residual standard error. The models were calibrated using three subsample selection approaches; the best calibration results were obtained using subsamples of four trees from four diameter classes per plot. The best calibrated mixed-effects models outperformed the NLS models for all tree species groups, increasing the accuracy of tree height prediction.
- Published
- 2021
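A minimal sketch of the fixed-effects part of the workflow: fit a power height-diameter function h = 1.3 + a * d^b by nonlinear least squares and report the Akaike information criterion and residual standard error. The mixed-effects and calibration steps are omitted, and the data are synthetic rather than the Omo inventory.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_hd(d, a, b):
    return 1.3 + a * d ** b                 # 1.3 m breast-height offset

rng = np.random.default_rng(5)
d = rng.uniform(5, 80, 200)                               # diameter at breast height (cm)
h = power_hd(d, 1.9, 0.62) * np.exp(rng.normal(0, 0.08, d.size))

popt, _ = curve_fit(power_hd, d, h, p0=[1.0, 0.5])
resid = h - power_hd(d, *popt)
n, k = d.size, len(popt)
rse = np.sqrt(np.sum(resid ** 2) / (n - k))               # residual standard error
aic = n * np.log(np.sum(resid ** 2) / n) + 2 * (k + 1)    # Gaussian AIC up to a constant
print("a, b =", np.round(popt, 3), " RSE =", round(rse, 2), " AIC =", round(aic, 1))
```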
40. Diagnostics of calibration methods: model adequacy of UV-based determinations
- Author
-
Omer Utku Erzengin and A. Hakan Aktaş
- Subjects
Coefficient of determination ,Applied Mathematics ,010401 analytical chemistry ,Deviance (statistics) ,01 natural sciences ,0104 chemical sciences ,Analytical Chemistry ,010104 statistics & probability ,Residual sum of squares ,Non-linear least squares ,Principal component analysis ,Ordinary least squares ,Statistics ,Partial least squares regression ,0101 mathematics ,Algorithm ,Linear least squares ,Mathematics - Abstract
Models such as ordinary least squares, independent component analysis, principal component analysis, partial least squares, and artificial neural networks can be found in the calibration literature. Linear or nonlinear methods can be used to explain the structure of the same phenomenon, and each type of model has its own advantages over the others. These methods are usually grouped taxonomically, but different models can sometimes be applied to the same data set: ordinary least squares and artificial neural networks use completely different analytical procedures yet are occasionally applied to the same data. When comparing methods, the residuals of the models should be examined, because the model with the minimum error is preferred in real analyses. Calibration models, in general, consist of deterministic and stochastic parts; in other words, the data equal the model plus the error. Explaining a model solely using statistics such as the coefficient of determination or its related significance values is sometimes inadequate. The errors of a model, also called its residuals, must have minimum variance compared to the alternatives; additionally, the residuals must be unpredictable, uncorrelated, and symmetric. Under these conditions, the model can be considered adequate. In this study, calibration methods were applied to the raw materials hydrochlorothiazide and amiloride hydrochloride of a drug, as well as to a sample of the drug tablet. The applied chemical procedure was fast, simple, and reproducible. The various linear and nonlinear calibration methods mentioned above were applied, and the adequacy of the calibration methods was compared according to their residuals.
- Published
- 2016
41. A new lifetime distribution for a series-parallel system: properties, applications and estimations under progressive type-II censoring
- Author
-
Alaa H. Abdel-Hamid and Atef F. Hashem
- Subjects
Statistics and Probability ,021103 operations research ,Uniform distribution (continuous) ,Exponential distribution ,Applied Mathematics ,0211 other engineering and technologies ,02 engineering and technology ,Generalized least squares ,Poisson distribution ,01 natural sciences ,Three-point estimation ,Least squares ,010104 statistics & probability ,symbols.namesake ,Modeling and Simulation ,Non-linear least squares ,Statistics ,Maximum a posteriori estimation ,symbols ,Applied mathematics ,0101 mathematics ,Statistics, Probability and Uncertainty ,Mathematics - Abstract
A compound class of zero-truncated Poisson and lifetime distributions is introduced. A special case leads to a new three-parameter distribution, called the doubly Poisson-exponential distribution, which may represent the lifetime of units connected in a series-parallel system. The new distribution can be obtained by compounding two zero-truncated Poisson distributions with an exponential distribution. Among its motivations is that its hazard rate function can take different shapes, such as decreasing, increasing, and upside-down bathtub, depending on the values of its parameters. Several properties of the new distribution are discussed. Based on progressive type-II censoring, six estimation methods [maximum likelihood, moments, least squares, weighted least squares, and Bayes estimation (under linear-exponential and general entropy loss functions)] are used to estimate the involved parameters. The performance of these methods is investigated through a simulation study. The Bayes estimates are obtained...
- Published
- 2016
42. Parameter estimation of varying coefficients structural EV model with time series
- Author
-
Hengjian Cui, Kai Can Li, and Yan Yun Su
- Subjects
Series (mathematics) ,Estimation theory ,Applied Mathematics ,General Mathematics ,Asymptotic distribution ,Estimator ,Generalized least squares ,01 natural sciences ,010305 fluids & plasmas ,Rate of convergence ,Non-linear least squares ,0103 physical sciences ,Statistics ,Errors-in-variables models ,Applied mathematics ,010306 general physics ,Mathematics - Abstract
In this paper, the parameters of a p-dimensional linear structural EV (errors-in-variables) model are estimated when the coefficients vary with a real variable and the model errors form a time series. The adjusted weighted least squares (AWLS) method is used to estimate the parameters. It is shown that the estimators are weakly consistent and asymptotically normal, and the optimal convergence rate is also obtained. A simulation study is undertaken to illustrate that the AWLS estimators perform well.
- Published
- 2016
43. Estimation of the Lomax Distribution in the Presence of Outliers
- Author
-
Mehdi Jabbari Nooghabi
- Subjects
021103 operations research ,0211 other engineering and technologies ,Estimator ,Least trimmed squares ,02 engineering and technology ,Generalized least squares ,M-estimator ,01 natural sciences ,Least squares ,Computer Science Applications ,010104 statistics & probability ,Artificial Intelligence ,Non-linear least squares ,Statistics ,Outlier ,Business, Management and Accounting (miscellaneous) ,Lomax distribution ,0101 mathematics ,Statistics, Probability and Uncertainty ,Mathematics - Abstract
In this paper, we derive the moment, maximum likelihood, least squares, and weighted least squares estimators of the parameters of the Lomax distribution in the presence of outliers. A mixture estimator combining these four methods is also derived. Further, we discuss the efficiency of the estimators. Analyses of a simulated data set and an actual example from an insurance company are presented for illustrative purposes.
- Published
- 2016
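A small sketch of the estimation setting: draw a Lomax (Pareto II) sample, contaminate it with a few gross outliers, and compare maximum likelihood fits on the clean and contaminated data. The moment, (weighted) least squares, and mixture estimators studied in the entry are not reproduced; the sample sizes and contamination level are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
shape, scale = 3.0, 2.0
clean = stats.lomax.rvs(shape, scale=scale, size=500, random_state=rng)
outliers = rng.uniform(40, 80, size=10)                    # a handful of gross outliers
contaminated = np.concatenate([clean, outliers])

for label, sample in [("clean", clean), ("contaminated", contaminated)]:
    c_hat, _, scale_hat = stats.lomax.fit(sample, floc=0)  # MLE with location fixed at 0
    print(f"{label:13s} shape = {c_hat:.2f}  scale = {scale_hat:.2f}")
```

The contaminated fit typically drifts towards a heavier tail, which is the distortion that robust or mixture estimators are designed to limit.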
44. Comparison of linear and nonlinear implementation of the compartmental tissue uptake model for dynamic contrast-enhanced MRI
- Author
-
Julia A. Schnabel, Michael A. Chappell, Jesper F. Kallehauge, Kari Tanderup, Benjamin Irving, and Steven Sourbron
- Subjects
Accuracy and precision ,Speedup ,Computer science ,Linear model ,030218 nuclear medicine & medical imaging ,Upsampling ,03 medical and health sciences ,Noise ,Nonlinear system ,0302 clinical medicine ,Sampling (signal processing) ,030220 oncology & carcinogenesis ,Non-linear least squares ,Statistics ,Radiology, Nuclear Medicine and imaging ,Algorithm - Abstract
Purpose: Fitting tracer kinetic models using linear methods is much faster than using their nonlinear counterparts, although this often comes at the expense of reduced accuracy and precision. The aim of this study was to derive the linear compartmental tissue uptake (CTU) model and to compare its performance with that of the nonlinear version with respect to percentage error and precision. Theory and Methods: The linear and nonlinear CTU models were initially compared using simulations with varying noise and temporal sampling. Subsequently, the clinical applicability of the linear model was demonstrated on 14 patients with locally advanced cervical cancer examined with dynamic contrast-enhanced magnetic resonance imaging. Results: Simulations revealed equal percentage error and precision when noise was within clinically achievable ranges (contrast-to-noise ratio > 10). The linear method was significantly faster than the nonlinear method, with a minimum speedup of around 230 across all tested sampling rates. Clinical analysis revealed that parameters estimated using the linear and nonlinear CTU models were highly correlated (ρ ≥ 0.95). Conclusion: The linear CTU model is computationally more efficient and more stable against temporal downsampling, whereas the nonlinear method is more robust to variations in noise. The two methods may be used interchangeably within clinically achievable ranges of temporal sampling and noise. Magn Reson Med 77:2414-2423, 2017.
- Published
- 2016
45. Parallelizing Nonlinear Least-Squares Regression with Application to Analyses of Microalgae
- Author
-
Robert K. Byrd and Frank Vogt
- Subjects
Mathematical optimization ,Chemistry ,RSS ,010401 analytical chemistry ,Biochemistry (medical) ,Clinical Biochemistry ,Process (computing) ,02 engineering and technology ,computer.file_format ,Parameter space ,021001 nanoscience & nanotechnology ,01 natural sciences ,Biochemistry ,Regression ,0104 chemical sciences ,Analytical Chemistry ,Parallel processing (DSP implementation) ,Residual sum of squares ,Non-linear least squares ,Statistics ,Electrochemistry ,0210 nano-technology ,Nonlinear regression ,computer ,Spectroscopy - Abstract
Nonlinear least-squares regression is a valuable tool for gaining chemical insights into complex systems. Yet the success of nonlinear regression, as measured by the residual sum of squares (RSS), correlation, and reproducibility of fit parameters, strongly depends on the availability of a good initial solution; without one, iterative algorithms quickly become trapped in an unfavorable local RSS minimum. To determine an initial solution, a high-dimensional parameter space needs to be screened, a process that is very time-consuming but can be parallelized. Another advantage of parallelization is equally important: after determining initial solutions, the processors can each be tasked to optimize an initial guess. Even if several of these optimizations become stuck in a shallow local RSS minimum, other processors continue and improve the regression outcome. A software package for parallel-processing-based constrained nonlinear regression (RegressionLab) has been developed, implemented, and tested...
- Published
- 2016
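A generic multi-start sketch in the same spirit: scatter initial guesses over the parameter space, refine each by nonlinear least squares in parallel, and keep the fit with the smallest residual sum of squares. This is not the RegressionLab package; the two-exponential test model, the number of starts, and the worker count are assumptions.

```python
import numpy as np
from multiprocessing import Pool
from scipy.optimize import least_squares

rng = np.random.default_rng(7)
x = np.linspace(0, 10, 200)
# Hypothetical two-component exponential decay (a fit with many local RSS minima).
y = 3.0 * np.exp(-0.4 * x) + 1.0 * np.exp(-2.5 * x) + rng.normal(0, 0.05, x.size)

def residuals(p):
    a1, k1, a2, k2 = p
    return y - (a1 * np.exp(-k1 * x) + a2 * np.exp(-k2 * x))

def refine(p0):
    sol = least_squares(residuals, p0, bounds=(0, 10))   # constrained local refinement
    return np.sum(sol.fun ** 2), sol.x                   # (RSS, fitted parameters)

if __name__ == "__main__":
    starts = [rng.uniform(0.1, 5.0, size=4) for _ in range(32)]   # screened initial guesses
    with Pool(processes=4) as pool:
        results = pool.map(refine, starts)               # refine all starts in parallel
    rss, best = min(results, key=lambda r: r[0])
    print("best RSS:", round(rss, 4), " parameters:", np.round(best, 3))
```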
46. A robust regression based on weighted LSSVM and penalized trimmed squares
- Author
-
Chengqun Fu, Qin Yu, Yong Wang, Jie Guo, and Jianyong Liu
- Subjects
General Mathematics ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Robust statistics ,General Physics and Astronomy ,Least trimmed squares ,02 engineering and technology ,Generalized least squares ,01 natural sciences ,Robust regression ,010104 statistics & probability ,Statistics ,Least squares support vector machine ,0202 electrical engineering, electronic engineering, information engineering ,0101 mathematics ,Total least squares ,Mathematics ,business.industry ,Applied Mathematics ,Statistical and Nonlinear Physics ,Pattern recognition ,Trimmed estimator ,ComputingMethodologies_PATTERNRECOGNITION ,Non-linear least squares ,020201 artificial intelligence & image processing ,Artificial intelligence ,business - Abstract
The least squares support vector machine (LS-SVM) for nonlinear regression is sensitive to outliers. Weighted LS-SVM (WLS-SVM) overcomes this drawback by adding a weight to each training sample. However, as the number of outliers increases, the accuracy of WLS-SVM may decrease. In order to improve the robustness of WLS-SVM, a new robust regression method based on WLS-SVM and penalized trimmed squares (WLSSVM-PTS) has been proposed. The algorithm comprises three main stages. First, initial parameters are obtained by least trimmed squares. Then, the significant outliers are identified and eliminated by the Fast-PTS algorithm. Finally, the remaining samples, which contain few outliers, are fitted by WLS-SVM. Statistical tests of experimental results on numerical and real-world datasets show that the proposed WLSSVM-PTS is significantly more robust than LS-SVM, WLS-SVM, and LSSVM-LTS.
- Published
- 2016
47. Regressor and disturbance have moments of all orders, least squares estimator has none
- Author
-
Zifeng Zhao and Kenneth D. West
- Subjects
Statistics and Probability ,Disturbance (geology) ,Mean squared error ,05 social sciences ,Generalized least squares ,01 natural sciences ,Least squares ,010104 statistics & probability ,Residual sum of squares ,Non-linear least squares ,0502 economics and business ,Statistics ,Ordinary least squares ,0101 mathematics ,Statistics, Probability and Uncertainty ,050205 econometrics ,Mathematics - Abstract
We construct an example in which the least squares estimator has unbounded bias and no moments, even though the regressor and disturbance have moments of all orders and are independently distributed across observations.
- Published
- 2016
48. Trend estimation of multivariate time series with controlled smoothness
- Author
-
L. Leticia Ramirez-Ramirez, Alejandro Islas-Camargo, and Victor M. Guerrero
- Subjects
Statistics and Probability ,Multivariate statistics ,Smoothness (probability theory) ,05 social sciences ,Univariate ,Bivariate analysis ,Generalized least squares ,01 natural sciences ,Least squares ,010104 statistics & probability ,Non-linear least squares ,0502 economics and business ,Statistics ,0101 mathematics ,Smoothing ,050205 econometrics ,Mathematics - Abstract
This paper extends the univariate time series smoothing approach provided by penalized least squares to a multivariate setting, thus allowing for joint estimation of several time series trends. The theoretical results are valid for the general multivariate case, but particular emphasis is placed on the bivariate situation from an applied point of view. The proposal is based on a vector signal-plus-noise representation of the observed data that requires the first two sample moments and the specification of only one smoothing constant. A measure of the amount of smoothness of an estimated trend is introduced so that an analyst can set in advance a desired percentage of smoothness to be achieved by the trend estimate; the required smoothing constant is determined by the chosen percentage of smoothness. Closed-form expressions for the smoothed estimated vector and its variance-covariance matrix are derived from a straightforward application of generalized least squares, thus providing best linear unbiased estimates...
- Published
- 2016
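The univariate building block, penalized least squares smoothing with a single smoothing constant, has the closed form tau = (I + lambda * D2'D2)^(-1) y for a second-difference penalty. The sketch below implements that closed form with sparse matrices; the multivariate extension and the mapping from a desired percentage of smoothness to lambda are not reproduced, and lambda is simply fixed at an assumed value.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def pls_trend(y, lam):
    """Minimise ||y - tau||^2 + lam * ||D2 tau||^2 via its closed-form solution."""
    n = len(y)
    d2 = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))   # second differences
    a = sparse.identity(n, format="csc") + lam * (d2.T @ d2)
    return spsolve(a, y)

rng = np.random.default_rng(8)
t = np.arange(300)
y = 0.02 * t + 2.0 * np.sin(t / 25.0) + rng.normal(0, 0.5, t.size)
trend = pls_trend(y, lam=1600.0)        # lam = 1600 is an assumed, Hodrick-Prescott-style value
print("residual std around trend:", round(np.std(y - trend), 3))
```

Larger values of lambda give smoother trends; the paper's contribution is to choose lambda indirectly through a pre-specified percentage of smoothness, jointly across several series.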
49. Least-squares logistic curves with initial conditions exist
- Author
-
Yves Nievergelt
- Subjects
Statistics and Probability ,010104 statistics & probability ,Non-linear least squares ,Statistics ,Quantitative Biology::Populations and Evolution ,Initial value problem ,010103 numerical & computational mathematics ,0101 mathematics ,Positive data ,01 natural sciences ,Unit (ring theory) ,Least squares ,Mathematics - Abstract
For positive data, there is a least-squares Mitscherlich curve with vanishing initial value and a least-squares Verhulst curve with a unit initial value.
- Published
- 2016
50. The use of Regression Triplet in Hydraulic Modelling on the Example of Determination of Overfall Coefficient
- Author
-
Petr Pelikán and Jana Marková
- Subjects
Engineering ,Spillway ,business.industry ,lcsh:S ,Water jet ,Regression ,spillway ,Volumetric flow rate ,lcsh:Agriculture ,Gravitational constant ,nonlinear regression ,regression diagnostics ,lcsh:Biology (General) ,Non-linear least squares ,Statistics ,Applied mathematics ,flow rate ,General Agricultural and Biological Sciences ,business ,lcsh:QH301-705.5 ,Nonlinear regression ,Regression diagnostic ,small water reservoir - Abstract
The paper is focused on the hydraulic problem of water overfall on hydrotechnical structures, especially the outlets and spillways of water reservoirs. The main parameter of such structures is their discharge capacity, which depends on the overfall coefficient, the dimensions of the spillway, the gravitational constant, and the height of the overflowing water jet. The aim of the investigation was the mathematical derivation of a formula for calculating the overfall coefficient of a sharp-crested spillway from observed data. The problem was solved with the aid of the statistical method of nonlinear regression analysis, using the Gauss-Newton algorithm (nonlinear least squares). The objective was achieved by the design of a new equation providing results with a high level of confidence.
- Published
- 2016
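A minimal sketch of the estimation problem: take the sharp-crested weir formula Q = (2/3) * mu * b * sqrt(2g) * h^p and estimate the overfall coefficient mu (with the exponent left free as a diagnostic) by nonlinear least squares from head-discharge pairs. The data are synthetic and the regression-triplet diagnostics of the paper are not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

g, b = 9.81, 0.5                     # gravitational acceleration (m/s^2) and spillway width (m), assumed

def weir(h, mu, p):
    return (2.0 / 3.0) * mu * b * np.sqrt(2.0 * g) * h ** p

rng = np.random.default_rng(9)
h = np.linspace(0.02, 0.25, 30)                              # overflow head (m)
q = weir(h, 0.62, 1.5) * (1 + rng.normal(0, 0.02, h.size))   # discharge (m^3/s) with 2% noise

popt, pcov = curve_fit(weir, h, q, p0=[0.6, 1.5])            # Levenberg-Marquardt / Gauss-Newton refinement
mu_hat, p_hat = popt
print(f"overfall coefficient mu = {mu_hat:.3f}, fitted exponent = {p_hat:.3f} (theory: 1.5)")
```

A fitted exponent close to the theoretical 1.5 is a quick check that the weir formula is adequate for the observed data before interpreting the estimated coefficient.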