99 results for "interval estimates"
Search Results
2. Ensemble of Gray Fuzzy Cognitive Maps Application to Solve the Problem of Predicting University Activity
- Author
-
Mikryukov, Andrey A. and Mazurov, Mikhail E.
- Published
- 2022
3. Analyze Phase: A Is for Analyze
- Author
-
Pakdil, Fatma
- Published
- 2020
4. Towards Analyzing the Impact of Diversity and Cardinality on the Quality of Collective Prediction Using Interval Estimates
- Author
-
Du Nguyen, Van, Truong, Hai Bang, and Nguyen, Ngoc Thanh
- Published
- 2019
5. IMPROVING THE EFFICIENCY OF URBAN TRANSPORT INFRASTRUCTURE BASED ON DIGITAL TECHNOLOGIES
- Author
-
S. N. Gagarina, N. N. Chausov, and V. N. Levkina
- Subjects
digital technologies, efficiency, interval estimates, life quality, network models, optimization methods, transport infrastructure, uncertainty, Sociology (General), HM401-1281, Economics as a science, HB71-74
- Abstract
The paper substantiates the need to improve the efficiency of transport infrastructure, an important subsystem of urban services and a determinant of the quality of life of the city's population. It highlights the factors that determine the quality of the urban transport system and the features of urban transport, and analyses the development of transport infrastructure in Russia. It argues that, as the digital economy takes shape, artificial intelligence systems are an effective tool for decision-making. For building intelligent systems that manage urban transport flows, the use of network models is proposed; these require mathematical methods that yield not only point but also interval estimates of the model parameters, taking a priori uncertainty into account.
- Published
- 2020
6. Applying interval PCA and clustering to quantile estimates: empirical distributions of fertilizer cost estimates for yearly crops in European Countries.
- Author
-
Desbois, Dominique
- Subjects
CLIMATE change mitigation, PRINCIPAL components analysis, AGRICULTURAL productivity, CROPS, FERTILIZERS
- Abstract
The decision to adopt one or another of the sustainable land management alternatives should be based not only on their respective benefits in terms of climate change mitigation but also on the performance of the productive systems used by farm holdings, assessing their environmental impacts through the cost of fertilizer resources used. This communication uses symbolic clustering tools to analyze the conditional quantile estimates of the fertilizer costs of yearly crop production in agriculture, as a proxy for internal soil erosion costs. After recalling the conceptual framework of the estimation of agricultural production costs, we present the empirical data model, the quantile regression approach, and the interval principal component analysis clustering tools used to obtain typologies of European countries on the basis of the conditional quantile distributions of empirical fertilizer cost estimates. The comparative analysis of econometric results for yearly crops across European countries illustrates the relevance of the typologies obtained for international comparisons to assess land management alternatives based on their impact on agricultural carbon sequestration in soils. [ABSTRACT FROM AUTHOR]
- Published
- 2021
7. Statistical Inference
- Author
-
Lindley, D. V.
- Published
- 2018
8. Rubin Causal Model
- Author
-
Imbens, Guido W. and Rubin, Donald B.
- Published
- 2018
9. Toward evaluating the level of crowd wisdom using interval estimates.
- Author
-
Nguyen, Van Du, Truong, Hai Bang, Merayo, Mercedes G., Nguyen, Ngoc Thanh, Szczerbicki, Edward, and Trawiński, Bogdan
- Subjects
SWARM intelligence, ESTIMATES, PREDICTION theory
- Abstract
Recently, the use of the wisdom of crowds (WoC) for finding solutions to a wide range of real-life problems has expanded dramatically. Prior studies have revealed that diversity, independence, decentralization, and aggregation are the determinants of collective wisdom. However, these findings are often based on so-called point estimates: single values used to represent individual predictions in tasks of estimating unknown quantities or predicting the outcomes of future events. In some situations, interval values, often called interval estimates, can be used instead, so that an individual prediction takes the form of an interval with a lower and an upper bound. Taking this kind of representation into account, in this paper we present a case study in which collectives of randomly selected predictions can outperform collectives of the most accurate predictions. We then evaluate the WoC level while taking diversity and cardinality into account. The computational experiments indicate that diversity is positively related to collective wisdom. Finally, we discuss some related theoretical and practical implications for further research. [ABSTRACT FROM AUTHOR]
- Published
- 2019
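The aggregation step described in the abstract above is easy to sketch. The simulation below is illustrative only and is not the authors' experimental setup: the member model, sample sizes, and error measure are all invented. It merely shows one way to form a collective prediction from interval estimates by averaging bounds, and to compare a random collective against the individually most accurate members.

```python
import random

random.seed(1)
TRUTH = 100.0  # the unknown quantity being estimated

def member_interval():
    """One hypothetical crowd member: an interval around a noisy centre."""
    centre = random.gauss(TRUTH, 15)        # individual bias
    half_width = abs(random.gauss(10, 4))   # individual uncertainty
    return (centre - half_width, centre + half_width)

def aggregate(intervals):
    """Collective interval: average the lower and the upper bounds."""
    lows, highs = zip(*intervals)
    return (sum(lows) / len(lows), sum(highs) / len(highs))

def error(interval):
    """Accuracy measure used here: midpoint distance to the truth."""
    lo, hi = interval
    return abs((lo + hi) / 2 - TRUTH)

members = [member_interval() for _ in range(200)]

random_collective = random.sample(members, 20)    # randomly selected members
top_collective = sorted(members, key=error)[:20]  # individually most accurate

print("random collective error:", round(error(aggregate(random_collective)), 2))
print("top-20 collective error:", round(error(aggregate(top_collective)), 2))
```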
10. Application of the entropic coefficient for interval number optimization during interval assessment
- Author
-
Tynynyka A. N.
- Subjects
entropy coefficient, grouping intervals number, interval estimates, Rayleigh distribution, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Many statistical problems require choosing, as precisely as possible, the distribution law of a random variable from an observed sample, and this choice requires the construction of an interval series. The problem therefore arises of assigning an optimal number of grouping intervals, and a number of formulas have been proposed for solving it. Which of these formulas solves the problem most accurately? In [9] this question is investigated using the Pearson criterion. This article describes the corresponding procedure and, on its basis, evaluates both formulas available in the literature and newly proposed formulas using the entropy coefficient. A comparison is made with previously published results of applying Pearson's goodness-of-fit criterion for these purposes, and differences in the accuracy estimates of the formulas are found. The proposed new formulas for calculating the number of intervals showed the best results. Calculations were also made to compare the behaviour of the same formulas for sample data distributed according to the normal law and the Rayleigh law.
- Published
- 2017
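The competing formulas themselves are not reproduced in the abstract. As a point of reference, here is a small sketch of three classical rules for choosing the number of grouping intervals (Sturges, Scott, and Freedman-Diaconis; the paper's entropy-coefficient formulas are not shown) applied to normal and Rayleigh samples, the two distributions the abstract compares:

```python
import numpy as np

def sturges(x):
    # k = 1 + log2(n)
    return int(np.ceil(1 + np.log2(len(x))))

def scott(x):
    # bin width h = 3.49 * s * n^(-1/3), then k = range / h
    h = 3.49 * np.std(x, ddof=1) * len(x) ** (-1 / 3)
    return int(np.ceil((x.max() - x.min()) / h))

def freedman_diaconis(x):
    # bin width h = 2 * IQR * n^(-1/3), then k = range / h
    q75, q25 = np.percentile(x, [75, 25])
    h = 2 * (q75 - q25) * len(x) ** (-1 / 3)
    return int(np.ceil((x.max() - x.min()) / h))

rng = np.random.default_rng(0)
samples = {
    "normal":   rng.normal(0.0, 1.0, 500),
    "rayleigh": rng.rayleigh(1.0, 500),
}
for name, x in samples.items():
    print(name, sturges(x), scott(x), freedman_diaconis(x))
```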
11. Approximations of One-dimensional Expected Utility Integral of Alternatives Described with Linearly-Interpolated p-Boxes
- Author
-
Nikolova, N. D., Ivanova, S., and Tenekedjiev, K.
- Published
- 2014
12. A Novel Genetic Algorithmic Approach for Computing Real Roots of a Nonlinear Equation
- Author
-
Nadimpalli, Vijaya Lakshmi V., Wankar, Rajeev, and Chillarige, Raghavendra Rao
- Published
- 2014
13. 'Potential Interval of Root' of Nonlinear Equation: Labeling Algorithm
- Author
-
Nadimpalli, Vijaya Lakshmi V., Wankar, Rajeev, and Chillarige, Raghavendra Rao
- Published
- 2014
14. On interval estimates of perturbations of generalized eigenvalues for diagonalizable pairs.
- Author
-
Xu, Wei-wei, Ma, Li-juan, Zhu, Lei, and Liu, Hao
- Subjects
PERTURBATION theory, GENERALIZABILITY theory, EIGENVALUES, ESTIMATION theory, MATHEMATICAL bounds
- Abstract
Upper bounds on perturbations of generalized eigenvalues for diagonalizable matrix pairs have been discussed extensively, but lower bounds have not. In this paper we consider both upper and lower bounds on perturbations of generalized eigenvalues for diagonalizable matrix pairs. Sharper upper bounds and sharp lower bounds are obtained in terms of the distances of two points in the Grassmann manifold. The main results improve the existing results of Chen (2007) [1], and numerical examples illustrate the efficiency of the theoretical results. [ABSTRACT FROM AUTHOR]
- Published
- 2019
15. A DEB model for European sea bass (Dicentrarchus labrax): Parameterisation and application in aquaculture.
- Author
-
Stavrakidis-Zachou, Orestis, Papandroulakis, Nikos, and Lika, Konstadia
- Subjects
EUROPEAN seabass, BIOENERGETICS, AQUACULTURE, FISH feeds, PARAMETERIZATION
- Abstract
The framework provided by the Dynamic Energy Budget (DEB) theory allows the quantification of metabolic processes and the associated biological rates that are of interest for aquaculture, such as growth and feeding. The DEB parameters were estimated for farmed European sea bass (Dicentrarchus labrax), a species of major importance for the Mediterranean aquaculture, using zero- and uni-variate literature data and achieving an overall good fit. The obtained parameter set was used to validate the model on sites representatively covering the geographic distribution of the aquaculture activity in Greece via comparison of model predictions to observations. Inter-individual variability of farmed fish was introduced through: 1) an individual initial weight and 2) a factor that acts as an individual-specific multiplier for some of the model parameters and produces scatter in maximum size, and age and size at puberty. Growth of E. sea bass was adequately predicted by the model while feeding tended to be underestimated, particularly during the period following the summer months when warmer temperatures promote high growth rates. The results suggest robustness of the model since it is able to simulate growth and food intake in several independent aquaculture production units, using a common parameter set. The accuracy of growth predictions supports the applicability of the model in variable environmental conditions in the context of climate change. Reconstruction of the feeding history from growth data revealed variations in the scaled functional response (f), i.e., the feeding rate as a fraction of the maximum possible one for an individual of a given size, throughout the production cycle. However, model simulations with constant f result in reasonably good predictions for growth and feeding in variable environmental conditions. The tendency of the model to underestimate the feeding process revealed both model weaknesses associated with higher temperatures and irregularities in the feeding protocols applied at the farm level. Our work demonstrates the capacity and potential of DEB theory for further development of tools that contribute to the assessment and improvement of feeding practices in aquaculture.
Highlights:
• DEB parameter estimation for European sea bass and application to aquaculture
• Model validation using production data from different aquaculture sites
• Robustness of growth predictions allows simulation of climate change implications
• The model can provide insight towards improvement of food efficiency in aquaculture
[ABSTRACT FROM AUTHOR]
- Published
- 2019
16. Fitting multiple models to multiple data sets.
- Author
-
Marques, Gonçalo M., Lika, Konstadia, Augustine, Starrlight, Pecquerie, Laure, and Kooijman, Sebastiaan A.L.M.
- Subjects
BIOLOGICAL systems, BIOENERGETICS, PARAMETER estimation, CONFIDENCE intervals, MONTE Carlo method
- Abstract
Dynamic Energy Budget (DEB) theory constitutes a coherent set of universal biological processes that have been used as building blocks for modeling biological systems over the last 40 years in many applied disciplines. In the context of extracting parameters for DEB models from data, we discuss the methodology of fitting multiple models, which share parameters, to multiple data sets in a single parameter estimation. This problem is not specific to DEB models, and is (or should be) really general in biology. We discovered that a lot of estimation problems that we suffered from in the past originated from the use of a loss function that was not symmetric in the role of data and predictions. We here propose two much better symmetric candidates, which proved to work well in practice. We illustrate estimation problems and their solutions with a Monte-Carlo case study for an increasing amount of scatter, which decreased the amount of information in the data about one or more parameter values. We here validate the method using a set of models with known parameters and different scatter structures. We compare the loss functions on the basis of convergence, point and interval estimates. We also discuss the use of pseudo-data, i.e. realistic values for parameters that we treat as data from which predictions can differ. These pseudo-data are used to avoid that a good fit results in parameter values that make no biological sense. We discuss our new method for estimating confidence intervals and present a list of concrete recommendations for parameter estimation. We conclude that the proposed method performs very well in recovering parameter values of a set of models, applied to a set of data. This is consistent with our large-scale applications in practice.
Highlights:
• A new method of fitting multiple models to multiple datasets is proposed.
• General methodology to arrive at interval estimates for parameter values is proposed.
• Two loss functions are compared on the basis of convergence and parameter estimates.
• The amount of scatter, but not its cause, strongly affects parameter estimates.
• Pseudo-data are used to increase identifiability of parameters.
[ABSTRACT FROM AUTHOR]
- Published
- 2019
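The abstract's central point, a loss function that is symmetric in the roles of data and predictions, can be illustrated concretely. The paper's exact loss functions are not quoted in the abstract, so the symmetric candidate below is only a plausible stand-in chosen to make the symmetry property visible:

```python
import numpy as np

def asymmetric_loss(data, pred):
    """Relative squared error: NOT symmetric, since the data alone set the scale."""
    data, pred = np.asarray(data, float), np.asarray(pred, float)
    return float(np.sum(((data - pred) / data) ** 2))

def symmetric_loss(data, pred):
    """An illustrative symmetric candidate: data and predictions jointly set
    the scale, so symmetric_loss(d, p) == symmetric_loss(p, d)."""
    data, pred = np.asarray(data, float), np.asarray(pred, float)
    return float(np.sum((data - pred) ** 2 / (data ** 2 + pred ** 2)))

d = [1.0, 2.0, 4.0]
p = [1.2, 1.8, 5.0]
print(asymmetric_loss(d, p), asymmetric_loss(p, d))  # the two values differ
print(symmetric_loss(d, p), symmetric_loss(p, d))    # the two values are equal
```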
17. Metallurgy in the Czech Republic: a spatio-temporal view
- Author
-
J. Suchacek, A. Samolejova, and P. Seda
- Subjects
metallurgy sector, interval estimates, input-output, probability distribution, Czech Republic, Mining engineering. Metallurgy, TN1-997
- Abstract
The objective of this paper is to introduce a stochastic input-output model of the impact of the metallurgy sector on the Czech economy. In contrast to the original input-output model, which is deterministic in nature, we work with interval estimates of the development of the metallurgy sector. These help us overcome deterministic limitations when analyzing and forecasting the possible development tendencies of the metallurgy sector in various economies.
- Published
- 2017
18. Reliability assessment of condensing thermal power plants
- Author
-
Milovanović Zdravko N. and Dumonjić-Milovanović Svetlana R.
- Subjects
condensing thermal power plant, reliability, reliability parameters, interval estimates, Engineering (General). Civil engineering (General), TA1-2040
- Abstract
A complex system of condensing thermal power plants is considered. Methods for estimating reliability and for assessing the reliability of data are provided for the given system, together with a classification of failures and damage. Interval values are defined for some reliability indices; these form the basis for the reliability estimates.
- Published
- 2015
19. Statistical properties of four effect-size measures for mediation models.
- Author
-
Miočević, Milica, O'Rourke, Holly P., MacKinnon, David P., and Brown, Hendricks C.
- Subjects
MEDIATION (Statistics), BAYESIAN analysis, ESTIMATION theory, EFFECT sizes (Statistics), STATISTICAL bias
- Abstract
This project examined the performance of classical and Bayesian estimators of four effect size measures for the indirect effect in a single-mediator model and a two-mediator model. Compared to the proportion and ratio mediation effect sizes, standardized mediation effect-size measures were relatively unbiased and efficient in the single-mediator model and the two-mediator model. Percentile and bias-corrected bootstrap interval estimates of ab/s_Y and ab(s_X)/s_Y in the single-mediator model outperformed interval estimates of the proportion and ratio effect sizes in terms of power, Type I error rate, coverage, imbalance, and interval width. For the two-mediator model, standardized effect-size measures were superior to the proportion and ratio effect-size measures. Furthermore, it was found that Bayesian point and interval summaries of posterior distributions of standardized effect-size measures reduced excessive relative bias for certain parameter combinations. The standardized effect-size measures are the best effect-size measures for quantifying mediated effects. [ABSTRACT FROM AUTHOR]
- Published
- 2018
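A percentile bootstrap interval for the standardized mediated effect ab/s_Y in a single-mediator model, one of the estimators compared in the abstract above, can be sketched as follows. The data-generating coefficients and the sample size are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a single-mediator model X -> M -> Y with made-up coefficients.
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # a = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # b = 0.4, direct effect = 0.2

def ab_over_sy(x, m, y):
    """Standardized mediated effect ab / s_Y from two OLS regressions."""
    a = np.polyfit(x, m, 1)[0]                    # slope of M on X
    X = np.column_stack([np.ones_like(x), m, x])  # Y on M and X (with intercept)
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]   # coefficient of M
    return a * b / np.std(y, ddof=1)

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)  # resample cases with replacement
    boot.append(ab_over_sy(x[idx], m[idx], y[idx]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"ab/s_Y = {ab_over_sy(x, m, y):.3f}, "
      f"95% percentile interval [{lo:.3f}, {hi:.3f}]")
```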
20. COGNITIVE SKILLS AND CONFIDENCE: INTERRELATIONS WITH OVERESTIMATION, OVERPLACEMENT AND OVERPRECISION.
- Author
-
Duttle, Kai
- Subjects
RAVEN'S Progressive Matrices, COGNITIVE testing, FALSE precision (Statistics), CRITICAL thinking, CALIBRATION, ECONOMIC research
- Abstract
This experimental study measures the three major types of judgmental overconfidence in a within-subjects design. Performance-based overestimation and overplacement are elicited in a Raven Progressive Matrices test for general intelligence. Calibration-based overprecision is evaluated in a forecasting by confidence intervals task. This study takes a closer look at the interrelations of these different types. Moreover, as the main focus, it considers the effect of cognitive abilities on overconfidence. These are quantified in a cognitive reflection test. I find that cognitive skills are substantially interrelated with subjects' confidence levels. Although overconfidence in absolute terms (overestimation) is not affected by cognitive abilities, the effect on overconfidence in relative terms (overplacement) is positive and significant. Overprecision, the calibration-based type of overconfidence, is found to be significantly affected by cognitive capacity as well. Interval forecasts of cognitively more able subjects were much better calibrated than those of their peers, who displayed substantial overconfidence in the precision of their forecasts. [ABSTRACT FROM AUTHOR]
- Published
- 2016
21. Confidence Regions in Econometrics of Complex Variables
- Author
-
Chanysheva Amina F.
- Subjects
econometrics, complex-valued economy, limits, joint confidence regions, interval estimates, linear regression equation, Hotelling statistics, Business, HF5001-6182
- Abstract
The article is the result of a complex study devoted to laying the foundations of the complex-valued economy, a modern and promising direction in the field of socio-economic forecasting. It addresses the problem of obtaining interval estimates for complex-valued linear regression equations. The author substantiates the selection of the form of the confidence regions and considers three approaches to constructing them. On the basis of the study, the article concludes that practically all actual values of the observed variable lie inside the confidence region calculated for a confidence probability of 95%. This testifies to a good fit of the model to the original data and to the effectiveness of the method of finding confidence regions offered in this article.
- Published
- 2013
22. Inferring uncertainty from interval estimates: Effects of alpha level and numeracy
- Author
-
Luke F. Rinne and Michèle M. M. Mazzocco
- Subjects
interval estimates, probability judgment, numeracy, numerical cognition, decision-making, Social Sciences, Psychology, BF1-990
- Abstract
Interval estimates are commonly used to descriptively communicate the degree of uncertainty in numerical values. Conventionally, low alpha levels (e.g., .05) ensure a high probability of capturing the target value between interval endpoints. Here, we test whether alpha levels and individual differences in numeracy influence distributional inferences. In the reported experiment, participants received prediction intervals for fictitious towns’ annual rainfall totals (assuming approximately normal distributions). Then, participants estimated probabilities that future totals would be captured within varying margins about the mean, indicating the approximate shapes of their inferred probability distributions. Results showed that low alpha levels (vs. moderate levels; e.g., .25) more frequently led to inferences of over-dispersed approximately normal distributions or approximately uniform distributions, reducing estimate accuracy. Highly numerate participants made more accurate estimates overall, but were more prone to inferring approximately uniform distributions. These findings have important implications for presenting interval estimates to various audiences.
- Published
- 2013
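Under the normality assumption stated in the abstract, the inference task given to participants has a simple normative answer: back out the implied standard deviation from the (1 - alpha) prediction interval, then compute the probability that a future value falls within any margin about the mean. A worked sketch with invented rainfall numbers:

```python
from statistics import NormalDist

def implied_sigma(lower, upper, alpha):
    """Back out sigma from a symmetric (1 - alpha) normal prediction interval."""
    z = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. about 1.96 for alpha = .05
    return (upper - lower) / (2 * z)

def capture_probability(margin, sigma):
    """P(|X - mu| <= margin) for X ~ N(mu, sigma^2)."""
    return 2 * NormalDist().cdf(margin / sigma) - 1

# Invented example: a 95% interval of [600, 1000] mm about an 800 mm mean.
sigma = implied_sigma(600, 1000, alpha=0.05)
for margin in (50, 100, 200):
    print(f"P(within +/-{margin} mm) = {capture_probability(margin, sigma):.2f}")
```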
23. A simple remedy for overprecision in judgment
- Author
-
Uriel Haran, Don A. Moore, and Carey K. Morewedge
- Subjects
overconfidence, overprecision, subjective probability, interval estimates, judgment and decision making, Social Sciences, Psychology, BF1-990
- Abstract
Overprecision is the most robust type of overconfidence. We present a new method that significantly reduces this bias and offers insight into its underlying cause. In three experiments, overprecision was significantly reduced by forcing participants to consider all possible outcomes of an event. Each participant was presented with the entire range of possible outcomes divided into intervals, and estimated each interval’s likelihood of including the true answer. The superiority of this Subjective Probability Interval Estimate (SPIES) method is robust to range widths and interval grain sizes. Its carryover effects are observed even in subsequent estimates made using the conventional, 90% confidence interval method: judges who first made SPIES judgments considered a broader range of values in subsequent conventional interval estimates as well.
- Published
- 2010
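A minimal sketch of the SPIES elicitation described above, with invented bins and weights: the full range of possible outcomes is divided into intervals, the judge rates each interval's likelihood of containing the true answer, the ratings are normalized, and a conventional central interval can then be read off the implied distribution.

```python
# Invented bins covering ALL possible outcomes, and one judge's elicited ratings.
bins = [(0, 20), (20, 40), (40, 60), (60, 80), (80, 100)]
raw_weights = [5, 20, 40, 25, 10]  # likelihood that each bin holds the answer

total = sum(raw_weights)
probs = [w / total for w in raw_weights]  # normalize to a probability distribution

def central_interval(bins, probs, mass=0.90):
    """Read a central interval off the elicited distribution, assuming the
    probability mass is spread uniformly within each bin."""
    def quantile(q):
        cum = 0.0
        for (lo, hi), p in zip(bins, probs):
            if p > 0 and cum + p >= q:
                return lo + (hi - lo) * (q - cum) / p  # interpolate inside the bin
            cum += p
        return bins[-1][1]
    tail = (1 - mass) / 2
    return quantile(tail), quantile(1 - tail)

lo, hi = central_interval(bins, probs)
print(f"implied 90% interval: [{lo:.1f}, {hi:.1f}]")
```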
24. Application of Interval PCA and Classification to Quantile Estimates: Empirical Distributions of Fertilizer Cost Estimates for Yearly Crops in European Countries
- Author
-
Desbois, Dominique
- Subjects
JEL: D - Microeconomics, JEL: M - Business Administration and Business Economics • Marketing • Accounting • Personnel Economics, quantile regression, input-output model, [QFIN] Quantitative Finance, Principal component analysis, fertilizer, agricultural production cost, yearly crops, JEL: Q5 - Environmental Economics, interval estimates, symbolic data analysis, JEL: M2 - Business Economics, micro-economics, [SHS.ENVIR] Humanities and Social Sciences/Environmental studies, hierarchic clustering
- Abstract
The decision to adopt one or another of the sustainable land management alternatives should be based not only on their respective benefits in terms of climate change mitigation but also on the performance of the productive systems used by farm holdings, assessing their environmental impacts through the cost of fertilizer resources used. This communication uses symbolic clustering tools to analyze the conditional quantile estimates of the fertilizer costs of yearly crop production in agriculture, as a proxy for internal soil erosion costs. After recalling the conceptual framework of the estimation of agricultural production costs, we present the empirical data model, the quantile regression approach, and the interval principal component analysis clustering tools used to obtain typologies of European countries on the basis of the conditional quantile distributions of empirical fertilizer cost estimates. The comparative analysis of econometric results for yearly crops across European countries illustrates the relevance of the typologies obtained for international comparisons to assess land management alternatives based on their impact on agricultural carbon sequestration in soils.
- Published
- 2021
25. Dating the financial cycle with uncertainty estimates: a wavelet proposition.
- Author
-
Ardila, Diego and Sornette, Didier
- Abstract
We propose to date and analyze the financial cycle using the Maximum Overlap Discrete Wavelet Transform (MODWT). Our presentation points out limitations of the methods derived from the classical business cycle literature, while stressing their connection with wavelet analysis. The fundamental time-frequency uncertainty principle requires replacing point estimates of turning points with interval estimates, which are themselves functions of the scale of the analysis. We use financial time series from 19 OECD countries to illustrate the applicability of the tool. [ABSTRACT FROM AUTHOR]
- Published
- 2016
26. Methodology of analysis of unseparated mixtures: Interval estimates of the total concentration of similar analytes.
- Author
-
Vershinin, V., Isachenko, N., and Brilenok, N.
- Subjects
ALGORITHMS, ESTIMATION theory, ESTIMATES, ALGEBRA, LEAST squares
- Abstract
Determination of the total concentration (c) of similar analytes recalculated to a standard substance X is a widely used but metrologically incorrect procedure leading to high uncertainty in the results of analysis. An algorithm is proposed for the interval estimation of c without recalculating to the concentration of X. The algorithm takes into account the different sensitivities of the determination of analytes of the studied type; no information on the nature and ratio of analytes in the sample is used. The width of the range of possible values of c objectively characterizes the systematic component of the error; it is independent of the choice of X, exceeds the width of traditional confidence intervals, and becomes zero when the sensitivities of the determination of all analytes in a group are equal. The accuracy of the interval estimates is supported by the analysis of model mixtures of different types using several methods of measuring the total signal. [ABSTRACT FROM AUTHOR]
- Published
- 2016
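The interval logic of the abstract can be made concrete. If the measured total signal is additive and the sensitivities of the individual analytes lie between k_min and k_max, the total concentration c must lie between signal/k_max and signal/k_min, an interval that collapses to a point when all sensitivities are equal. A sketch with invented numbers:

```python
def total_concentration_interval(total_signal, sensitivities):
    """Interval estimate of the total analyte concentration, assuming the
    total signal is additive: A = sum(k_i * c_i) with k_min <= k_i <= k_max."""
    k_min, k_max = min(sensitivities), max(sensitivities)
    return total_signal / k_max, total_signal / k_min

# Invented data: a total signal in absorbance units and the sensitivity
# coefficients of the individual analytes in the group.
A = 0.84
ks = [0.95, 1.10, 1.30]
lo, hi = total_concentration_interval(A, ks)
print(f"c lies in [{lo:.3f}, {hi:.3f}]")             # width reflects unequal k_i
print(total_concentration_interval(A, [1.1, 1.1]))   # equal sensitivities: width 0
```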
27. Fitting multiple models to multiple data sets
- Author
-
Sebastiaan A.L.M. Kooijman, Starrlight Augustine, Gonçalo M. Marques, Konstadia Lika, and Laure Pecquerie
- Subjects
Monte Carlo simulation studies, point estimates, fitting models, parameter estimation, point estimation, interval estimates, loss function, covariation method, parameters, estimation theory, Aquatic Science, Oceanography, Ecology, Evolution, Behavior and Systematics, [SDE.BE] Environmental Sciences/Biodiversity and Ecology
- Abstract
Dynamic Energy Budget (DEB) theory constitutes a coherent set of universal biological processes that have been used as building blocks for modeling biological systems over the last 40 years in many applied disciplines. In the context of extracting parameters for DEB models from data, we discuss the methodology of fitting multiple models, which share parameters, to multiple data sets in a single parameter estimation. This problem is not specific to DEB models, and is (or should be) really general in biology. We discovered that a lot of estimation problems that we suffered from in the past originated from the use of a loss function that was not symmetric in the role of data and predictions. We here propose two much better symmetric candidates, which proved to work well in practice. We illustrate estimation problems and their solutions with a Monte-Carlo case study for an increasing amount of scatter, which decreased the amount of information in the data about one or more parameter values. We here validate the method using a set of models with known parameters and different scatter structures. We compare the loss functions on the basis of convergence, point and interval estimates. We also discuss the use of pseudo-data, i.e. realistic values for parameters that we treat as data from which predictions can differ. These pseudo-data are used to avoid that a good fit results in parameter values that make no biological sense. We discuss our new method for estimating confidence intervals and present a list of concrete recommendations for parameter estimation. We conclude that the proposed method performs very well in recovering parameter values of a set of models, applied to a set of data. This is consistent with our large-scale applications in practice.
- Published
- 2019
28. METALLURGY IN THE CZECH REPUBLIC: A SPATIO-TEMPORAL VIEW.
- Author
-
SUCHACEK, J., SAMOLEJOVA, A., and SEDA, P.
- Subjects
METALLURGY, METAL industry, SPATIOTEMPORAL processes, INPUT-output analysis, DETERMINISTIC processes
- Abstract
The objective of this paper is to introduce a stochastic input-output model of the impact of the metallurgy sector on the Czech economy. In contrast to the original input-output model, which is deterministic in nature, we work with interval estimates of the development of the metallurgy sector. These help us overcome deterministic limitations when analyzing and forecasting the possible development tendencies of the metallurgy sector in various economies. [ABSTRACT FROM AUTHOR]
- Published
- 2017
29. Consistency of Bayesian Estimates for the Sum of Squared Normal Means with a Normal Prior.
- Author
-
Evans, Michael and Shakhatreh, Mohammed
- Abstract
We consider the problem of estimating the sum of squared means when the data (x_1, ..., x_n) are independent values with x_i ∼ N(θ_i, 1) and θ_1, θ_2, ... are a priori i.i.d. N(0, σ²) with σ² known. This example has posed difficulties for many approaches to inference. We examine the consistency properties of several estimators derived from Bayesian considerations. We prove that a particular Bayesian estimate (LRSE) is consistent in a wider set of circumstances than other Bayesian estimates like the posterior mean and mode. We show that the LRSE is either equal to the positive part of the UMVUE or differs from it with a relative error no greater than 2/n. We also prove a consistency result for interval estimation and discuss checking for prior-data conflict. While it can be argued that the choice of the N(0, σ²) prior is inappropriate when σ² is chosen large to reflect noninformativity, this argument is not applicable when σ² is chosen to reflect knowledge about the unknowns. As such it is important to show that there are consistent Bayesian estimation procedures using this prior. [ABSTRACT FROM AUTHOR]
- Published
- 2014
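Two of the estimators compared in the abstract are easy to write down for this model: the positive part of the UMVUE, (sum of x_i^2 minus n)+, and the posterior mean of the sum of theta_i^2, which follows from the conjugate posterior theta_i | x_i ~ N(sigma^2 x_i / (1 + sigma^2), sigma^2 / (1 + sigma^2)). The LRSE itself requires the paper's relative-surprise machinery and is not reproduced here. A simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(7)
n, sigma2 = 1000, 2.0

theta = rng.normal(0.0, np.sqrt(sigma2), n)  # true means, i.i.d. N(0, sigma^2)
x = rng.normal(theta, 1.0)                   # one observation per mean
target = np.sum(theta ** 2)

# Positive part of the UMVUE of sum(theta_i^2): E[x_i^2] = theta_i^2 + 1.
umvue_plus = max(np.sum(x ** 2) - n, 0.0)

# Posterior mean: theta_i | x_i ~ N(s2*x_i/(1+s2), s2/(1+s2)), so
# E[sum theta_i^2 | x] = sum(post_mean_i^2) + n * post_var.
post_mean = sigma2 * x / (1 + sigma2)
post_var = sigma2 / (1 + sigma2)
bayes_mean = np.sum(post_mean ** 2) + n * post_var

print(f"true sum theta^2   : {target:.1f}")
print(f"UMVUE positive part: {umvue_plus:.1f}")
print(f"posterior mean     : {bayes_mean:.1f}")
```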
30. Application of Interval Clustering to Estimated Quantiles: Empirical Distributions of Fertilizer Cost Estimates for European Countries
- Author
-
Desbois, Dominique
- Subjects
quantile regression, principal component analysis, input-output model, agricultural production cost, interval estimates, agricultural product, specific cost, fertilizer, European countries, yearly crops, symbolic data analysis, micro-economics, empirical distributions, hierarchic clustering, symbolic clustering, JEL: D24 - Production • Cost • Capital • Capital, Total Factor, and Multifactor Productivity • Capacity, JEL: C46 - Specific Distributions • Specific Statistics, [STAT.AP] Statistics/Applications, [QFIN.ST] Quantitative Finance/Statistical Finance, [SHS.ECO] Humanities and Social Sciences/Economics and Finance
- Abstract
The decision to adopt one or another of the sustainable land management alternatives should be based not only on their respective benefits in terms of climate change mitigation but also on the performance of the productive systems used by farm holdings, assessing their environmental impacts through the cost of specific resources used. This communication uses symbolic clustering tools to analyse the conditional quantile estimates of the fertilizer costs of specific productions in agriculture, as a proxy for internal soil erosion costs. After recalling the conceptual framework of the estimation of agricultural production costs, we present the empirical data model, the quantile regression approach, and the symbolic clustering tools used to obtain typologies of European countries on the basis of the conditional quantile distributions of empirical fertilizer cost estimates. The comparative analysis of econometric results for main products between European countries illustrates the relevance of the typologies obtained for international comparisons based on their input-specific productivity.
- Published
- 2020
31. Applying Interval Clustering to Quantile Estimates: Empirical Distributions of Fertilizer Cost Estimates for European Countries
- Author
-
Desbois, Dominique
- Subjects
quantile regression, principal component analysis, input-output model, agricultural production cost, interval estimates, agricultural product, specific cost, fertilizer, European countries, yearly crops, symbolic data analysis, micro-economics, empirical distributions, hierarchic clustering, symbolic clustering, JEL: D24 - Production • Cost • Capital • Capital, Total Factor, and Multifactor Productivity • Capacity, JEL: C46 - Specific Distributions • Specific Statistics, [STAT.AP] Statistics/Applications, [QFIN.ST] Quantitative Finance/Statistical Finance, [SHS.ECO] Humanities and Social Sciences/Economics and Finance
- Abstract
The decision to adopt one or another of the sustainable land management alternatives should be based not only on their respective benefits in terms of climate change mitigation but also on the performance of the productive systems used by farm holdings, assessing their environmental impacts through the cost of specific resources used. This communication uses symbolic clustering tools to analyse the conditional quantile estimates of the fertilizer costs of specific productions in agriculture, as a proxy for internal soil erosion costs. After recalling the conceptual framework of the estimation of agricultural production costs, we present the empirical data model, the quantile regression approach, and the symbolic clustering tools used to obtain typologies of European countries on the basis of the conditional quantile distributions of empirical fertilizer cost estimates. The comparative analysis of econometric results for main products between European countries illustrates the relevance of the typologies obtained for international comparisons based on their input-specific productivity.
- Published
- 2020
32. True Overconfidence in Interval Estimates: Evidence Based on a New Measure of Miscalibration.
- Author
-
Glaser, Markus, Langer, Thomas, and Weber, Martin
- Subjects
BEHAVIORAL economics, ENDOWMENT effect (Economics), INVESTORS, INVESTMENT bankers, INVESTMENT advisors, ECONOMIC forecasting
- Abstract
Overconfidence is often regarded as one of the most prevalent judgment biases. Several studies show that overconfidence can lead to suboptimal decisions of investors, managers, or politicians. Recent research, however, questions whether overconfidence should be regarded as a bias and shows that standard 'overconfidence' findings can easily be explained by different degrees of knowledge of agents plus a random error in predictions. We contribute to the current literature and ongoing research by extensively analyzing interval estimates for knowledge questions, for real financial time series, and for artificially generated charts. We thereby suggest a new method to measure overconfidence in interval estimates, which is based on the implied probability mass behind a stated prediction interval. We document overconfidence patterns, which are difficult to reconcile with rationality of agents and which cannot be explained by differences in knowledge as differences in knowledge do not exist in our task. Furthermore, we show that overconfidence measures are reliable in the sense that there exist stable individual differences in the degree of overconfidence in interval estimates, thereby testing an important assumption of behavioral economics and behavioral finance models: stable individual differences in the degree of overconfidence across people. We do this in a 'field experiment,' for different levels of expertise of subjects (students on the one hand and professional traders and investment bankers on the other hand), over time, by using different miscalibration metrics, and for tasks that avoid common weaknesses such as a non-representative selection of trick questions. Copyright © 2012 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2013
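The abstract does not spell out the construction of the implied-probability-mass measure, so the sketch below shows only the two simplest ingredients: a hit-rate calibration check for stated 90% intervals, and the tail mass implied for each realized outcome under the assumption (made here, not by the paper) that the judge's subjective distribution is normal with the stated bounds as its 5th and 95th percentiles. All data are invented.

```python
from statistics import NormalDist

# Invented data: stated 90% intervals and the realized true values.
stated = [(80, 120), (10, 30), (400, 900), (55, 65), (5, 9)]
truths = [135, 22, 650, 80, 7]

# 1) Calibration check: how often do the stated 90% intervals capture the truth?
hits = sum(lo <= t <= hi for (lo, hi), t in zip(stated, truths))
print(f"hit rate: {hits}/{len(truths)} (well-calibrated judges: about 90%)")

# 2) Implied tail mass under an assumed normal subjective distribution whose
#    5th/95th percentiles coincide with the stated bounds.
z90 = NormalDist().inv_cdf(0.95)
for (lo, hi), t in zip(stated, truths):
    mu, sigma = (lo + hi) / 2, (hi - lo) / (2 * z90)
    subj = NormalDist(mu, sigma)
    # Small tail mass at the realized truth means the judge was "surprised".
    print(f"truth {t}: subjective tail mass = {min(subj.cdf(t), 1 - subj.cdf(t)):.3f}")
```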
33. Modeling Inner Currency in Reflexive Games with Multi-Criteria Payoff Functions
- Author
-
Sergey A. Smirnov and Ivan M. Tereshchenko
- Subjects
Computational Theory and Mathematics, Artificial Intelligence, Applied Mathematics, Theoretical Computer Science, reflexive games, multi-criteria payoff functions, inner currency method, linear convolution method, interval estimates
- Abstract
The problem of decision-making under conditions of conflict, multi-criteria uncertainty, and reflexive interaction of the parties is considered. Modeling reflexive behavior makes it possible to analyze situations in which the decisions taken differ from non-reflexive rationality, and to investigate and reveal the internal causes of such behavior. The solution of this problem, taking into account the multi-valued interests of the parties, is based on a multi-criteria generalization of the setting proposed by V. Lefebvre, which rests on the concept of inner currency. To calculate the initial assessment of the opponent's inner currency on the basis of nominally known criteria, the interval estimates method is used. Its application enables experts to specify a range of possible values of the weight coefficients without requiring their specific values, which simplifies the expert procedure. The weighting factors are then refined by solving an auxiliary problem of finding corrections that are introduced into the model for determining the inner currency.
- Published
- 2018
34. Inferring uncertainty from interval estimates: Effects of alpha level and numeracy.
- Author
-
Rinne, Luke F. and Mazzocco, Michèle M. M.
- Subjects
NUMERACY, DECISION making, JUDGMENT (Psychology), COGNITION research, DISTRIBUTION (Probability theory)
- Abstract
Interval estimates are commonly used to descriptively communicate the degree of uncertainty in numerical values. Conventionally, low alpha levels (e.g., .05) ensure a high probability of capturing the target value between interval endpoints. Here, we test whether alpha levels and individual differences in numeracy influence distributional inferences. In the reported experiment, participants received prediction intervals for fictitious towns' annual rainfall totals (assuming approximately normal distributions). Then, participants estimated probabilities that future totals would be captured within varying margins about the mean, indicating the approximate shapes of their inferred probability distributions. Results showed that low alpha levels (vs. moderate levels; e.g., .25) more frequently led to inferences of over-dispersed approximately normal distributions or approximately uniform distributions, reducing estimate accuracy. Highly numerate participants made more accurate estimates overall, but were more prone to inferring approximately uniform distributions. These findings have important implications for presenting interval estimates to various audiences. [ABSTRACT FROM AUTHOR]
- Published
- 2013
35. Determination of the Total Content of Phenolic Antioxidants in Tea Using Different Variations of the FRAP Method
- Author
-
Tsypko, T. G., Brilenok, N. S., Guschaeva, K. S., and Vershinin, V. I.
- Abstract
The total content (cΣ) of phenolic antioxidants (PhA) in black tea samples was determined at the 10^-3 mol/g level using different variations of the FRAP assay. The spectrometric analysis was performed in two independent laboratories using different subsidiary reagents (o-phenanthroline or 2,2'-dipyridyl) and different exposure times (60 or 10 min). The results of the analysis (c*) of the tea samples were recalculated to different standard substances (X_st), namely gallic and ascorbic acids; the obtained data defined the total antioxidant activity of the tea samples. The use of the weaker oxidizing system (Fe3+ + 2,2'-dipyridyl) led in each case to slightly lower values of c* (mM-eq/g) than the use of the other system (Fe3+ + o-phenanthroline). For two of the samples the discrepancies were statistically significant, but in all cases they did not exceed 20% rel. Calculations with the additives method and with the calibration curve (all other things being equal) led to coincident values of c*, indicating that the tea infusions did not contain substances that reduce the sensitivity of PhA determination, in particular iron-binding complexants. The intervals of possible cΣ values were calculated for each sample taking into account the intragroup selectivity of the signals; the width of this interval defines the uncertainty of the total PhA content data. The calculated interval estimates included the c* values obtained with the other X_st and were an order of magnitude wider than the confidence intervals calculated with the Student algorithm, which considers only measurement precision. The approximate coincidence of the interval estimates obtained with different signal-measurement procedures and different standard substances makes the FRAP assay appropriate not only for calculating the total index "antioxidant activity" but also for an objective estimate of the total PhA content of tea. The FRAP assay can thus become the foundation of a new way to control tea quality, one less sensitive to the influence of extraneous substances than the Folin-Ciocalteu method traditionally used in control and analytical laboratories. Expressing the results of tea analysis by the FRAP method without recalculation to X_st will ensure their metrological correctness.
- Published
- 2019
36. Point and Interval Estimates of Percentile Ranks for Scores on the Texas Functional Living Scale.
- Author
-
Crawford, John R., Cullum, C. Munro, Garthwaite, Paul H., Lycett, Emma, and Allsopp, Kate J.
- Subjects
PERCENTILES, BAYESIAN analysis, ACTIVITIES of daily living scales, PSYCHOMETRICS
- Abstract
Point and interval estimates of percentile ranks are useful tools in assisting with the interpretation of neurocognitive test results. We provide percentile ranks for raw subscale scores on the Texas Functional Living Scale (TFLS; Cullum, Weiner, & Saine, 2009) using the TFLS standardization sample data (N = 800). Percentile ranks with interval estimates are also provided for the overall TFLS T score. Conversion tables are provided along with the option of obtaining the point and interval estimates using a computer program written to accompany this paper (TFLS_PRs.exe). The percentile ranks for the subscales offer an alternative to using the cumulative percentage tables in the test manual and provide a useful and quick way for neuropsychologists to assimilate information on the case's profile of scores on the TFLS subscales. The provision of interval estimates for the percentile ranks is in keeping with the contemporary emphasis on the use of confidence intervals in psychological statistics [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
37. THE STUDY OF THE BOOTSTRAP ESTIMATE ACCURACY IN THE CASE OF EXPONENTIAL DISTRIBUTION.
- Author
-
MORARIU, Cristin-Olimpiu, ZAHARIA, Sebastian Marian, and UDROIU, Răzvan
- Subjects
RELIABILITY in engineering ,CALCULUS ,MAXIMUM likelihood statistics ,STATISTICAL bootstrapping ,STATISTICAL models - Abstract
In reliability studies there are many situations in which the point estimator is very difficult to obtain and exact interval estimates are practically impossible to calculate. In such cases, where the form of the estimator is complicated, the calculation of interval estimates relies on the asymptotic properties of the maximum likelihood method. In recent years a computer-intensive technique, the bootstrap, was developed for this kind of problem. The paper presents how to use these estimators in the case of the exponential distribution. This statistical model was chosen because its point estimator and exact interval estimates can be determined simply and accurately, providing a benchmark. The study is also extended to incomplete samples, which are commonly encountered in reliability studies. [ABSTRACT FROM AUTHOR]
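The following hedged Python sketch illustrates the comparison the abstract describes: a percentile bootstrap interval for the exponential mean set against the exact chi-square interval. The sample, confidence level, and bootstrap size are illustrative choices, not the paper's.

```python
# Percentile bootstrap vs. exact interval for the mean of an exponential sample.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
data = rng.exponential(scale=100.0, size=30)   # e.g. times to failure (invented)

# MLE of the exponential mean (scale) is the sample mean
theta_hat = data.mean()

# Nonparametric bootstrap: resample with replacement, re-estimate each time
B = 5000
boot = np.array([rng.choice(data, size=data.size, replace=True).mean()
                 for _ in range(B)])
lo, hi = np.percentile(boot, [2.5, 97.5])      # percentile bootstrap 95% CI
print(f"MLE = {theta_hat:.1f}, bootstrap 95% CI = ({lo:.1f}, {hi:.1f})")

# Benchmark: 2*n*mean/theta ~ chi-square(2n) gives the exact interval
n = data.size
exact = (2 * n * theta_hat / chi2.ppf(0.975, 2 * n),
         2 * n * theta_hat / chi2.ppf(0.025, 2 * n))
print(f"exact chi-square 95% CI = ({exact[0]:.1f}, {exact[1]:.1f})")
```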
- Published
- 2012
38. Optimization methods in multi-criteria decision making analysis with interval information on the importance of criteria and values of scale gradations.
- Author
-
Nelyubin, A. and Podinovski, V.
- Abstract
Accurate and efficient numerical methods are suggested for the optimization problems that arise when comparing preferences between alternatives using the theory of criteria importance, both in the case of interval estimates of the degree of superiority of some criteria over others and in the case of interval restrictions on the growth of preferences along the criteria range. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
39. Percentile Norms and Accompanying Interval Estimates from an Australian General Adult Population Sample for Self-Report Mood Scales (BAI, BDI, CRSD, CES-D, DASS, DASS-21, STAI-X, STAI-Y, SRDS, and SRAS).
- Author
-
Crawford, John, Cayley, Carol, Lovibond, Peter F, Wilson, Peter H, and Hartley, Caroline
- Subjects
- *
AFFECT (Psychology) , *ANALYSIS of variance , *CHI-squared test , *COMPUTER software , *CONFIDENCE intervals , *STATISTICAL correlation , *EXPERIMENTAL design , *MATHEMATICAL models , *PROBABILITY theory , *REFERENCE values , *RELIABILITY (Personality trait) , *RESEARCH funding , *SELF-evaluation , *SEX distribution , *STATISTICS , *EDUCATIONAL attainment - Abstract
Despite their widespread use, many self-report mood scales have very limited normative data. To rectify this, Crawford et al. have recently provided percentile norms for a series of self-report scales. The present study aimed to extend the work of Crawford et al. by providing percentile norms for additional mood scales based on samples drawn from the general Australian adult population. Participants completed a series of self-report mood scales. The resultant normative data were incorporated into a computer programme that provides point and interval estimates of the percentile ranks corresponding to raw scores for each of the scales. The programme can be used to obtain point and interval estimates of the percentile ranks of an individual's raw scores on the Beck Anxiety Inventory, the Beck Depression Inventory, the Carroll Rating Scale for Depression, the Centre for Epidemiological Studies Rating Scale for Depression, the Depression, Anxiety, and Stress Scales (DASS), the short-form version of the DASS (DASS-21), the Self-rating Scale for Anxiety, the Self-rating Scale for Depression, the State-Trait Anxiety Inventory (STAI), form X, and the STAI, form Y, based on normative sample sizes ranging from 497 to 769. The interval estimates can be obtained using either classical or Bayesian methods as preferred. The programme (which can be downloaded at ) provides a convenient and reliable means of obtaining the percentile ranks of individuals' raw scores on self-report mood scales. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
40. Fuzzy rationality and parameter elicitation in decision analysis.
- Author
-
Nikolova, Natalia D. and Tenekedjiev, Kiril I.
- Subjects
- *
DECISION making , *FUZZY sets , *SET theory , *METHODOLOGY , *FUZZY systems - Abstract
It is widely recognised by decision analysts that real decision-makers always make estimates in an interval form. An overview is presented of techniques for finding an optimal alternative among alternatives described by imprecise and interval probabilities. Scalarisation methods are outlined as the most appropriate. A natural continuation of such techniques is fuzzy rational (FR) decision analysis. A detailed account is given of the elicitation process as influenced by fuzzy rationality. The interval character of the probabilities leads to the introduction of ribbon functions, whose general form and special cases are compared with p-boxes. As demonstrated, the approximation of utilities in FR decision analysis does not depend on the probabilities, but the approximation of probabilities does depend on preferences. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
41. Framing of imprecise quantities: When are lower interval bounds preferred to upper bounds?
- Author
-
Halberg, Anne-Marie and Teigen, Karl Halvor
- Subjects
PHYSICAL constants ,QUANTITATIVE research ,NEGATION (Logic) ,COMMUNICATION ,PRAGMATICS - Abstract
Imprecisely known quantities (e.g., predictions) are often described in approximate terms as “more than X” or “less than Y” (e.g., “Ann will earn more than $50 000” or “less than $60 000”). Such phrases carry both quantitative and qualitative (pragmatic) information. Three studies are reported showing that lower limit estimates (more than, over, minimum) are generally more frequent, and considered more appropriate than upper limit estimates (less than, under, maximum) over a wide range of contexts. This is partly due to scalar properties of the number system, where lower numbers are attained before, and included in higher numbers, but not vice versa. As a result, upper limit statements are perceived as negations, and carry more specific information about the speaker's communicative concerns. Upper limit statements are preferred with amounts or quantities that can be perceived as small, whereas lower limit statements can be used both to indicate large quantities and as a default. Copyright © 2009 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
42. Maximum vs. minimum values: Preferences of speakers and listeners for upper and lower limit estimates
- Author
-
Halberg, Anne-Marie, Teigen, Karl Halvor, and Fostervold, Knut Inge
- Subjects
- *
UNCERTAINTY , *PREFERENCES (Philosophy) , *ESTIMATES , *MAXIMA & minima , *PRICES , *DISTANCES - Abstract
Estimates about uncertain quantities can be expressed in terms of lower limits (more than X, minimum X) or upper limits (less than Y, maximum Y). It has been shown that lower limit statements generally occur much more often than upper limit statements. However, in a conversational context, preferences for upper and lower limit statements will be moderated by the concerns of the interlocutors. We report three studies asking speakers and listeners about their preferences for lower and upper limit statements in the domains of distances, durations, and prices. It appears that travellers prefer information about maximum distances and maximum durations, and buyers (but not sellers) prefer to be told about maximum prices and maximum delivery times. Mistaken maxima are at the same time regarded as more "wrong" than mistaken minima. However, this preference for "worst case" information is not necessarily shared by providers of information (advisors), who are also concerned about being blamed if wrong. [Copyright © Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
43. On percentile norms in neuropsychology: Proposed reporting standards and methods for quantifying the uncertainty over the percentile ranks of test scores.
- Author
-
Crawford, John R., Garthwaite, Paul H., and Slick, Daniel J.
- Subjects
- *
NEUROPSYCHOLOGICAL tests , *NEUROPSYCHOLOGY , *COMPUTER software , *UNCERTAINTY , *PSYCHOPHYSIOLOGY - Abstract
Normative data for neuropsychological tests are often presented in the form of percentiles. One problem when using percentile norms stems from uncertainty over the definitional formula for a percentile. (There are three co-existing definitions and these can produce substantially different results.) A second uncertainty stems from the use of a normative sample to estimate the standing of a raw score in the normative population. This uncertainty is unavoidable but its extent can be captured using methods developed in the present paper. A set of reporting standards for the presentation of percentile norms in neuropsychology is proposed. An accompanying computer program (available to download) implements these standards and generates tables of point and interval estimates of percentile ranks for new or existing normative data. [ABSTRACT FROM AUTHOR]
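One simple classical route to such point and interval estimates is sketched below in Python. It uses a mid-probability percentile-rank definition and a Clopper-Pearson binomial interval; this is only one of the co-existing conventions the abstract alludes to and is not claimed to be the authors' exact algorithm. The normative sample is simulated.

```python
# Percentile rank of a raw score from a normative sample, with a sampling
# interval. Mid-probability definition + Clopper-Pearson interval (one of
# several possible choices, not necessarily the paper's).
import numpy as np
from scipy.stats import beta

def percentile_rank_with_ci(norm_sample, score, conf=0.95):
    norm_sample = np.asarray(norm_sample)
    n = norm_sample.size
    below = np.sum(norm_sample < score)
    ties = np.sum(norm_sample == score)
    k = below + 0.5 * ties                 # count half of the tied scores
    point = 100.0 * k / n
    a = 1.0 - conf
    lo = 100.0 * beta.ppf(a / 2, k, n - k + 1) if k > 0 else 0.0
    hi = 100.0 * beta.ppf(1 - a / 2, k + 1, n - k) if k < n else 100.0
    return point, lo, hi

rng = np.random.default_rng(1)
norms = rng.normal(50, 10, size=800)       # stand-in for a normative sample
print(percentile_rank_with_ci(norms, 35))  # (point, lower, upper) in percent
```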
- Published
- 2009
- Full Text
- View/download PDF
44. Confidence Intervals for a New Characteristic of Central Tendency of Distributions.
- Author
-
Fabián, Zdeněk
- Subjects
- *
CONFIDENCE intervals , *STATISTICAL sampling , *RANDOM variables , *DISTRIBUTION (Probability theory) , *INTERVAL analysis - Abstract
The t-mean is a new characteristic of the central tendency of continuous distributions. In this article, we introduce a t-difference in the sample space, which is used for the construction of confidence intervals for the t-mean. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
45. Bayes' theorem and diagnostic tests in neuropsychology: Interval estimates for post-test probabilities.
- Author
-
Crawford, John R., Garthwaite, Paul H., and Betkowska, Karolina
- Subjects
- *
BAYES' theorem , *NEUROPSYCHOLOGICAL tests , *SENSITIVITY & specificity (Statistics) , *MONTE Carlo method , *PROBABILITY measures - Abstract
Most neuropsychologists are aware that, given the specificity and sensitivity of a test and an estimate of the base rate of a disorder, Bayes' theorem can be used to provide a post-test probability for the presence of the disorder given a positive test result (and a post-test probability for the absence of a disorder given a negative result). However, in the standard application of Bayes' theorem the three quantities (sensitivity, specificity, and the base rate) are all treated as fixed, known quantities. This is very unrealistic as there may be considerable uncertainty over these quantities and therefore even greater uncertainty over the post-test probability. Methods of obtaining interval estimates on the specificity and sensitivity of a test are set out. In addition, drawing and extending upon work by Mossman and Berger (2001), a Monte Carlo method is used to obtain interval estimates for post-test probabilities. All the methods have been implemented in a computer program, which is described and made available (www.abdn.ac.uk/~psy086/dept/BayesPTP.htm). When objective data on the base rate are lacking (or have limited relevance to the case at hand) the program elicits opinion for the pre-test probability. [ABSTRACT FROM AUTHOR]
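The Monte Carlo idea can be sketched in a few lines of Python: draw sensitivity, specificity, and base rate from Beta posteriors, push each draw through Bayes' theorem, and read the interval off the resulting distribution. The Beta parameters below are invented validation-study counts, not values from the paper.

```python
# Interval estimate for a post-test probability by Monte Carlo propagation
# of uncertainty in sensitivity, specificity, and base rate.
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

# Beta(successes+1, failures+1) posteriors, e.g. from a validation study with
# 45/50 true positives, 90/100 true negatives, 20/200 base-rate cases (invented)
sens = rng.beta(45 + 1, 5 + 1, N)
spec = rng.beta(90 + 1, 10 + 1, N)
base = rng.beta(20 + 1, 180 + 1, N)

# Bayes' theorem for a positive test result, applied draw by draw
post = sens * base / (sens * base + (1 - spec) * (1 - base))

lo, mid, hi = np.percentile(post, [2.5, 50, 97.5])
print(f"post-test P(disorder | +) = {mid:.2f} (95% interval {lo:.2f}-{hi:.2f})")
```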
- Published
- 2009
- Full Text
- View/download PDF
46. More than, less than, or minimum, maximum: how upper and lower bounds determine subjective interval estimates.
- Author
-
Teigen, Karl Halvor, Halberg, Anne-Marie, and Fostervold, Knut Inge
- Subjects
INTERVAL measurement ,UNCERTAINTY ,CONFIDENCE intervals ,AIRLINE tickets ,PRICES - Abstract
Uncertain quantities can be described by single-point estimates of lower interval bounds (X1), upper interval bounds (X2), two-bound estimates (separate estimates of X1 and X2), and by ranges (X1-X2). A price estimation task showed that single-bound estimates phrased as "T costs more than X1" and "T costs less than X2" yielded much larger intervals than "minimum X1" and "maximum X2." This difference can be attributed to exclusive interpretations of X1 and X2 in the first case (X1 and X2 are unlikely values), and inclusive interpretations in the second (X1 and X2 are likely values). This pattern of results was replicated in other domains where participants estimated single targets. When they estimated a distribution of targets, the pattern was reversed. "Minimum" and "maximum" values of variable quantities (e.g., flight prices) were found to delimit larger intervals than "more than" and "less than" estimates. Copyright © 2006 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
47. Application of the entropic coefficient for interval number optimization during interval assessment
- Author
-
A. N. Tynynyka
- Subjects
grouping intervals number ,interval estimates ,Statistics ,Interval (graph theory) ,Applied mathematics ,Rayleigh distribution ,lcsh:Electrical engineering. Electronics. Nuclear engineering ,entropy coefficient ,lcsh:TK1-9971 ,Mathematics - Abstract
In solving many statistical problems, the most precise possible choice of the distribution law of the observed random variable is required. This choice requires the construction of an interval series, so the problem arises of assigning an optimal number of grouping intervals, and a number of formulas have been proposed for solving it. Which of these formulas solves the problem more accurately? In [9] this question is investigated using the Pearson criterion. This article describes the procedure and, on its basis, evaluates formulas available in the literature and proposes new formulas using the entropy coefficient. A comparison is made with the previously published results of applying Pearson's goodness-of-fit criterion for these purposes. Differences in the accuracy estimates of the formulas are found, and the proposed new formulas for calculating the number of intervals show the best results. Calculations compare the performance of the same formulas when the sample data follow the normal law and the Rayleigh law.
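A hedged sketch of the mechanics follows: the entropy coefficient is estimated from an m-bin histogram and compared across a few standard interval-count formulas. The candidate formulas and the normal test sample are illustrative; the specific new formulas proposed in the article are not reproduced here.

```python
# Entropy coefficient k = exp(H) / (2*sigma) estimated from an m-bin histogram,
# compared across common formulas for the number of grouping intervals.
import numpy as np

def entropy_coefficient(x, m):
    counts, edges = np.histogram(x, bins=m)
    p = counts / counts.sum()
    width = edges[1] - edges[0]                  # equal-width bins
    p = p[p > 0]
    H = -np.sum(p * np.log(p)) + np.log(width)   # differential entropy estimate
    return np.exp(H) / (2.0 * x.std(ddof=1))

rng = np.random.default_rng(3)
x = rng.normal(size=500)
n = x.size

candidates = {
    "Sturges":        int(np.ceil(1 + np.log2(n))),
    "cube-root rule": int(np.ceil(n ** (1 / 3))),
    "sqrt(n)":        int(np.ceil(np.sqrt(n))),
}
# For a normal sample the theoretical value is sqrt(2*pi*e)/2 ~= 2.066; closeness
# to that value serves here as one illustrative figure of merit for a formula.
for name, m in candidates.items():
    print(f"{name:15s} m={m:3d}  k={entropy_coefficient(x, m):.3f}")
```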
- Published
- 2017
48. Interval estimates of the solution of a statistical model of management
- Author
-
Tsibriy L. V. and Muliar S. S.
- Subjects
stochastic programming ,multiple regression ,simulation model ,interval estimates ,lcsh:TA1-2040 ,statistical model ,lcsh:Architecture ,lcsh:Engineering (General). Civil engineering (General) ,optimal management ,lcsh:NA1-9428 - Abstract
Annotation. Goal. To create a model for managing a complex system based on statistical data and to find the optimal management solution. Method. The construction of a statistical model of a complex system that reflects the whole range of its parameters and relationships is the subject of many studies. All of them are based on methods of mathematical statistics and lead to a multiple regression model [1,2,4]. The aim of the simulation is not only to evaluate the parameters of the system's operation but also to manage it, which leads to the need to solve a stochastic programming problem [3]. There is no single analytical method for solving such problems. The paper outlines an approach that allows the problem of finding the best control to be solved with a simulation model, by treating the interval regression estimates as the random effects provided for by the stochastic programming formulation. This yields more than the point estimates of the deterministic optimal solution that arise when the stochastic programming problem is reduced to a nonlinear programming problem. Results. The proposed method yields interval estimates of the controlled variables and of the target function of the optimization problem, corresponding to the interval estimates of the regression of the explained parameters on the explanatory parameters. Scientific novelty. A method for solving the control problem of a complex system is proposed that takes into account the stochastic nature of the model and finds not only point but also interval estimates of the optimal solution. Practical significance. Interval estimates of the solution of the statistical model, which take into account the random spread of the statistical data, are necessary for making the right decision when choosing the parameters of the system being designed, allowing possible undesirable random effects to be taken into account.
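The following Python sketch shows the general shape of such an approach: coefficient uncertainty from the regression is carried through the optimization by repeatedly drawing coefficients from their interval estimates and re-solving, so the optimal control and objective are themselves reported as intervals. The quadratic model, bounds, and coefficient intervals are invented for illustration and are not the paper's.

```python
# Interval estimate of an optimal control by propagating interval estimates
# of regression coefficients through the optimization.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)

# Suppose regression gave y = b0 + b1*u + b2*u**2 with these coefficient
# intervals (invented); b2 < 0 makes the objective concave in u.
b_low = np.array([9.0, 3.5, -1.2])
b_high = np.array([11.0, 4.5, -0.8])

opt_u, opt_y = [], []
for _ in range(2000):
    b = rng.uniform(b_low, b_high)        # a draw within the interval estimates
    res = minimize_scalar(lambda u: -(b[0] + b[1] * u + b[2] * u**2),
                          bounds=(0, 10), method="bounded")
    opt_u.append(res.x)
    opt_y.append(-res.fun)

print("optimal control u*: [%.2f, %.2f]" % (np.min(opt_u), np.max(opt_u)))
print("optimal objective : [%.2f, %.2f]" % (np.min(opt_y), np.max(opt_y)))
```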
- Published
- 2017
49. Estimating blood vessel areas in ultrasound images using a deformable template model.
- Author
-
Husby, Oddvar and Rue, Håvard
- Subjects
- *
MEDICAL imaging systems , *STOCHASTIC processes , *ALGORITHMS , *MARKOV processes , *MANAGEMENT science , *PROBABILITY theory - Abstract
We consider the problem of obtaining interval estimates of vessel areas from ultrasound images of cross sections through the carotid artery. Robust and automatic estimates of the cross-sectional area are of medical interest and of help in diagnosing atherosclerosis, which is caused by plaque deposits in the carotid artery. We approach this problem by using a deformable template to model the blood vessel outline, and use recent developments in ultrasound science to model the likelihood. We demonstrate that by using an explicit model for the outline, we can easily adjust for an important feature in the data: strong edge reflections called specular reflection. The posterior is challenging to explore, and naive standard MCMC algorithms simply converge too slowly. To obtain an efficient MCMC algorithm we make extensive use of computationally efficient Gaussian Markov random fields, and use various block-sampling constructions that jointly update large parts of the model. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
50. Posterior probability intervals for wavelet thresholding.
- Author
-
Barber, Stuart, Nason, Guy P., and Silverman, Bernard W.
- Subjects
BAYESIAN analysis ,CUSUM technique ,CURVE fitting ,INTERVAL analysis ,WAVELETS (Mathematics) - Abstract
We use cumulants to derive Bayesian credible intervals for wavelet regression estimates. The first four cumulants of the posterior distribution of the estimates are expressed in terms of the observed data and integer powers of the mother wavelet functions. These powers are closely approximated by linear combinations of wavelet scaling functions at an appropriate finer scale. Hence, a suitable modification of the discrete wavelet transform allows the posterior cumulants to be found efficiently for any given data set. Johnson transformations then yield the credible intervals themselves. Simulations show that these intervals have good coverage rates, even when the underlying function is inhomogeneous, where standard methods fail. In the case where the curve is smooth, the performance of our intervals remains competitive with established nonparametric regression methods. [ABSTRACT FROM AUTHOR]
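A deliberately crude Python sketch of pointwise posterior intervals for a wavelet fit follows. It replaces the paper's cumulant-and-Johnson-transformation machinery with independent conjugate normal posteriors per detail coefficient (known noise level, Gaussian prior), so it shows only the overall shape of the computation; it assumes the PyWavelets package.

```python
# Simplified pointwise credible intervals for a wavelet regression estimate:
# conjugate Gaussian posterior per detail coefficient, then posterior sampling.
import numpy as np
import pywt

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 256)
signal = np.sin(4 * np.pi * t) + (t > 0.5)       # inhomogeneous test curve
y = signal + rng.normal(0, 0.2, t.size)

wavelet, sigma, tau = "db4", 0.2, 1.0            # noise sd and prior sd (assumed known)
coeffs = pywt.wavedec(y, wavelet)

# Conjugate posterior per detail coefficient: N(shrink * c, shrink * sigma^2);
# the coarse approximation coefficients are held fixed for simplicity.
shrink = tau**2 / (tau**2 + sigma**2)
post_mean = [coeffs[0]] + [shrink * c for c in coeffs[1:]]
post_sd = [np.zeros_like(coeffs[0])] + \
          [np.full_like(c, np.sqrt(shrink) * sigma) for c in coeffs[1:]]

# Sample whole curves from the posterior, then take pointwise quantiles
draws = np.array([
    pywt.waverec([m + s * rng.standard_normal(m.shape)
                  for m, s in zip(post_mean, post_sd)], wavelet)
    for _ in range(500)
])
lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)
coverage = np.mean((signal >= lo) & (signal <= hi))
print(f"mean width {np.mean(hi - lo):.3f}, coverage of true curve {coverage:.1%}")
```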
- Published
- 2002
- Full Text
- View/download PDF