32 results for "uncertainty intervals"
Search Results
2. Construction and evaluation of a practical model for measuring health-adjusted life expectancy (HALE) in China
- Author
-
San Qian Chen, Yu Cao, Jing Jie Ma, Xing Chao Zhang, and Song Bo Hu
- Subjects
China, Years lived with disability rate, Health-adjusted life expectancy, Practical model, Uncertainty intervals, Public aspects of medicine, RA1-1270 - Abstract
Background: HALE is now a regular strategic planning indicator for all levels of the Chinese government. However, HALE measurement necessitates comprehensive data collection and intricate technology. Therefore, effectively converting numerous diseases into the years lived with disability (YLD) rate is a significant challenge for HALE measurement. Our study aimed to construct a simple YLD rate measurement model with high applicability, based on the actual data resources currently available within China, to address challenges in measuring HALE target values during planning. Methods: First, based on the Chinese YLD rate in the Global Burden of Disease (GBD) 2019, Pearson correlation analysis, the global optimum method, and related techniques were utilized to screen the best predictor variables from current Chinese data resources. Missing data for predictor variables were filled in via spline interpolation. Then, multiple linear regression models were fitted to construct the YLD rate measurement model. The Sullivan method was used to measure HALE. The Monte Carlo method was employed to generate 95% uncertainty intervals. Finally, model performance was assessed using the mean absolute error (MAE) and mean absolute percentage error (MAPE). Results: A three-input-parameter model was constructed to measure age-specific YLD rates by sex in China, directly using the incidence of infectious diseases and the incidence of chronic diseases among persons aged 15 and older, with an under-five mortality rate covariate added. The total MAE and MAPE for the combined YLD rate were 0.0007 and 0.5949%, respectively. The MAE and MAPE of the combined HALE in the 0-year-old group were 0.0341 and 0.0526%, respectively; errors were slightly lower for males (0.0197, 0.0311%) than for females (0.0501, 0.0755%). Conclusion: We constructed a high-accuracy model to measure the YLD rate in China using three indicators from routine Chinese national monitoring as predictor variables. The model provides a realistic and feasible solution for measuring HALE at the national and especially the regional level given limited data.
- Published
- 2024
- Full Text
- View/download PDF
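The Monte Carlo step described in the abstract above (95% uncertainty intervals around a regression-based estimate) can be sketched as follows; the toy data, model size, and variable names are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: a YLD-rate-like response explained by three routine indicators.
n = 200
X = np.column_stack([np.ones(n), rng.uniform(size=(n, 3))])
beta_true = np.array([0.05, 0.02, 0.03, 0.01])
y = X @ beta_true + rng.normal(scale=0.005, size=n)

# Ordinary least squares fit and estimated coefficient covariance.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - X.shape[1])
cov_beta = sigma2 * np.linalg.inv(X.T @ X)

# Monte Carlo: resample coefficients, recompute the predicted rate for a
# given input profile, and report the 2.5th/97.5th percentiles as a 95% UI.
x_new = np.array([1.0, 0.4, 0.6, 0.2])
draws = rng.multivariate_normal(beta_hat, cov_beta, size=10_000) @ x_new
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"prediction: {beta_hat @ x_new:.4f}  95% UI: [{lo:.4f}, {hi:.4f}]")
```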
3. Construction and evaluation of a practical model for measuring health-adjusted life expectancy (HALE) in China.
- Author
-
Chen, San Qian, Cao, Yu, Ma, Jing Jie, Zhang, Xing Chao, and Hu, Song Bo
- Subjects
LIFE expectancy, INDEPENDENT variables, MONTE Carlo method, PEARSON correlation (Statistics), GLOBAL burden of disease - Abstract
Background: HALE is now a regular strategic planning indicator for all levels of the Chinese government. However, HALE measurements necessitate comprehensive data collection and intricate technology. Therefore, effectively converting numerous diseases into the years lived with disability (YLD) rate is a significant challenge for HALE measurements. Our study aimed to construct a simple YLD rate measurement model with high applicability based on the current situation of actual data resources within China to address challenges in measuring HALE target values during planning. Methods: First, based on the Chinese YLD rate in the Global Burden of Disease (GBD) 2019, Pearson correlation analysis, the global optimum method, and related techniques were utilized to screen the best predictor variables from the current Chinese data resources. Missing data for predictor variables were filled in via spline interpolation. Then, multiple linear regression models were fitted to construct the YLD rate measurement model. The Sullivan method was used to measure HALE. The Monte Carlo method was employed to generate 95% uncertainty intervals. Finally, model performances were assessed using the mean absolute error (MAE) and mean absolute percentage error (MAPE). Results: A three-input-parameter model was constructed to measure the age-specific YLD rates by sex in China, directly using the incidence of infectious diseases, the incidence of chronic diseases among persons aged 15 and older, and the addition of an under-five mortality rate covariate. The total MAE and MAPE for the combined YLD rate were 0.0007 and 0.5949%, respectively. The MAE and MAPE of the combined HALE in the 0-year-old group were 0.0341 and 0.0526%, respectively; errors were slightly lower for males (0.0197, 0.0311%) than for females (0.0501, 0.0755%). Conclusion: We constructed a high-accuracy model to measure the YLD rate in China by using three monitoring indicators from the Chinese national routine as predictor variables. The model provides a realistic and feasible solution for measuring HALE at the national and especially regional levels, considering limited data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Multi-objective calibration and uncertainty analysis for the event-based modelling of flash floods.
- Author
-
Usman, Muhammad Nabeel, Leandro, Jorge, Broich, Karl, and Disse, Markus
- Subjects
CALIBRATION, STANDARD deviations, ROBUST optimization, FLOODS - Abstract
This study investigates the best approach to calibrate an event-based conceptual Hydrologiska Byråns Vattenbalansavdelning (HBV) model, comparing different trials of single-objective, single-event multi-objective (SEMO), and multi-event multi-objective (MEMO) model calibrations using root mean squared error (RMSE), the Nash-Sutcliffe model efficiency coefficient (NSE), and Bias as objective functions. Model performance was validated for several peak events via 90% confidence interval (CI) based uncertainty quantification of the relative error of discharges. Multi-objective optimization yielded more accurate and robust solutions than single-objective calibrations. Ensembles of Pareto solutions from the multi-objective calibrations better characterized the flood peaks within the uncertainty intervals. MEMO calibration exhibited lower uncertainties and better prediction of peak events than SEMO calibration. Moreover, the MEMO_6D (six-dimensional) approach outperformed SEMO_3D and MEMO_3D in capturing the larger peak events. This study suggests that MEMO_6D is the best approach for predicting large flood events with lower model output uncertainties when the calibration is performed with a well-chosen combination of peak events. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
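The three objective functions named in the abstract above (RMSE, NSE, Bias) are standard and compact enough to state directly; a minimal sketch with made-up discharge values:

```python
import numpy as np

def rmse(obs, sim):
    """Root mean squared error between observed and simulated discharge."""
    return float(np.sqrt(np.mean((np.asarray(obs) - np.asarray(sim)) ** 2)))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 is no better than mean(obs)."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return float(1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

def bias(obs, sim):
    """Mean error (simulated minus observed); 0 means no systematic offset."""
    return float(np.mean(np.asarray(sim) - np.asarray(obs)))

# Hypothetical observed vs simulated discharges for one flood event.
obs = [5.0, 7.2, 12.5, 30.1, 18.4, 9.3]
sim = [4.6, 8.0, 11.9, 27.5, 19.8, 9.0]
print(f"RMSE={rmse(obs, sim):.3f}  NSE={nse(obs, sim):.3f}  Bias={bias(obs, sim):.3f}")
```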
5. A Fuzzy Logic Approach to Remaining Useful Life Estimation of Ball Bearings
- Author
-
Witczak, Marcin, Lipiec, Bogdan, Mrugalski, Marcin, Stetter, Ralf, Kacprzyk, Janusz, Series Editor, Pal, Nikhil R., Advisory Editor, Bello Perez, Rafael, Advisory Editor, Corchado, Emilio S., Advisory Editor, Hagras, Hani, Advisory Editor, Kóczy, László T., Advisory Editor, Kreinovich, Vladik, Advisory Editor, Lin, Chin-Teng, Advisory Editor, Lu, Jie, Advisory Editor, Melin, Patricia, Advisory Editor, Nedjah, Nadia, Advisory Editor, Nguyen, Ngoc Thanh, Advisory Editor, Wang, Jun, Advisory Editor, Bartoszewicz, Andrzej, editor, and Kabziński, Jacek, editor
- Published
- 2020
- Full Text
- View/download PDF
6. General noise support vector regression with non-constant uncertainty intervals for solar radiation prediction
- Author
-
J. Prada and J. R. Dorronsoro
- Subjects
Support vector regression, General noise model, Naive online R minimization algorithm (NORMA), Uncertainty intervals, Clustering, Solar energy, Production of electric energy or power. Powerplants. Central stations, TK1001-1841, Renewable energy sources, TJ807-830 - Abstract
General noise cost functions have recently been proposed for support vector regression (SVR). When applied to tasks whose underlying noise distribution is similar to the one assumed for the cost function, these models should perform better than classical ε-SVR. On the other hand, uncertainty estimates for SVR have received somewhat limited attention in the literature until now and still have unaddressed problems. Keeping this in mind, three main goals are addressed here. First, we propose a framework that uses a combination of general noise SVR models with the naive online R minimization algorithm (NORMA) as the optimization method, and then gives non-constant error intervals dependent upon input data, aided by the use of clustering techniques. We give the theoretical details required to implement this framework for Laplace, Gaussian, Beta, Weibull and Marshall–Olkin generalized exponential distributions. Second, we test the proposed framework on two real-world regression problems using data from two public competitions about solar energy. Results show the validity of our models and an improvement over classical ε-SVR. Finally, in accordance with the principle of reproducible research, we make sure that the data and model implementations used for the experiments are easily and publicly accessible.
- Published
- 2018
- Full Text
- View/download PDF
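A minimal sketch of the clustering idea in the abstract above, non-constant error intervals built from per-cluster residual quantiles, using scikit-learn's standard ε-SVR as a stand-in for the paper's general noise SVR with NORMA (all data and hyperparameters are illustrative):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic data with input-dependent (heteroscedastic) noise.
X = rng.uniform(0, 10, size=(500, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.05 + 0.05 * X[:, 0])

model = SVR(C=10.0, epsilon=0.01).fit(X, y)
resid = y - model.predict(X)

# Cluster the inputs and estimate an interval per cluster from residual
# quantiles, so the error bars vary with the input region.
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
bounds = {c: np.percentile(resid[km.labels_ == c], [5, 95]) for c in range(5)}

X_test = np.array([[1.0], [9.0]])
for x, pred, c in zip(X_test, model.predict(X_test), km.predict(X_test)):
    lo, hi = bounds[c]
    print(f"x={x[0]:.1f}  pred={pred:.3f}  90% interval=({pred+lo:.3f}, {pred+hi:.3f})")
```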
7. General Noise SVRs and Uncertainty Intervals
- Author
-
Prada, Jesus, Dorronsoro, Jose Ramon, Hutchison, David, Series editor, Kanade, Takeo, Series editor, Kittler, Josef, Series editor, Kleinberg, Jon M., Series editor, Mattern, Friedemann, Series editor, Mitchell, John C., Series editor, Naor, Moni, Series editor, Pandu Rangan, C., Series editor, Steffen, Bernhard, Series editor, Terzopoulos, Demetri, Series editor, Tygar, Doug, Series editor, Weikum, Gerhard, Series editor, Rojas, Ignacio, editor, Joya, Gonzalo, editor, and Catala, Andreu, editor
- Published
- 2017
- Full Text
- View/download PDF
8. Interval Analysis of the Loss of Life Caused by Dam Failure.
- Author
-
Ge, Wei, Wang, Xiuwei, Li, Zongkun, Zhang, Hexiang, Guo, Xinyan, Wang, Te, Gao, Weixing, Lin, Chaoning, and van Gelder, Pieter
- Subjects
DAM failures, INTERVAL analysis, AGE distribution, SOCIAL factors, MAGNITUDE (Mathematics) - Abstract
Both hydrodynamic factors and social factors have large impacts on the loss of life caused by dam failure. Relatively large uncertainty intervals of the influencing factors lead to changes in the potential loss of life. Based on an analysis of the formation mechanism of loss of life, the influencing factors were identified. Combined with interval theory, a method for calculating loss of life and determining the impacts of the influencing factors on loss of life was proposed. The intervals of the exposure rate of the population at risk and the mortality of the exposed population, which are impacted by the major influencing factors such as the flood severity, warning time, understanding of dam failure, and building vulnerability, were recommended. Furthermore, a range of correction coefficients caused by the minor influencing factors, such as the dam failure time, rescue ability, and age distribution, was analyzed. The proposed method was validated by analyzing the losses of life in 21 flooded regions after 10 dam failure events and 2 flash river floods, in which the intervals of the estimated results all contained the actual loss of life. In addition, the ratios of the upper bounds to the corresponding lower bounds of the intervals were all less than 10, which is in accordance with the characteristic that the results of different existing methods vary within an order of magnitude. This is the first work that pays careful attention to the uncertainty intervals of loss of life estimates, and the proposed method effectively determined the severity of the potential loss of life caused by dam failure. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
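The interval propagation underlying the abstract above can be illustrated with plain interval arithmetic: a loss-of-life interval is the population at risk times the exposure-rate, mortality, and correction intervals. All numbers below are hypothetical:

```python
def interval_mul(a, b):
    """Product of two intervals (all bounds here are non-negative)."""
    return (a[0] * b[0], a[1] * b[1])

population_at_risk = 20_000
exposure_rate = (0.30, 0.60)  # fraction of the population at risk exposed
mortality = (0.02, 0.05)      # mortality of the exposed population
correction = (0.9, 1.1)       # minor factors: failure time, rescue ability, ...

lol = interval_mul(interval_mul(exposure_rate, mortality), correction)
lower, upper = population_at_risk * lol[0], population_at_risk * lol[1]
# The abstract notes that upper/lower ratios stayed below an order of magnitude.
print(f"loss of life interval: [{lower:.0f}, {upper:.0f}], ratio {upper/lower:.1f}")
```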
9. One-class model with two decision thresholds for the rapid detection of cashew nuts adulteration by other nuts
- Author
-
Rovira G; Miaw CSW; Martins MLC; Sena MM; de Souza SVC; Callao MP; Ruisánchez I (Química Analítica i Química Orgànica, Universitat Rovira i Virgili)
- Abstract
A green screening method to determine cashew nut adulteration with Brazilian nut, pecan nut, macadamia nut and peanut was proposed. The method was based on the development of a one-class soft independent modelling of class analogy (SIMCA) model for non-adulterated cashew nuts using near-infrared (NIR) spectra obtained with portable equipment. Once the model is established, the assignment of unknown samples depends on the threshold established for the authentic class, which is a key aspect in any screening approach. The authors innovatively propose defining two thresholds: a lower model distance limit and an upper model distance limit. Samples with distances below the lower threshold are assigned as non-adulterated with a 100% probability; samples with distance values greater than the upper threshold are assigned as adulterated with a 100% probability; and samples with distances between these two thresholds are considered uncertain and should be submitted to a confirmatory analysis. Thus, the possibility of error in the sample assignment significantly decreases. In the present study, when just one threshold was defined, values greater than 95% were obtained for both selectivity and specificity with the optimized threshold. When two class thresholds were defined, the percentage of samples with uncertain assignment changed according to the adulterant considered, most notably for peanuts, for which 0% uncertain samples were obtained. Considering all adulterants, the number of samples that had to be submitted to a confirmatory analysis was quite low: 5 of 224 adulterated samples and 3 of 56 non-adulterated samples.
- Published
- 2023
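The two-threshold decision rule proposed in the abstract above reduces to a simple three-way assignment on the model distance; a sketch with hypothetical distances and thresholds:

```python
def assign(distance, lower, upper):
    """Two-threshold one-class decision rule.

    Below `lower`: accept as non-adulterated; above `upper`: reject as
    adulterated; in between: uncertain, send to confirmatory analysis.
    """
    if distance < lower:
        return "non-adulterated"
    if distance > upper:
        return "adulterated"
    return "uncertain -> confirmatory analysis"

# Hypothetical SIMCA distances of unknown samples to the cashew-nut class.
for d in (0.4, 1.1, 2.3):
    print(f"distance {d}: {assign(d, lower=0.8, upper=1.5)}")
```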
10. Robust UC model based on multi‐band uncertainty set considering the temporal correlation of wind/load prediction errors.
- Author
-
Chen, Yanbo, Zhang, Zhi, Chen, Hao, and Zheng, Huiping
- Abstract
With the increasing proportion of wind power connected to the grid, power system dispatching faces more and more challenges from uncertainty. To cope with this uncertainty, robust optimization has been applied to the unit commitment (UC) problem. In this paper, a multi-band uncertainty set considering the temporal correlation (MBUSCTC) of wind/load prediction errors is proposed, which has two characteristics: (1) the MBUSCTC rigorously and realistically reflects the distribution characteristics of uncertainties within the uncertainty intervals, thereby effectively reducing the conservatism of the traditional single-band uncertainty set; (2) the temporal correlation constraints on wind power/load prediction errors in the MBUSCTC limit realizations of uncertainty that fluctuate frequently within the uncertainty intervals, thereby eliminating low-probability scenarios from the uncertainty set. The proposed MBUSCTC is then applied to the UC problem, leading to a robust UC model based on the MBUSCTC, which is solved by the Benders decomposition and column-and-constraint generation (C&CG) methods. Finally, case studies based on the modified IEEE-118 bus system and an actual power system in China demonstrate that the proposed method can effectively reduce the conservativeness of the robust UC model and ensure the robustness of the unit commitment solution. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
11. Causal inference accounting for unobserved confounding after outcome regression and doubly robust estimation.
- Author
-
Genbäck, Minna and de Luna, Xavier
- Subjects
INGESTION, UNCERTAINTY, HYPOTHESIS - Abstract
Causal inference with observational data can be performed under an assumption of no unobserved confounders (unconfoundedness assumption). There is, however, seldom clear subject‐matter or empirical evidence for such an assumption. We therefore develop uncertainty intervals for average causal effects based on outcome regression estimators and doubly robust estimators, which provide inference taking into account both sampling variability and uncertainty due to unobserved confounders. In contrast with sampling variation, uncertainty due to unobserved confounding does not decrease with increasing sample size. The intervals introduced are obtained by modeling the treatment assignment mechanism and its correlation with the outcome given the observed confounders, allowing us to derive the bias of the estimators due to unobserved confounders. We are thus also able to contrast the size of the bias due to violation of the unconfoundedness assumption, with bias due to misspecification of the models used to explain potential outcomes. This is illustrated through numerical experiments where bias due to moderate unobserved confounding dominates misspecification bias for typical situations in terms of sample size and modeling assumptions. We also study the empirical coverage of the uncertainty intervals introduced and apply the results to a study of the effect of regular food intake on health. An R‐package implementing the inference proposed is available. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
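A deliberately simplified sketch of the uncertainty-interval idea in the abstract above: instead of one confidence interval that assumes unconfoundedness, report the union of intervals over a range of assumed confounding correlations. The bias term rho * sd(residuals) is a placeholder for illustration, not the paper's derivation, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy observational data: covariate x, treatment t, outcome y.
n = 1_000
x = rng.normal(size=n)
t = (x + rng.normal(size=n) > 0).astype(float)
y = 1.5 * t + x + rng.normal(size=n)

# Naive OLS estimate of the treatment effect, adjusting for x only.
X = np.column_stack([np.ones(n), t, x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
se = np.sqrt(resid.var(ddof=3) * np.linalg.inv(X.T @ X)[1, 1])
est = beta[1]

# Uncertainty interval: union of 95% CIs over assumed confounding
# correlations rho in [-rho_max, rho_max]; this width does not vanish
# as n grows, unlike pure sampling variability.
rho_max, sd = 0.3, resid.std(ddof=3)
lo = est - rho_max * sd - 1.96 * se
hi = est + rho_max * sd + 1.96 * se
print(f"estimate {est:.3f}, 95% CI halfwidth {1.96*se:.3f}, UI [{lo:.3f}, {hi:.3f}]")
```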
12. Uncertainty Interval Expression of Measurement: Possibility Maximum Specificity versus Probability Maximum Entropy Principles
- Author
-
Mauris, Gilles, Hüllermeier, Eyke, editor, Kruse, Rudolf, editor, and Hoffmann, Frank, editor
- Published
- 2010
- Full Text
- View/download PDF
13. Influence of scenario uncertainty in agricultural inputs on life cycle greenhouse gas emissions from agricultural production systems: the case of chemical fertilizers in Japan.
- Author
-
Hayashi, Kiyotada, Makino, Naoki, Shobatake, Koichi, and Hokazono, Shingo
- Subjects
AGRICULTURAL productivity, PRODUCT life cycle, GREENHOUSE gases, FERTILIZERS, UNCERTAINTY (Information theory), ECONOMICS - Abstract
Practical applications of life cycle assessment (LCA) to agricultural production systems require articulating uncertainties caused by scenario indeterminacy, because practitioners do not have sufficient knowledge about agricultural input production processes. However, current understanding about scenario uncertainties is still limited on account of insufficient knowledge. We propose a method to quantify scenario uncertainty in agricultural inputs and to assess the uncertainty in comparative LCA of agricultural production systems. We formulate uncertainty intervals due to scenario indeterminacy (a situation in which decision makers face decision problems without determining the details of the scenarios) and derive uncertainty intervals for conventional, environmentally friendly, and organic rice production systems in Japan, in addition to those for chemical fertilizer production and distribution. The result indicates that, although lack of information on transportation scenarios caused uncertainty, the ranking of global warming potential for conventional, environmentally friendly, and organic rice production systems remained unchanged. It implies that the result is stable with respect to scenario uncertainties due to location of fertilizer production. The proposed methodology for understanding the stability of results can be further developed as a technique to deal with uncertainty and instability in LCA of agricultural production systems. [Copyright © Elsevier]
- Published
- 2014
- Full Text
- View/download PDF
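The scenario-uncertainty check described in the abstract above amounts to computing a (min, max) interval per production system over the scenario set and testing whether the ranking survives; a sketch with invented global warming potential (GWP) numbers:

```python
# Hypothetical GWP results (kg CO2-eq per kg rice) for each production system
# under alternative fertilizer transport scenarios; numbers are made up.
scenarios = {
    "conventional": [1.32, 1.38, 1.45],
    "environmentally friendly": [1.10, 1.16, 1.24],
    "organic": [0.92, 0.99, 1.08],
}

# Uncertainty interval per system = (min, max) over the scenario set.
intervals = {k: (min(v), max(v)) for k, v in scenarios.items()}
for system, (lo, hi) in intervals.items():
    print(f"{system:>25s}: [{lo:.2f}, {hi:.2f}]")

# The ranking is stable under scenario uncertainty if the intervals
# do not overlap once sorted.
ordered = sorted(intervals.values())
stable = all(a[1] < b[0] for a, b in zip(ordered, ordered[1:]))
print("ranking stable under scenario uncertainty:", stable)
```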
14. A Review of Relationships Between Possibility and Probability Representations of Uncertainty in Measurement.
- Author
-
Mauris, Gilles
- Subjects
PARAMETER estimation, PROBABILITY theory, DISPERSION (Chemistry), MAXIMUM entropy method, UNCERTAINTY (Information theory), A priori - Abstract
The main advances over the last decade regarding the deep connections between probability and possibility representations of measurement uncertainty (but not its propagation) are reviewed. They concern the following: the definition of a possibility distribution equivalent to a probability distribution, built from its whole set of dispersion intervals about one point for all probability levels; the bridges with the conventional dispersion parameters; and the representation of partial probability knowledge owing to a maximum specificity principle, better suited than the maximum entropy principle and also related to probability inequalities. The use of a possibility representation for common measurement situations, such as the description of measurement results, measurand estimation, and the expression of a priori uncertainty information, is illustrated and then discussed in view of further processing (propagation and fuzzy inference systems). The conclusion highlights the interest of the possibility approach and points out some remaining issues. [ABSTRACT FROM PUBLISHER]
- Published
- 2013
- Full Text
- View/download PDF
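For a symmetric unimodal distribution, the probability-to-possibility transformation reviewed above has a simple closed form: pi(x) is one minus the probability of the dispersion interval about the mode passing through x. A sketch for a Gaussian measurement error (a simplification of the general construction):

```python
import numpy as np
from scipy.stats import norm

def possibility_from_gaussian(x, mu=0.0, sigma=1.0):
    """Possibility distribution induced by a Gaussian measurement error.

    For a symmetric unimodal density, pi(x) = 1 - P(|X - mu| <= |x - mu|),
    i.e. one minus the probability of the dispersion interval about the
    mode that passes through x.
    """
    return 2.0 * (1.0 - norm.cdf(np.abs(x - mu) / sigma))

# Each alpha-cut of pi is the dispersion interval of probability 1 - alpha,
# so pi stacks every uncertainty interval into a single function.
for x in (0.0, 1.0, 1.96):
    print(f"pi({x}) = {possibility_from_gaussian(x):.3f}")
```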
15. THE NATIONAL BANK OF ROMANIA INFLATION FORECASTS BASED ON ECONOMETRIC MODELS ARE MORE ACCURATE THAN THE TARGET INFLATION.
- Author
-
Marin, Erika and Bratu, Mihaela
- Subjects
INFLATION forecasting, ECONOMETRIC models, MACROECONOMICS, ECONOMIC models - Abstract
The objective of this research is to show that the National Bank of Romania (NBR) follows the international pattern, its inflation rate forecasts based on its own models being more accurate than the target inflation. Starting from quarterly values of annual inflation, for 2012 the forecasts based on the institution's macro-econometric models were more accurate than the annual target fixed for each quarter. The accuracy of the inflation targets set for 2013 was evaluated in an ex-ante variant, choosing as benchmarks the forecasts provided by the NBR and naïve ones. This study introduces some novel accuracy measures to the literature and proposes evaluating the accuracy of uncertainty intervals using only the lower, respectively the upper, limit of each forecast interval. With only some exceptions, the errors based on the lower limits of the uncertainty intervals proposed by the NBR are smaller than those computed using the upper bounds as point forecasts. In the ex-ante variant for 2013, the targets for that year and the NBR forecasts based on econometric models were chosen as possible realizations. If the targeted inflation is taken as the real value of inflation, the upper limits of the intervals are recommended for the first two quarters of 2013, and the lower bounds for the third and fourth quarters. This paper is original research not only in assessing the accuracy of NBR forecasts, but also in proposing new methods of evaluating the accuracy of point forecasts and uncertainty intervals. [ABSTRACT FROM AUTHOR]
- Published
- 2013
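The evaluation proposed in the abstract above, scoring each boundary of an uncertainty interval as if it were a point forecast, is easy to sketch; the inflation figures below are invented:

```python
import numpy as np

def mae(actual, forecast):
    """Mean absolute error of a point forecast."""
    return float(np.mean(np.abs(np.asarray(actual) - np.asarray(forecast))))

# Hypothetical quarterly inflation (%) with forecast uncertainty intervals.
actual = np.array([4.9, 5.3, 4.4, 3.9])
lower = np.array([4.2, 4.5, 3.8, 3.1])
upper = np.array([5.8, 6.1, 5.2, 4.6])

# Accuracy evaluated separately for each boundary used as a point forecast.
print("MAE using lower limits:", mae(actual, lower))
print("MAE using upper limits:", mae(actual, upper))
```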
16. STRATEGY FOR REDUCING UNCERTAINTY INTERVALS APPLIED TO FAULT LOCATION IN DISTRIBUTION SYSTEMS.
- Author
-
Orozco Henao, César Augusto, Mora Flórez, Juan José, and Pérez Londoño, Sandra Milena
- Subjects
ALGORITHMS, MATHEMATICAL ability, NUMERICAL analysis, MATHEMATICAL programming, FAULT location (Engineering), IMPEDANCE control, POWER distribution networks
- Published
- 2012
- Full Text
- View/download PDF
17. AR LATTICE ℓ-STEP AHEAD OUTPUT UNCERTAINTY PREDICTION SCHEME WITH UNCERTAINTY INTERVALS.
- Author
-
NIKOLAKOPOULOS, GEORGE and TZES, ANTHONY
- Subjects
ALGORITHMS, BOX-Jenkins forecasting, COMPUTER engineering, ELECTRICAL engineering, ENGINEERING - Abstract
In this paper, a lattice ℓ-step ahead output uncertainty prediction algorithm is presented. The proposed prediction scheme is applicable to linear, stable, auto-regressive (AR) systems. The uncertainty predictors utilize the structure of the lattice filters. Subject to the saturation limits of the excitation signal, the measurement of the system output, and the a priori bounds of the reflection coefficients, the set of the feasible predicted output variables is computed. The suggested scheme is recursively computed over an ℓ-step ahead future time window, while the prediction uncertainty intervals are similarly computed over the lattice stages, regarding the forward and the backward prediction errors. Simulation results are presented that prove the efficacy of the employed lattice uncertainty predictors. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
18. Intercomparison of flatness measurements of an optical flat at apertures of up to 150 mm in diameter
- Author
-
M Schulz, L Svedova, M Vannoni, D Putland, S Quabis, R Bergmans, F Hungwe, Zbigniew Ramotowski, Petr Křen, M Pérez, Y Kang, P. Balling, E. Prieto, A Küng, M Asar, Antti Lassila, H Piree, G Ehret, and D Williams
- Subjects
uncertainty intervals, aperture, deflectometry, flatness measurements, optics, optical surface, optical flat, EURAMET, uncertainty analysis, lateral resolution, interferometry, metrology, comparison, national metrology institutes, measuring instruments - Abstract
Recently, a scientific comparison of flatness measuring instruments at European National Metrology Institutes (NMIs) was performed in the framework of EURAMET. The specimen was a well-polished optical surface with a maximum measurement aperture of 150 mm in diameter. Here, we present an evaluation concept, which allows the determination of a mean flatness map taking into account different lateral resolutions of the instruments and different orientations of the specimen during measurement. We found that all measurements are in agreement with the mean flatness map within the uncertainty intervals stated by the participants. The aim of this scientific comparison is to specify an appropriate operation and evaluation procedure for future comparisons.
- Published
- 2017
- Full Text
- View/download PDF
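The agreement check described in the abstract above can be sketched by comparing each instrument's height map against the mean flatness map within its stated uncertainty; this simplified version ignores the correlation between a participant and a mean computed from all participants, and all maps are synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical flatness height maps (nm) from four instruments on a common
# line profile, each with a stated expanded uncertainty (k=2), in nm.
true_map = 5.0 * np.sin(np.linspace(0, np.pi, 50))
maps = {f"NMI-{i}": true_map + rng.normal(scale=0.8, size=50) for i in range(4)}
U = {f"NMI-{i}": 2.0 for i in range(4)}

# Reference: unweighted mean flatness map over all participants.
mean_map = np.mean(list(maps.values()), axis=0)

# Agreement: every point of a participant's map should lie within its
# stated uncertainty of the mean map (simplified check).
for name, m in maps.items():
    ok = bool(np.all(np.abs(m - mean_map) <= U[name]))
    print(name, "agrees with mean map within stated uncertainty:", ok)
```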
19. Approximation of the Feasible Parameter Set in worst-case identification of Hammerstein models
- Author
-
Falugi, P., Giarré, L., and Zappa, G.
- Subjects
MATHEMATICAL optimization, REASONING, SYSTEMS theory, NONLINEAR systems - Abstract
The estimation of the Feasible Parameter Set (FPS) for Hammerstein models in a worst-case setting is considered. A bounding procedure is determined both for polytopic and ellipsoidal uncertainties. It consists of projecting the FPS of the extended parameter vector onto suitable subspaces and solving convex optimization problems that provide uncertainty intervals for the model parameters. The bounds obtained are tighter than in previous approaches. [Copyright © Elsevier]
- Published
- 2005
- Full Text
- View/download PDF
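Projecting a polytopic feasible parameter set onto each coordinate axis, the step the abstract above uses to obtain parameter uncertainty intervals, reduces to two linear programs per parameter; a sketch with a hypothetical polytope:

```python
import numpy as np
from scipy.optimize import linprog

# Polytopic feasible parameter set {theta : A @ theta <= b} (hypothetical
# bounds, e.g. from worst-case identification with bounded noise).
A = np.array([[ 1.0,  1.0],
              [-1.0,  1.0],
              [ 0.0, -1.0],
              [ 1.0, -2.0]])
b = np.array([4.0, 2.0, 0.0, 3.0])

# Uncertainty interval of each parameter = its min/max over the polytope,
# i.e. the projection of the FPS onto that coordinate (two LPs each).
for i in range(A.shape[1]):
    c = np.zeros(A.shape[1])
    c[i] = 1.0
    low = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None)] * 2).fun
    high = -linprog(-c, A_ub=A, b_ub=b, bounds=[(None, None)] * 2).fun
    print(f"theta_{i+1} in [{low:.3f}, {high:.3f}]")
```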
20. Data Vagueness and Uncertainties in Urban Heavy-Metal Data Collection.
- Author
-
Hedbrant, J. and Sörme, L.
- Subjects
CITIES & towns, ENVIRONMENTAL monitoring, HEAVY metals, POLLUTION measurement, DATA analysis, UNCERTAINTY, INFORMATION services - Abstract
The use of societal data in environmental research has indicated a need for considering uncertainties of data. Several fundamental conditions for statistical treatment are occasionally not met. The choice is either to use or to ignore the uncertain data. If used, it may impair the quality of the result. If ignored, a possible environmental risk may remain unattended. This article discusses some of the problems encountered with data in urban heavy-metal metabolism, and suggests a method based on uncertainty intervals to consider the uncertainties. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
21. Intercomparison of flatness measurements of an optical flat at apertures of up to 150 mm in diameter
- Subjects
lateral resolution, uncertainty intervals, flatness measurements, comparison, deflectometry, national metrology institutes, EURAMET, measuring instruments, uncertainty analysis - Published
- 2017
- Full Text
- View/download PDF
22. Methods for longitudinal brain imaging studies with dropout
- Author
-
Gorbach, Tetiana
- Abstract
One of the challenges in aging research is to understand the brain mechanisms that underlie cognitive development in older adults. Such aging processes are investigated in longitudinal studies, where the within-individual changes over time are observed. However, several methodological issues exist in longitudinal analyses. One of them is loss of participants to follow-up, which occurs when individuals drop out from the study. Such dropout should be taken into account for valid conclusions from longitudinal investigations, and this is the focus of this thesis. The developed methods are used to explore brain aging and its relation to cognition within the Betula longitudinal study of aging. Papers I and II consider the association between changes in brain structure and cognition. In the first paper, regression analysis is used to establish the statistical significance of brain-cognition associations while accounting for dropout. Paper II develops interval estimators directly for an association as measured by partial correlation, when some data are missing. The estimators of Paper II may be used in longitudinal as well as cross-sectional studies and are not limited to brain imaging. Papers III and IV study functional brain connectivity, which is the statistical dependency between the functions of distinct brain regions. Typically, only brain regions with associations stronger than a predefined threshold are considered connected. However, the threshold is often arbitrarily set and does not reflect the individual differences in the overall connectivity patterns. Paper III proposes a mixture model for brain connectivity without explicit thresholding of associations and suggests an alternative connectivity measure. Paper IV extends the mixture modeling of Paper III to a longitudinal setting with dropout and investigates the impact of ignoring the dropout mechanism on the quality of the inferences made on longitudinal connectivity changes.
- Published
- 2019
23. Methods for longitudinal brain imaging studies with dropout
- Author
-
Gorbach, Tetiana
- Subjects
cognition, uncertainty intervals, sensitivity analysis, Missing data, brain structure, resting-state functional connectivity, aging, nonignorable dropout, Probability Theory and Statistics, pattern-mixture models, MRI - Abstract
One of the challenges in aging research is to understand the brain mechanisms that underlie cognitive development in older adults. Such aging processes are investigated in longitudinal studies, where the within-individual changes over time are observed. However, several methodological issues exist in longitudinal analyses. One of them is loss of participants to follow-up, which occurs when individuals drop out from the study. Such dropout should be taken into account for valid conclusions from longitudinal investigations, and this is the focus of this thesis. The developed methods are used to explore brain aging and its relation to cognition within the Betula longitudinal study of aging. Papers I and II consider the association between changes in brain structure and cognition. In the first paper, regression analysis is used to establish the statistical significance of brain-cognition associations while accounting for dropout. Paper II develops interval estimators directly for an association as measured by partial correlation, when some data are missing. The estimators of Paper II may be used in longitudinal as well as cross-sectional studies and are not limited to brain imaging. Papers III and IV study functional brain connectivity, which is the statistical dependency between the functions of distinct brain regions. Typically, only brain regions with associations stronger than a predefined threshold are considered connected. However, the threshold is often arbitrarily set and does not reflect the individual differences in the overall connectivity patterns. Paper III proposes a mixture model for brain connectivity without explicit thresholding of associations and suggests an alternative connectivity measure. Paper IV extends the mixture modeling of Paper III to a longitudinal setting with dropout and investigates the impact of ignoring the dropout mechanism on the quality of the inferences made on longitudinal connectivity changes.
- Published
- 2019
24. A Review of Relationships Between Possibility and Probability Representations of Uncertainty in Measurement
- Author
-
G. Mauris (Laboratoire d'Informatique, Systèmes, Traitement de l'Information et de la Connaissance (LISTIC), Université Savoie Mont Blanc)
- Subjects
possibility theory, probability theory, probability box, a priori probability, uncertainty intervals, principle of maximum entropy, applied probability, probability and statistics, empirical probability, probability distribution, uncertainty, parameter estimation, instrumentation - Abstract
The main advances regarding the deep connections between probability and possibility measurement uncertainty representation (but not the propagation) over the last decade are reviewed. They concern: the definition of a possibility distribution equivalent to a probability distribution, built from its whole set of dispersion intervals for all probability levels; the bridges with the conventional dispersion parameters; and the representation of partial probability knowledge thanks to a maximum specificity principle, better than the maximum entropy principle and also related to probability inequalities. The use of a possibility representation for common measurement situations such as the description of measurement results, measurand estimation and the expression of a priori uncertainty information is illustrated, and then discussed in view of their use in further processing (propagation, fuzzy inference systems). The conclusion highlights the interests of the possibility approach and points out some remaining issues.
- Published
- 2013
- Full Text
- View/download PDF
25. Uncertainty intervals and sensitivity analysis for missing data
- Author
-
Genbäck, Minna
- Abstract
In this thesis we develop methods for dealing with missing data in a univariate response variable when estimating regression parameters. Missing outcome data is a problem in a number of applications, one of which is follow-up studies. In follow-up studies data is collected at two (or more) occasions, and it is common that only some of the initial participants return at the second occasion. This is the case in Paper II, where we investigate predictors of decline in self-reported health in older populations in Sweden, the Netherlands and Italy. In that study, around 50% of the study participants drop out. It is common that researchers rely on the assumption that the missingness is independent of the outcome given some observed covariates. This assumption is called missing at random (MAR) or an ignorable missingness mechanism. However, MAR cannot be tested from the data, and if it does not hold, the estimators based on this assumption are biased. In the study of Paper II, we suspect that some of the individuals drop out due to bad health. If this is the case, the data is not MAR. One alternative to assuming MAR, which we pursue, is to incorporate the uncertainty due to missing data by reporting interval estimates instead of point estimates and uncertainty intervals instead of confidence intervals. An uncertainty interval is the analog of a confidence interval but wider due to a relaxation of assumptions on the missing data. These intervals can be used to visualize the consequences that deviations from MAR have on the conclusions of the study. That is, they can be used to perform a sensitivity analysis of MAR. The thesis covers different types of linear regression. In Papers I and III we have a continuous outcome, in Paper II a binary outcome, and in Paper IV we allow for mixed effects with a continuous outcome. In Paper III we estimate the effect of a treatment, which can be seen as an example of missing outcome data.
- Published
- 2016
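A minimal illustration of why such uncertainty intervals do not shrink with sample size, using worst-case (Manski-style) bounds on a bounded outcome in place of the thesis's regression estimators; all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy follow-up study: outcome bounded in [0, 1], roughly 50% dropout.
y = rng.uniform(size=400)
observed = rng.uniform(size=400) < 0.5
p_obs = observed.mean()
y_obs = y[observed]

# Point estimate under MAR: just the observed mean.
mar_estimate = y_obs.mean()

# Worst-case uncertainty interval without MAR: the missing outcomes could
# lie anywhere in [0, 1], so the width (1 - p_obs) never shrinks with n.
lower = p_obs * y_obs.mean() + (1 - p_obs) * 0.0
upper = p_obs * y_obs.mean() + (1 - p_obs) * 1.0
print(f"MAR estimate {mar_estimate:.3f}; uncertainty interval [{lower:.3f}, {upper:.3f}]")
```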
26. Justification and numerical realization of the uniform method for finding point estimates of interval elicited scaling constants
- Author
-
Tenekedjiev, Kiril and Nikolova, Natalia
- Published
- 2008
- Full Text
- View/download PDF
27. Critical usage extraction from historical and heterogeneous data in order to optimize fleet maintenance
- Author
-
Ben Zakour, Asma (Laboratoire Bordelais de Recherche en Informatique (LaBRI), Université Bordeaux I / ENSEIRB), Mohamed Mosbah, and Sofian Maabout
- Subjects
Uncertainty intervals, Frequent sequences extraction, Temporal constraints, Aeronautic maintenance, Sliding window, Data mining, Aircraft (maintenance and repair, forecasting), Maintenance prognostic - Abstract
The present work is part of an industrial project driven by the 2MoRO Solutions company. It aims to develop a high-value service enabling aircraft operators to optimize their maintenance actions. Given the large amount of data available around aircraft exploitation, we aim to analyse the historical events recorded for each aircraft in order to extract maintenance forecasts. The results are used to integrate and consolidate maintenance tasks in order to minimize aircraft downtime and risk of failure. The proposed method involves three steps: (i) streamlining the information in order to combine it, (ii) organizing this data for easy analysis, and (iii) extracting useful knowledge in the form of interesting sequences.
- Published
- 2012
28. Uncertainty of forecasted new service market capacity obtained by logistic model
- Author
-
Sokele, Mladen, Garcia-Ferrer, Antonio, and Hamoudia, Mohsen
- Subjects
Market capacity forecasting, Logistic model, Uncertainty intervals - Abstract
The logistic growth model is widely used for technological and market development forecasting because of its many useful properties. In telecommunications, the logistic model is used as a quantitative forecasting method for new service market adoption when interaction with other services can be neglected. Growth forecasting relies on the hypothesis that extrapolation of a model fitted to known data points will remain valid in the foreseeable future. Thus, the parameters of the model, as well as the forecasted values, are sensitive to the accuracy of the input data points. In general, logistic model parameter determination requires an iterative numerical method, which complicates direct assessment of the model's sensitivity to uncertainty in the input data. The uncertainty in the determination of telecommunications service market capacity is of particular concern to operators. The presented analytical procedure for direct logistic model parameter determination in the case of equidistant time points provides a valuable basis for a full sensitivity analysis. Expressions and contour graphs showing the dependence of forecasted market capacity on the uncertainty of input data are obtained by the total differentiation approach. In addition, the required conditions and an analysis of practical cases, showing the influence of input data uncertainty on forecasting results, are presented.
- Published
- 2008
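The direct (non-iterative) parameter determination for equidistant time points mentioned in the abstract above has a known closed form for the capacity; the sketch below recomputes it under perturbed inputs in place of the paper's total-differentiation analysis:

```python
def logistic_capacity(y1, y2, y3):
    """Saturation level M of a logistic curve through three equidistant points."""
    return y2 * (y1 * y2 + y2 * y3 - 2 * y1 * y3) / (y2 ** 2 - y1 * y3)

# Exact points from M = 100, y(t) = M / (1 + 9 * 3**-t) at t = 0, 1, 2.
y1, y2, y3 = 10.0, 25.0, 50.0
print("capacity:", logistic_capacity(y1, y2, y3))  # 100.0

# Sensitivity: a +/-2% error in the middle observation shifts the forecasted
# market capacity by roughly -10% / +15%, i.e. far more than 2%.
for eps in (-0.02, 0.02):
    print(f"y2 {eps:+.0%}: capacity = {logistic_capacity(y1, y2 * (1 + eps), y3):.1f}")
```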
29. Data Vagueness and Uncertainties in Urban Heavy-Metal Data Collection
- Author
-
Sörme, Louise and Hedbrant, Johan
- Abstract
The use of societal data in environmental research has indicated a need for considering uncertainties of data. Several fundamental conditions for statistical treatment are occasionally not met. The choice is either to use or to ignore the uncertain data. If used, it may impair the quality of the result. If ignored, a possible environmental risk may remain unattended. This article discusses some of the problems encountered with data in urban heavy-metal metabolism, and suggests a method based on uncertainty intervals to consider the uncertainties.
- Published
- 2001
- Full Text
- View/download PDF
30. OMNE: A new robust membership-set estimator for the parameters of nonlinear models
- Author
-
Lahanier, Hélène, Walter, Eric, and Gomeni, Roberto
- Published
- 1987
- Full Text
- View/download PDF
31. Uncertainty intervals for unobserved confounding of direct and indirect effects with extensions to censoring and truncation
- Author
-
Lindmark, Anita
- Abstract
When performing a mediation analysis, i.e. estimating direct and indirect effects of a given exposure on an outcome, strong assumptions are made about unconfoundedness. These assumptions are difficult to verify in a given situation and therefore a mediation analysis should be complemented with a sensitivity analysis to assess the impact of violations. Lindmark et al. (2016) proposed a sensitivity analysis method for parametric estimation of conditional and marginal direct and indirect effects when the mediator and outcome are binary and modeled using probit models. In this paper we extend this to include cases with continuous mediators and outcomes and suggest extensions to the cases when the continuous outcome variable is censored or truncated. Three sensitivity parameters are used, consisting of the correlations between the error terms of the mediator, outcome and exposure assignment mechanism models. These correlations are incorporated into the estimation of the model parameters and sampling variability is taken into account through the construction of uncertainty intervals.
32. Presentation of Ordinal Regression Analysis on the Original Scale
- Author
-
Hannah, Murray and Quigley, Paul
- Published
- 1996
- Full Text
- View/download PDF