77 results for "Bayesian Decision Theory"
Search Results
2. The value of monitoring a structural health monitoring system
- Author
-
Pier Francesco Giordano, Said Quqa, and Maria Pina Limongelli
- Subjects
Value of information, Structural health monitoring, Bayesian decision theory, Sensor fault, Data quality, Building and Construction, Safety, Risk, Reliability and Quality, Civil and Structural Engineering
- Published
- 2023
3. The Benefit of Informed Risk-Based Management of Civil Infrastructures
- Author
-
Maria Pina Limongelli and Pier Francesco Giordano
- Subjects
General Materials Science, Bayesian decision theory, value of information, structural health monitoring, decision-making, bridge management, earthquake, flood, scour, Building and Construction, Geotechnical Engineering and Engineering Geology, Computer Science Applications, Civil and Structural Engineering
- Abstract
One of the most interesting applications of Structural Health Monitoring (SHM) is the possibility of providing real-time information on the condition of civil infrastructures during and following disastrous events, thus supporting decision-makers in prompt emergency operations. Bayesian decision theory provides a rigorous framework to quantify the benefit of SHM through the Value of Information (VoI), accounting for different sources of uncertainty. This decision theory is based on utility considerations, or, in other words, on risk. Decision-making in emergency management, by contrast, is often based on engineering judgment and heuristic approaches. The goal of this paper is to investigate the impact of different decision scenarios on the VoI. To this aim, a general framework to quantify the benefit of SHM information in emergency management is applied to different decision scenarios concerning bridges under scour and seismic hazards. Results indicate that the considered decision scenario can strongly affect the results of a VoI analysis. Specifically, the benefit of SHM information may be underestimated when considering unrealistic scenarios, e.g., those based on risk-based decision-making, which are not adopted in practice. Moreover, SHM information is particularly valuable when it prevents the selection of suboptimal emergency management actions.
- Published
- 2022
- Full Text
- View/download PDF
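The pre-posterior analysis underlying the VoI framework in the entry above reduces to a few lines of code. Below is a minimal sketch for a two-state, two-action bridge-closure scenario; every number (prior, costs, SHM likelihoods) is an invented toy value, not taken from the paper.

```python
import numpy as np

# States: 0 = bridge intact, 1 = bridge damaged (e.g., by scour or earthquake).
prior = np.array([0.9, 0.1])

# cost[action, state]; action 0 = keep open, action 1 = close the bridge.
cost = np.array([[0.0, 100.0],    # keep open: catastrophic if damaged
                 [10.0, 10.0]])   # close: fixed disruption cost

# SHM likelihoods p(outcome | state); outcome 0 = "no alarm", 1 = "alarm".
likelihood = np.array([[0.95, 0.05],   # intact
                       [0.20, 0.80]])  # damaged

# Without monitoring: act on the prior alone.
prior_cost = min(cost @ prior)

# With monitoring (pre-posterior): for each SHM outcome, act on the posterior.
p_outcome = likelihood.T @ prior
preposterior_cost = 0.0
for z in (0, 1):
    posterior = likelihood[:, z] * prior / p_outcome[z]
    preposterior_cost += p_outcome[z] * min(cost @ posterior)

voi = prior_cost - preposterior_cost
print(f"VoI = {voi:.2f}")   # monitoring pays off if VoI exceeds the SHM cost
```

The paper's point about decision scenarios shows up here as the choice of action set and cost matrix: changing them (e.g., to heuristic emergency rules) changes the computed VoI.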
4. Value of information analysis in civil and infrastructure engineering: a review
- Author
-
Da-Gang Lu, Sebastian Thöns, Wei-Heng Zhang, Michael Havbro Faber, and Jianjun Qin
- Subjects
Value of information, Bayesian decision theory, Computer science, Utility theory, Big data, Structural integrity, Engineering risk analysis, Data science, Communication theory, Optimum decision making, Structural health information, Structural health monitoring
- Abstract
The concept of Value of Information (VoI) has attracted significant attention within the civil engineering community, especially over the last decade. Triggered by the increasing focus on structural health monitoring, the availability of data, and emerging techniques of Big Data analysis and Artificial Intelligence, important insights on how to benefit from VoI in structural integrity management have been gained. This literature review starts out with a summary of the historical developments and contains (1) a summary of two different origins of VoI analysis, (2) a compilation of existing VoI research, and (3) current engineering interpretations and applications of VoI in the field of civil and infrastructure engineering. VoI analysis has roots in communication theory and in Bayesian decision analysis in conjunction with utility theory. The starting point is thus a brief introduction of these theoretical foundations, followed by a discussion of the relevant modelling aspects such as information, probability, and utility modelling. A detailed review of relevant existing research is presented, divided into the following main areas: computational methods, optimal sensor placement, and engineering risk management. Finally, by way of conclusion and outlook, challenges and some promising directions for VoI analysis in the field of civil and infrastructure engineering are identified.
- Published
- 2021
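In the standard notation of the Bayesian pre-posterior analysis this review builds on, with uncertain state $\theta$, actions $a$, utility $u(a,\theta)$, and information outcome $Z$, the Value of Information is

```latex
\mathrm{VoI} \;=\; \mathbb{E}_{Z}\!\left[\,\max_{a}\; \mathbb{E}_{\theta \mid Z}\, u(a,\theta)\right] \;-\; \max_{a}\; \mathbb{E}_{\theta}\, u(a,\theta),
```

i.e., the expected utility of acting on the updated belief minus that of the best action under the prior; it is non-negative for an expected-utility-maximizing decision-maker.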
5. Feature Fusion Based on Bayesian Decision Theory for Radar Deception Jamming Recognition
- Author
-
Ruowu Wu, Hongping Zhou, Zhongyi Guo, Xiong Xu, and Chengcheng Dong
- Subjects
Bayesian decision theory, feature fusion, deception jamming, Feature extraction, Jamming, Electronic warfare, Radar, Pattern recognition, Range gate, Radar jamming and deception, kernel density estimation, Bispectrum, bispectrum transformation
- Abstract
As an important part of electronic warfare, radar countermeasures determine the course of war to a large extent. Modern radar jamming technology, especially deception jamming, plays an increasingly important role, so identifying radar deception jamming is essential. In this paper, a feature fusion algorithm based on Bayesian decision theory is used to recognize radar deception jamming signals. First, the real echo signal, the deception jamming signal (comprising range gate pull-off, velocity gate pull-off, and range-velocity gate pull-off jamming), and the noise signal received by the radar are acquired as signal sources. Then, bispectrum transformation is used to extract features in several respects. Finally, kernel density estimation is used to improve the fusion algorithm, and the feature fusion algorithm based on Bayesian decision theory is used to recognize the received signals. Experimental results indicate that the algorithm can not only recognize radar deception jamming but also do so with high accuracy.
- Published
- 2021
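The core of the recognition step in the entry above (kernel density estimates of class-conditional feature densities feeding a Bayes decision rule) can be sketched as follows. The two-dimensional features and the class priors are invented stand-ins for the paper's bispectrum-derived features.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Fake 2-D features for three signal classes; real features would come
# from a bispectrum transform of the received radar signals.
train = {
    "echo":    rng.normal([0.0, 0.0], 0.5, size=(200, 2)),
    "jamming": rng.normal([2.0, 1.0], 0.5, size=(200, 2)),
    "noise":   rng.normal([0.0, 3.0], 0.8, size=(200, 2)),
}
priors = {"echo": 0.5, "jamming": 0.3, "noise": 0.2}  # assumed class priors

# Class-conditional densities via kernel density estimation.
kdes = {c: gaussian_kde(x.T) for c, x in train.items()}

def classify(feature):
    """Bayes decision rule: pick the class with the largest posterior mass."""
    scores = {c: priors[c] * kdes[c](feature.reshape(2, 1))[0] for c in kdes}
    return max(scores, key=scores.get)

print(classify(np.array([1.9, 1.1])))   # -> 'jamming' for these fake data
```

The paper's contribution lies in how several such features are fused and how the kernel density estimates are improved; this sketch shows only the KDE-plus-Bayes-rule core.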
6. A Bayesian Account of Depth from Shadow
- Author
-
James Elder, Patrick Cavanagh, and Roberto Casati
- Subjects
Computational Neuroscience, Bayesian decision theory, depth, Cognitive Neuroscience, ecological statistics, Shadows
- Abstract
When an object casts a shadow on a background surface, the offset of the shadow can be a compelling cue to the relative depth between the object and the background (e.g., Kersten et al 1996, Fig. 1). Cavanagh et al (2021) found that, at least for small shadow offsets, perceived depth scales almost linearly with shadow offset. Here we ask whether this finding can be understood quantitatively in terms of Bayesian decision theory. Estimating relative depth from shadow offset is complicated by the fact that the shadow offset is co-determined by the slant of the light source relative to the background. Since this is often difficult or impossible to estimate directly, the observer must employ priors for both the relative depth and the light source slant. To establish an ecological prior for relative depth, we employed the SYNS dataset (Adams et al., 2016) and the methods of Ehinger et al (2017) to measure the distribution of relative depths at depth edges near the horizon (Fig. 2). Lacking comparable empirical statistics for illumination slant, we considered two possible distributions: A zero-parameter uniform distribution, and a two-parameter beta distribution. To model the human data, we assumed that the visual system makes use of these priors and the observed shadow offset to minimize expected squared error in perceived relative depth. Fig. 3 shows that while the empirical depth prior brings the model into the range of the human data, a flat illumination prior predicts a more compressive scaling than observed. Fitting a beta distribution to minimize weighted squared deviation between human and optimal depth judgements corrects this deviation, and predicts a broadly peaked distribution over illumination slant, peaking at 37.4 deg away from the surface normal (Fig. 4). We will discuss possible ecological explanations for this illumination prior.
- Published
- 2022
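Minimizing expected squared error, as the abstract states, makes the model's depth estimate the posterior mean. With shadow offset $o$, relative depth $d$, and light-source slant $s$, this reads

```latex
\hat{d}(o) \;=\; \mathbb{E}\!\left[\,d \mid o\,\right]
\;=\; \frac{\displaystyle\iint d\, p(o \mid d, s)\, p(d)\, p(s)\; \mathrm{d}s\, \mathrm{d}d}
{\displaystyle\iint p(o \mid d, s)\, p(d)\, p(s)\; \mathrm{d}s\, \mathrm{d}d},
```

where $p(d)$ is the ecological depth prior measured from the SYNS dataset and $p(s)$ is the uniform or beta illumination prior that the authors compare against the human data.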
7. Decisions on life-cycle reliability of flood defence systems
- Author
-
W.J. Klerk
- Subjects
reinforcement, reliability, Bayesian decision theory, levees, asset management, inspection, uncertainty reduction, risk-based decision making, optimization, flood defences, maintenance
- Abstract
Many countries rely on flood defence systems to prevent economic damage and loss of life due to catastrophic floods. Asset managers of flood defence systems need to cope with the consequences of structural degradation and changing societal and environmental conditions in order to satisfy performance requirements and optimize the societal value of flood defence assets. This is a continuous effort of planning, executing, and evaluating a variety of system interventions. These can be aimed either at reducing the uncertainty in the performance of a flood defence system (e.g., inspection or monitoring) or at improving that performance (e.g., reinforcement). Performance is typically expressed as reliability at the system level, which in this thesis is interpreted as the life-cycle reliability: the estimated reliability with all foreseen interventions in time. The key objective of this thesis is to improve decisions on the life-cycle reliability of flood defence systems. This is elaborated for three key topics, with a focus on earthen flood defences (also known as levees or dikes)...
- Published
- 2022
8. Bayesian Decision Theory and Stochastic Independence
- Author
-
Philippe Mongin and Antoine Haldemann
- Subjects
Stochastic independence, Bayesian decision theory, Subjective expected utility, Probability theory, Probabilistic independence, Probability interpretations, Probability measure, Preference (economics), Savage, History and Philosophy of Science
- Abstract
Stochastic independence has a complex status in probability theory. It is not part of the definition of a probability measure, but it is nonetheless an essential property for the mathematical development of the theory. Bayesian decision theorists such as Savage can be criticized for being silent about stochastic independence: from their current preference axioms, they can derive no more than the definitional properties of a probability measure. In a new framework of twofold uncertainty, we introduce preference axioms that entail not only these definitional properties, but also the stochastic independence of the two sources of uncertainty. This goes some way towards filling a curious lacuna in Bayesian decision theory. (In Proceedings TARK 2017, arXiv:1707.08250.)
- Published
- 2020
9. A method to assess the value of monitoring an SHM system
- Author
-
Pier Francesco Giordano, Said Quqa, Maria Pina Limongelli, S.-T. Tu, and Z. Peng
- Subjects
sensor fault, Bayesian decision theory, Structural Health Monitoring, data quality, Value of Information, Computer Science Applications
- Abstract
Aging structural components, together with increasing transportation needs and limited budgets, are challenging aspects that typically concern decision-makers and infrastructure owners. Although Structural Health Monitoring (SHM) has been a powerful tool to optimize maintenance-related activities and post-disaster emergency management, the sensor readout, and therefore the outcome of the monitoring system, is susceptible to errors due to malfunctioning. For years, the Value of Information (VoI) has been studied to quantify the long-term benefit of SHM systems against the initial investment in sensing instrumentation, without considering the eventuality of faulty sensing nodes. However, these are very common in field applications. This paper proposes a new framework to calculate the benefit of using Sensor Validation Tools (SVTs) before calculating the damage-sensitive features that drive the SHM process. The novel approach extends the traditional VoI to consider multiple “health” states of the SHM system, associate the outcome of the SHM system with the state of both the structure and the SHM system itself, and quantify the additional value obtained from SVTs.
- Published
- 2022
10. Pipe failure assessment and decision support system for a smart operation and maintenance: A comprehensive literature review and a conceptual decision analysis model proposal
- Author
-
Roya Meydani
- Subjects
Bayesian decision theory, Decision analysis, Decision modeling, Problem structuring, Uncertainty, Utility-based decision-making, Infrastructure management, Urban pipeline systems' rehabilitation, Civil Engineering
- Abstract
The reported research provides a rough guide to best practice in decision modeling concerning the rehabilitation of urban pipeline systems. The thesis aims to bring attention to the fact that a proper decision-making model is a cornerstone of efficient infrastructure management. More precisely, it aims to increase the knowledge about applicable decision support methods by identifying relevant factors that should be considered in the decision-making process. This can facilitate future rehabilitation attempts for existing urban infrastructure. A utility-based decision model was adopted for a water distribution network in Sweden to locate and rehabilitate leakages as an ultimate sign of failure. This was performed by implementing and evaluating a Bayesian decision model, including the treatment of uncertainties, in evaluating the best decision from a short-term perspective. Despite its simplicity, the result showed that the proposed model could facilitate problem-solving approaches when uncertainty is an issue. Considering the several interacting factors of services and the availability of information, the importance of problem structuring before applying a decision model was extensively acknowledged. As a result, a conceptual decision model was proposed to choose the most appropriate decision model applicable to a particular problem, in essence deciding how to decide. The presented model illustrated the first steps of developing a theoretical framework for rational yet practical decision-making. This approach, which is intended to be further employed in rehabilitation strategies for urban pipelines, ensures that the chosen decision technique has explicitly considered different levels of uncertainty and is the best-established solution for a particular type of problem, organization, and stakeholder. This effort may help decision analysts define the problem and elicit objectives and values relatively early in the decision-making process, to ensure that the selected decisions support the desired outcomes, actions, and core values. Finally, a critical evaluation of the decision strategy was presented by comparing the Bayesian approach with the proposed conceptual model, showing that the choice of decision model differs when the specific basic components vary. This was demonstrated through two semi-fictitious case studies, exemplifying the framework's importance in structuring the assessment of available means.
- Published
- 2022
11. An alternative quantification of the value of information in structural health monitoring
- Author
-
Michael D. Todd, Mayank Chadha, and Zhen Hu
- Subjects
Bayesian decision theory, value of information, structural health monitoring, pre-posterior analysis, machine learning, digital twins, miter gates, Mechanical Engineering
- Abstract
Analogous to an experiment, a structural health monitoring (SHM) system may be thought of as an information-gathering mechanism. Gathering information that is representative of the structural state and correctly inferring its meaning helps engineers (decision-makers) mitigate possible losses by taking appropriate actions (risk-informed decision-making). However, the design, research, development, installation, maintenance, and operation of an SHM system constitute an expensive endeavor. Therefore, the decision to invest in new information is rationally justified if the reduction in expected losses from utilizing newly acquired information exceeds the intrinsic cost of the information-acquiring mechanism incurred over the lifespan of the structure. This article investigates the economic advantage of installing an SHM system for inference of the structural state, risk, and lifecycle management by using value of information (VoI) analysis. Among many possible choices of SHM system designs (different information-gathering mechanisms), pre-posterior decision analysis can be used to select the most feasible design. Traditionally, the cost–benefit analysis of an SHM system is carried out through pre-posterior decision analysis, which evaluates the benefit of an experiment or information-gathering mechanism using the expected value of information metric. This study proposes an alternative normalized metric that evaluates the expected reward ratio (benefit/gain of using an SHM system) relative to the investment risk (cost of SHM over the lifecycle). The relative benefit of various SHM system designs is evaluated using the concept of VoI and pre-posterior analysis, and the idea of a perfect experiment is discussed.
- Published
- 2021
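One plausible reading of the proposed normalized metric (the paper's exact definition may differ in detail) is the ratio of the expected lifecycle benefit to the lifecycle cost of the SHM system,

```latex
\rho \;=\; \frac{\mathrm{VoI}}{C_{\mathrm{SHM}}}, \qquad \text{invest if } \rho > 1,
```

in contrast to the classical difference criterion $\mathrm{VoI} - C_{\mathrm{SHM}} > 0$. Both criteria agree on whether a single design is worthwhile, but the ratio makes designs of very different scales directly comparable.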
12. Value of information in multiple criteria decision making: an application to forest conservation
- Author
-
Kyle Eyvindson, Jussi Hakanen, Juha Karvanen, Artti Juutinen, and Mikko Mönkkönen
- Subjects
Bayesian decision theory, Value of information, decision analysis, conservation planning, Forest planning, simulation, optimization, multiple criteria optimization, trade-offs, information updating, Environmental Engineering
- Abstract
Developing environmental conservation plans involves assessing trade-offs between the benefits and costs of conservation. The benefits of conservation can be established with ecological inventories or estimated based on previously collected information. Conducting ecological inventories can be costly, and the additional information may not justify these costs. To clarify the value of these inventories, we investigate the multiple criteria value of information associated with the acquisition of improved ecological data. This information can be useful when advising the decision maker on whether to acquire better information. We extend the concept of the value of information to a multiple criteria perspective. We consider the value of information for both monetary and biodiversity criteria and do not assume any fixed budget limits. Two illustrative cases are used to describe this method of evaluating the multiple criteria value of information. In the first case, we numerically evaluate the multiple criteria value of information for a single forest stand. In the second case, we present a forest planning case with four stands that describes the complex interactions between the decision maker's preference information and the potential inventory options available. These example cases highlight the importance of examining the trade-offs when making conservation decisions. We provide a definition for the multiple criteria value of information and demonstrate its potential application when conservation issues conflict with monetary issues.
- Published
- 2019
13. Countable additivity, idealization, and conceptual realism
- Author
-
Yang Liu
- Subjects
foundations of probability, countable additivity, Bayesian decision theory, Decision theory, Bayesian probability, Subjective expected utility, conceptual realism, idealization, Sigma additivity, Axiom, Economics and Econometrics, Philosophy
- Abstract
This paper addresses the issue of finite versus countable additivity in Bayesian probability and decision theory – in particular, Savage’s theory of subjective expected utility and personal probability. I show that Savage’s reason for not requiring countable additivity in his theory is inconclusive. The assessment leads to an analysis of various highly idealized assumptions commonly adopted in Bayesian theory, where I argue that a healthy dose of, what I call, conceptual realism is often helpful in understanding the interpretational value of sophisticated mathematical structures employed in applied sciences like decision theory. In the last part, I introduce countable additivity into Savage’s theory and explore some technical properties in relation to other axioms of the system.
- Published
- 2019
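For reference, the property at issue is countable additivity: for pairwise disjoint events $A_1, A_2, \ldots$,

```latex
P\!\left(\bigcup_{i=1}^{\infty} A_i\right) \;=\; \sum_{i=1}^{\infty} P(A_i),
```

whereas Savage's postulates deliver only finite additivity, i.e., the same identity restricted to finitely many disjoint events.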
14. Competition Rather Than Observation and Cooperation Facilitates Optimal Motor Planning
- Author
-
Mamoru Tanae, Ken Takiyama, and Keiji Ota
- Subjects
Bayesian decision theory, motor decision-making, motor planning, motor uncertainty, risk-sensitivity, aim point, Competition, Cognition, Perception, Sports and Active Living
- Abstract
Humans tend to select motor plans with a high reward and a low success rate over motor plans with a small reward and a high success rate. Previous studies have shown such a risk-seeking property in motor decision tasks. However, it is unclear how to facilitate a shift from risk-seeking to optimal motor planning that maximizes the expected reward. Here, we investigate the effect of interacting with virtual partners/opponents on motor plans, since interpersonal interaction has a powerful influence on human perception, action, and cognition. This study compared three types of interactions (competition, cooperation, and observation) and two types of virtual partners/opponents (those engaged in optimal motor planning and those engaged in risk-averse motor planning). As reported in previous studies, the participants took a risky aim point when they performed a motor decision task alone. However, we found that the participants' aim point was significantly modulated when they performed the same task while competing with a risk-averse opponent (p = 0.018), with no significant difference from the optimal aim point (p = 0.63). No significant modulation of the aim points was observed during the cooperation and observation tasks. These results highlight the importance of competition for modulating suboptimal decision-making and optimizing motor performance.
- Published
- 2021
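The optimal aim point referred to above is the one that maximizes expected gain under motor noise. A minimal one-dimensional sketch (all payoffs, region boundaries, and the noise level are invented, not the study's task):

```python
import numpy as np
from scipy.stats import norm

sigma = 1.0   # std dev of Gaussian motor noise around the aim point

def expected_gain(aim):
    # Landing in [0, 2] earns 100 points; overshooting into (2, 4] costs 500.
    p_reward  = norm.cdf(2, aim, sigma) - norm.cdf(0, aim, sigma)
    p_penalty = norm.cdf(4, aim, sigma) - norm.cdf(2, aim, sigma)
    return 100 * p_reward - 500 * p_penalty

aims = np.linspace(-1, 2, 601)
optimal = aims[np.argmax([expected_gain(a) for a in aims])]
print(f"optimal aim point: {optimal:.2f}")
# A risk-seeking participant aims closer to the penalty edge (x = 2)
# than this expected-gain maximizer does.
```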
15. Minor or adult? Introducing decision analysis in forensic age estimation
- Author
-
Silvia Bozza, Emanuele Sironi, Franco Taroni, and Simone Gittelson
- Subjects
Bayesian decision theory, Chronological age, Evidence evaluation, Evidence interpretation, Loss function, Decision consequences, Age of majority, Decision analysis, Decision Support Techniques, Forensic Medicine, Pathology and Forensic Medicine
- Abstract
Forensic age estimation nowadays plays an important role in forensic and medico-legal institutes worldwide, which are solicited by judicial or administrative authorities to provide expert reports on the age of individuals. The authorities' ultimate issue of interest is often the probability that the person is younger or older than a given age threshold, usually the age of majority. Such information is fundamental for deciding whether a person being judged falls under the legal category of an adult, a decision that may have important consequences for the individual, depending on the legal framework in which it is made. The aim of this paper is to introduce a normative approach for assisting the authority in the decision-making process, given knowledge from available findings reported by means of probabilities. The normative approach proposed here has been acknowledged in the forensic framework and represents a promising structure for reasoning that can support the decision-making process in forensic age estimation. The paper introduces the fundamental elements of decision theory applied to the specific case of age estimation and provides some examples to illustrate its practical application.
- Published
- 2021
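The decision-theoretic core such an expert report feeds into is the generic two-action Bayes rule. With posterior probability $p = \Pr(\text{age} \ge 18 \mid \text{findings})$, loss $\ell_{\mathrm{FA}}$ for falsely declaring a minor an adult, and loss $\ell_{\mathrm{FM}}$ for falsely declaring an adult a minor, the expected losses of the two decisions are $\ell_{\mathrm{FA}}(1-p)$ and $\ell_{\mathrm{FM}}\,p$, so expected loss is minimized by

```latex
\text{decide ``adult''} \quad\Longleftrightarrow\quad p \;>\; \frac{\ell_{\mathrm{FA}}}{\ell_{\mathrm{FA}} + \ell_{\mathrm{FM}}}.
```

The paper's own formulation may parametrize the losses differently; the rule above is the textbook form.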
16. Theory of Optimal Bayesian Feature Filtering
- Author
-
Ali Foroughi pour and Lori A. Dalton
- Subjects
Bayesian decision theory, Bayesian inference, Feature selection, variable selection, biomarker discovery, Statistics and Probability, Applied Mathematics, MSC 62F15, 62C10, 62F07, 92C37
- Abstract
Optimal Bayesian feature filtering (OBF) is a supervised screening method designed for biomarker discovery. In this article, we prove two major theoretical properties of OBF. First, optimal Bayesian feature selection under a general family of Bayesian models reduces to filtering if and only if the underlying Bayesian model assumes all features are mutually independent. Therefore, OBF is optimal if and only if one assumes all features are mutually independent, and OBF is the only filter method that is optimal under at least one model in the general Bayesian framework. Second, OBF under independent Gaussian models is consistent under very mild conditions, including cases where the data is non-Gaussian with correlated features. This result provides conditions under which OBF is guaranteed to identify the correct feature set given enough data, and it justifies the use of OBF in non-design settings where its assumptions are invalid.
- Published
- 2020
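A much-simplified sketch of the filtering idea (known, shared variance and a conjugate normal prior on each class mean; OBF proper uses a richer model, so treat this as illustrative only): each feature is scored independently by the Bayes factor comparing "class-specific means" against "one common mean".

```python
import numpy as np
from scipy.stats import norm

def log_marginal(x, s2=1.0, tau2=1.0):
    """Log marginal likelihood of x under x_i ~ N(mu, s2), mu ~ N(0, tau2)."""
    n, xbar = len(x), x.mean()
    ss = ((x - xbar) ** 2).sum()
    return (-0.5 * n * np.log(2 * np.pi * s2) - ss / (2 * s2)
            + 0.5 * np.log(2 * np.pi * s2 / n)
            + norm.logpdf(xbar, 0.0, np.sqrt(s2 / n + tau2)))

def log_bayes_factor(x0, x1):
    """Evidence that one feature has different means in the two classes."""
    return (log_marginal(x0) + log_marginal(x1)
            - log_marginal(np.concatenate([x0, x1])))

rng = np.random.default_rng(0)
x0 = rng.normal(0.0, 1.0, (30, 4))               # class 0, four features
x1 = rng.normal([1.5, 0, 0, 0], 1.0, (30, 4))    # class 1: only feature 0 shifts

scores = [log_bayes_factor(x0[:, j], x1[:, j]) for j in range(4)]
print(np.round(scores, 1))   # feature 0 should score far above the rest
```

Because each feature is scored on its own, this is a filter method; the paper's first result says that such marginal scoring is Bayes-optimal exactly when the model assumes the features are mutually independent.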
17. A Bayesian Decision Theory Approach for Genomic Selection
- Author
-
Juan Burgueño, Bartolo de Jesús Villar-Hernández, Fernando H. Toledo, José Crossa, Paulino Pérez-Rodríguez, and Sergio Pérez-Elizalde
- Subjects
Bayesian Decision Theory, Genomic Selection, Loss Function, Heritability, Breeding program, Bayes Theorem, Multivariate statistics, Quantitative Trait, Genetics, GenPred, Shared Data Resources
- Abstract
Plant and animal breeders are interested in selecting the best individuals from a candidate set for the next breeding cycle. In this paper, we propose a formal method under the Bayesian decision theory framework to tackle the selection problem based on genomic selection (GS) in single- and multi-trait settings. We proposed and tested three univariate loss functions (Kullback-Leibler, KL; Continuous Ranked Probability Score, CRPS; Linear-Linear loss, LinLin) and their corresponding multivariate generalizations (Kullback-Leibler, KL; Energy Score, EnergyS; and the Multivariate Asymmetric Loss Function, MALF). We derived and expressed all the loss functions in terms of heritability and tested them on a real wheat dataset for one cycle of selection and in a simulated selection program. The performance of each univariate loss function was compared with the standard method of selection (Std), which does not use loss functions. We compared performance in terms of the selection response and the decrease in the population's genetic variance during recurrent breeding cycles. Results suggest that it is possible to obtain better performance in a long-term breeding program under the single-trait scheme by selecting 30% of the best individuals in each cycle, but not by selecting 10% of the best individuals. For the multi-trait approach, results show that the population mean for all traits under consideration had positive gains, even though two of the traits were negatively correlated. The corresponding population variances were not statistically different across the loss functions at the 10th selection cycle. Using a loss function should be a useful criterion when selecting candidates for the next breeding cycle.
- Published
- 2018
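Of the three univariate losses, LinLin is the classical asymmetric linear loss; in generic form, for true breeding value $u$ and estimate $\hat{u}$,

```latex
L(u, \hat{u}) \;=\;
\begin{cases}
k_1\,(u - \hat{u}), & u \ge \hat{u},\\
k_2\,(\hat{u} - u), & u < \hat{u},
\end{cases}
```

whose Bayes-optimal point estimate is the $k_1/(k_1+k_2)$ quantile of the posterior, so the ratio of $k_1$ to $k_2$ tunes how strongly under-prediction of top candidates is penalized. (The paper re-expresses its losses in terms of heritability; the form above is the standard one.)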
18. Are Multi-Armed Bandits Susceptible to Peeking?
- Author
-
Markus Loecher
- Subjects
Bayesian decision theory, A/B testing, multiple comparisons, posterior probability, prior probability, Frequentist inference
- Abstract
A/B testing is a standard method for evaluating new features and changes to, e.g., Web sites. A common pitfall in performing A/B testing is the habit of looking at a test while it is running and stopping early. Due to the implicit multiple testing, the p-value is no longer trustworthy and is usually too small. We investigate the claim that Bayesian methods, unlike frequentist tests, are immune to this “peeking” problem. We demonstrate that two regularly used measures, namely the posterior probability and the value remaining, are severely affected by repeated testing. We further show a strong dependence on the prior probability of the parameters of interest.
- Published
- 2018
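The peeking effect quantified above is easy to reproduce: run an A/A test (no true difference), compute the posterior probability that one arm beats the other after every batch, and stop as soon as it crosses a threshold. Everything below (rates, batch sizes, thresholds) is an invented toy setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def p_b_beats_a(sa, fa, sb, fb, n_mc=2000):
    """Monte Carlo estimate of P(rate_B > rate_A) under Beta(1, 1) priors."""
    a = rng.beta(1 + sa, 1 + fa, n_mc)
    b = rng.beta(1 + sb, 1 + fb, n_mc)
    return (b > a).mean()

def run_aa_test(n_peeks=40, batch=100, rate=0.05, threshold=0.95):
    """Both arms share the same rate, so any declared winner is spurious."""
    sa = fa = sb = fb = 0
    for _ in range(n_peeks):
        ca, cb = rng.binomial(batch, rate), rng.binomial(batch, rate)
        sa += ca; fa += batch - ca; sb += cb; fb += batch - cb
        p = p_b_beats_a(sa, fa, sb, fb)
        if p > threshold or p < 1 - threshold:
            return True          # stopped early and declared a "winner"
    return False

runs = 200
fp_rate = sum(run_aa_test() for _ in range(runs)) / runs
print(f"false-positive rate with peeking: {fp_rate:.0%}")
# Well above the single-look rate implied by the 95% two-sided threshold.
```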
19. Analysing and exemplifying forensic conclusion criteria in terms of Bayesian decision theory
- Author
-
Franco Taroni, Silvia Bozza, and Alex Biedermann
- Subjects
Bayesian decision theory, Decision theory, Decision Making, Conclusion criteria, Normative approach, Scientific evidence, Forensic Sciences, Handwriting, Kinship, Likelihood Functions, Bayes Theorem, Pathology and Forensic Medicine
- Abstract
There is ongoing discussion in forensic science and the law about the nature of the conclusions reached based on scientific evidence, and about how such conclusions – and conclusion criteria – may be justified by rational argument. Examples, among others, are encountered in fields such as fingermarks (e.g., ‘this fingermark comes from Mr. A's left thumb’), handwriting examinations (e.g., ‘the questioned signature is that of Mr. A’), kinship analyses (e.g., ‘Mr. A is the father of child C’) or anthropology (e.g., ‘these are human remains'). Considerable developments using formal methods of reasoning based on, for example, (Bayesian) decision theory are available in the literature, but currently such reference principles are not explicitly used in operational forensic reporting and ensuing decision-making. Moreover, applied examples illustrating the principles are scarce. A potential consequence of this in practical proceedings, and hence a cause of concern, is that underlying ingredients of decision criteria (such as losses quantifying the undesirability of adverse decision consequences) are not properly dealt with. There is merit, thus, in pursuing the study and discussion of practical examples, demonstrating that formal decision-theoretic principles are not merely conceptual considerations. Actually, these principles can be shown to underpin practical decision-making procedures and existing legal decision criteria, though often not explicitly apparent as such. In this paper, we present such examples and discuss their properties from a Bayesian decision-theoretic perspective. We argue that these are essential concepts for an informed discourse on decision-making across forensic disciplines and for the development of a coherent view on this topic. We also emphasize that these principles are normative in the sense that they provide standards against which actual judgment and decision-making may be compared. Most importantly, these standards are justified independently of people's observable decision behaviour, and of whether or not one endorses these formal methods of reasoning.
- Published
- 2018
20. The integration of bridge life cycle cost analysis and the value of structural health monitoring information
- Author
-
Guangli Du, Jianjun Qin, Robby Caspeele, Luc Taerwe, and Dan M. Frangopol
- Subjects
Structural health monitoring, Bayesian Decision Theory, life cycle cost, value of information
- Abstract
Life cycle cost (LCC) is a vital economic evaluation tool for effective asset management towards sustainability. Especially for bridges, maintenance and operation costs overwhelm the costs from any other stage. One obstacle in current bridge LCC implementation is the inherent uncertainty in the input parameters. That is, without knowing the structural performance, most existing LCC models are established only on assumptions about various life cycle parameters. For instance, the remaining service life (RSL) of structural components, a key input in LCC, is highly related to the bridge deterioration process but is often assumed deterministically in bridge LCC. With the advent of advanced structural health monitoring (SHM) techniques, the structural performance can be observed from real-time monitoring data, and the structural RSL can be estimated from it. A recognized issue here is that the trade-off between the benefit and the cost of installing the monitoring system, i.e., the value of information (VoI), is rarely considered in existing bridge LCC practice. Hence, the present paper proposes an improved LCC framework with an illustrative case study, taking into account both the structural deterioration mechanism and the quantification of the value of the SHM information.
- Published
- 2018
21. Decision-theoretic designs for a series of trials with correlated treatment effects using the Sarmanov multivariate beta-binomial distribution
- Author
-
Nigel Stallard, Siew Wan Hee, and Nicholas R. Parsons
- Subjects
Bayesian decision theory, correlated trials, backward induction, bivariate beta distribution, Sarmanov beta-binomial, Beta-binomial distribution, Binomial distribution, Randomized controlled trial, Prior probability, Frequentist inference, Statistical hypothesis testing, Statistics and Probability, General Medicine
- Abstract
The motivation for the work in this article is the setting in which a number of treatments are available for evaluation in phase II clinical trials and where it may be infeasible to try them concurrently because the intended population is small. This paper introduces an extension of previous work on decision‐theoretic designs for a series of phase II trials. The program encompasses a series of sequential phase II trials with interim decision making and a single two‐arm phase III trial. The design is based on a hybrid approach where the final analysis of the phase III data is based on a classical frequentist hypothesis test, whereas the trials are designed using a Bayesian decision‐theoretic approach in which the unknown treatment effect is assumed to follow a known prior distribution. In addition, as treatments are intended for the same population it is not unrealistic to consider treatment effects to be correlated. Thus, the prior distribution will reflect this. Data from a randomized trial of severe arthritis of the hip are used to test the application of the design. We show that the design on average requires fewer patients in phase II than when the correlation is ignored. Correspondingly, the time required to recommend an efficacious treatment for phase III is quicker.
- Published
- 2018
22. Gaussian Mixture Reduction for Time-Constrained Approximate Inference in Hybrid Bayesian Networks
- Author
-
Paulo C. G. Costa, Shou Matsumoto, Cheol Young Park, and Kathryn B. Laskey
- Subjects
Bayesian decision theory, hybrid Bayesian network, Gaussian mixture reduction, message passing algorithm, time-constrained inference, Approximate inference, Bayesian network, artificial intelligence, Inference, Time complexity, Random variable, Computer Science Applications
- Abstract
Hybrid Bayesian Networks (HBNs), which contain both discrete and continuous variables, arise naturally in many application areas (e.g., image understanding, data fusion, medical diagnosis, fraud detection). This paper concerns inference in an important subclass of HBNs, the conditional Gaussian (CG) networks, in which all continuous random variables have Gaussian distributions and all children of continuous random variables must be continuous. Inference in CG networks can be NP-hard even for special-case structures, such as poly-trees, where inference in discrete Bayesian networks can be performed in polynomial time. Therefore, approximate inference is required. In approximate inference, it is often necessary to trade off accuracy against solution time. This paper presents an extension to the Hybrid Message Passing inference algorithm for general CG networks and an algorithm for optimizing its accuracy given a bound on computation time. The extended algorithm uses Gaussian mixture reduction to prevent an exponential increase in the number of Gaussian mixture components. The trade-off algorithm performs pre-processing to find optimal run-time settings for the extended algorithm. Experimental results for four CG networks compare performance of the extended algorithm with existing algorithms and show the optimal settings for these CG networks.
- Published
- 2018
- Full Text
- View/download PDF
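The elementary operation in Gaussian mixture reduction is the moment-preserving merge of two weighted components; greedy reduction schemes repeatedly merge the pair whose merge loses the least accuracy until the component budget is met. A minimal sketch of the merge itself (not the paper's specific algorithm):

```python
import numpy as np

def merge(w1, m1, c1, w2, m2, c2):
    """Merge two weighted Gaussians, preserving total weight, mean, covariance."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    d1, d2 = m1 - m, m2 - m
    c = (w1 * (c1 + np.outer(d1, d1)) + w2 * (c2 + np.outer(d2, d2))) / w
    return w, m, c

w, m, c = merge(0.6, np.array([0.0, 0.0]), np.eye(2),
                0.4, np.array([2.0, 1.0]), 0.5 * np.eye(2))
print(w, m, c, sep="\n")
```

The merged covariance picks up a spread term from the separation of the two means, which is exactly the information the reduction trades away for speed.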
23. Decision-theoretic designs for small trials and pilot studies: A review
- Author
-
Thomas Hamborg, Siew Wan Hee, Martin Posch, Nigel Stallard, Sarah Zohar, Simon Day, Frank Miller, and Jason Madan
- Subjects
Bayesian decision theory, optimal clinical trial design, utility functions, phase II clinical trials, Optimal design, Pilot Projects, Decision Theory, Bayes Theorem, Clinical trial, Drug development, Research Design, Epidemiology, Statistics and Probability
- Abstract
Pilot studies and other small clinical trials are often conducted but serve a variety of purposes, and there is little consensus on their design. One paradigm that has been suggested for the design of such studies is Bayesian decision theory. In this article, we review the literature with the aim of summarizing current methodological developments in this area. We find that decision-theoretic methods have been applied to the design of small clinical trials in a number of areas. We divide our discussion of published methods into those for trials conducted in a single stage, those for multi-stage trials in which decisions are made through the course of the trial at a number of interim analyses, and those that attempt to design a series of clinical trials or a drug development programme. In all three cases, a number of methods have been proposed, depending on the decision maker's perspective being considered and the details of the utility functions used to construct the optimal design. Funding: EU project InSPiRe (Innovative methodology for small populations research), FP HEALTH 2013 – 602144.
- Published
- 2015
24. Segmentação de imagens coloridas por árvores bayesianas adaptativas [Color image segmentation using adaptive Bayesian trees]
- Author
-
Guilherme Garcia Schu Peixoto and Jacob Scharcanski
- Subjects
Bayesian decision theory, Color image segmentation, Directed trees, Clustering, Algorithms, Decision theory, Image segmentation
- Abstract
Image segmentation is an essential task for several computer vision applications, such as object recognition, tracking, and image retrieval. Although extensively studied in the literature, the problem of image segmentation remains an open topic of research. Particularly, the task of segmenting color images is challenging due to the inhomogeneities in the color regions encountered in natural scenes, often caused by the shapes of surfaces and their interactions with the illumination sources (e.g., causing shading and highlights). This work presents a novel non-supervised classification method. We develop a Bayesian framework for seeking modes of the underlying discrete distribution of data, and we represent data hierarchically, originating adaptive clusters at each level of the hierarchy. We apply the proposed clustering technique to the problem of color image segmentation, taking advantage of its hierarchical structure, based on properties of directed trees, for representing fine-to-coarse levels of detail in an image. The experiments herein conducted revealed that the proposed clustering method, applied to the color image segmentation problem, achieved a Probabilistic Rand Index (PRI) of 0.8148 and a Global Consistency Error (GCE) of 0.1701, outperforming over twenty methods previously proposed in the literature for the BSD300 dataset. Visual comparison confirmed the competitiveness of our approach towards state-of-the-art methods publicly available in the literature. These results emphasize the great potential of our proposed clustering technique for tackling other applications in computer vision and pattern recognition.
- Published
- 2017
25. Bayesian Decision Making in Groups is Hard
- Author
-
Jan Hązła, M. Amin Rahimian, Elchanan Mossel, and Ali Jadbabaie
- Subjects
Bayesian decision theory, group decision making, computational complexity, computational social choice, observational learning, inference over graphs, Bayesian probability, Expected utility hypothesis, Time complexity, Transitive relation, Management Science and Operations Research
- Abstract
We study the computations that Bayesian agents undertake when exchanging opinions over a network. The agents act repeatedly on their private information and take myopic actions that maximize their expected utility according to a fully rational posterior belief. We show that such computations are NP-hard for two natural utility functions: one with binary actions, and another where agents reveal their posterior beliefs. In fact, we show that distinguishing between posteriors that are concentrated on different states of the world is NP-hard. Therefore, even approximating the Bayesian posterior beliefs is hard. We also describe a natural search algorithm to compute agents' actions, which we call elimination of impossible signals, and show that if the network is transitive, the algorithm can be modified to run in polynomial time.
- Published
- 2017
- Full Text
- View/download PDF
26. The sure-thing principle and P2
- Author
-
Yang Liu
- Subjects
Bayesian decision theory, The sure-thing principle, Dominance principle, Conditional preference, Conditionality principle, Economics and Econometrics
- Abstract
This paper offers a fine-grained analysis of different versions of the well-known sure-thing principle. We show that Savage's formal formulation of the principle, i.e., his second postulate (P2), is strictly stronger than what was originally intended.
- Published
- 2017
27. Research on an Optimized Leakage Locating Model in Water Distribution System
- Author
-
Wenyan Wu, Jinliang Gao, Y. Qiao, M. Tu, J. Wang, and S. Qi
- Subjects
Bayesian Decision Theory, Genetic Algorithms, Leakage locating, model calibration, water distribution networks, Distribution networks, Calibration (statistics), Leakage (electronics), Engineering
- Abstract
The paper investigates an optimized leakage locating model combining Genetic Algorithms and Bayesian Decision Theory in water distribution networks. Both leakage detection models, based respectively on model calibration and on Bayesian Decision Theory, are described in the paper and contribute to the establishment of the optimized model. A numerical example network was used to evaluate the optimized model, demonstrating its detection accuracy and locating efficiency.
- Published
- 2014
28. Multi-SOM: an Algorithm for High-Dimensional, Small Size Datasets
- Author
-
Shen Lu and Richard S. Segall
- Subjects
Bayesian Decision Theory, Self-Organizing Maps, Sample Selection, Weights Vector, Pattern recognition
- Abstract
Since experiments in bioinformatics take time, biological datasets are sometimes small yet high-dimensional. Probability theory tells us that to discover knowledge from a set of data we need a sufficient number of samples; otherwise, the error bounds become too large to be useful. For the SOM (Self-Organizing Map) algorithm, the initial map is based on the training data. To avoid the bias caused by insufficient training data, in this paper we present an algorithm, called Multi-SOM, which builds a number of small self-organizing maps instead of one big map. Bayesian decision theory is used to make the final decision among similar neurons on different maps. In this way, we obtain a truly random initial weight vector set, the map size becomes less of a concern, and errors tend to average out. In our experiments on microarray datasets, which are dense datasets composed of genetics-related information, the precision of Multi-SOM is 10.58% greater than that of SOM, and its recall is 11.07% greater. Thus, the Multi-SOM algorithm is practical.
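A minimal sketch of one plausible reading of the final Bayesian vote across maps, where each small SOM contributes the class counts of its winning neuron; the counts and smoothing are illustrative assumptions, not the authors' implementation:

```python
# Combine the winning neurons of several small maps with a Bayesian vote.
import numpy as np

# Class counts in the best-matching unit of each of three small maps
# for one test sample (illustrative numbers).
bmu_counts = [np.array([5, 1]), np.array([3, 2]), np.array([1, 4])]
prior = np.array([0.5, 0.5])

log_post = np.log(prior)
for counts in bmu_counts:
    p = (counts + 1) / (counts.sum() + len(counts))  # smoothed P(class | BMU)
    log_post += np.log(p)
print("predicted class:", int(np.argmax(log_post)))
```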
- Published
- 2013
29. Variable Order Transition Probability Markov Decision Process for the Recommendation System
- Subjects
recommendation ,reinforcement learning ,Bayesian decision theory ,Markov decision process
We propose a method for deriving, from historical data, recommendation rules that are optimal under a Bayesian criterion for a generalized Markov decision process model of the recommendation problem. Previous research on recommendation has rarely considered either which item was purchased as the result of a recommendation (the recommendation outcome) or the multiple recommendation outcomes accumulated within a given period. A method based on a Markov decision process model that takes both into account has been proposed; our method also uses both. The novelty of this paper is twofold: we generalize the existing model, and we rigorously formulate the process of deriving recommendation rules as a statistical decision problem. Generalizing the existing model widens the range of applications of Markov-decision-process-based recommendation and makes the recommendations optimal with respect to the purpose of the recommendation. Experiments on artificial data confirm that the proposed method obtains more reward than the conventional method.
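A minimal value-iteration sketch of a Markov-decision-process recommender of this general flavor, with states as the last purchased item and actions as the item recommended; the transition probabilities and rewards are toy assumptions, and the paper's Bayesian estimation step is not reproduced:

```python
# Value iteration for a toy MDP recommender: state = last purchased item,
# action = item recommended. Transitions and profits are assumed.
import numpy as np

n = 3
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(n), size=(n, n))  # P[a, s] = dist. of next purchase
R = np.array([1.0, 2.0, 5.0])               # profit when each item is bought
gamma = 0.9                                 # discount factor

V = np.zeros(n)
for _ in range(200):
    Q = np.array([[P[a, s] @ (R + gamma * V) for a in range(n)]
                  for s in range(n)])
    V = Q.max(axis=1)
policy = Q.argmax(axis=1)   # best recommendation in each state
print(policy, V.round(2))
```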
- Published
- 2013
30. Decision-theoretic reflections on processing a fingermark
- Author
-
Franco Taroni, Silvia Bozza, Simone Gittelson, and Alex Biedermann
- Subjects
Value of information ,Bayesian decision theory ,Point (typography) ,Computer science ,Cost-Benefit Analysis ,Fingerprints ,Influence diagram ,Context (language use) ,Data science ,Field (computer science) ,Pathology and Forensic Medicine ,Decision Theory ,Workflow ,Humans ,Dermatoglyphics ,Laboratories ,Law - Abstract
A recent publication in this journal [1] presented the results of a field study that revealed the data that could be provided by fingermarks not processed in a forensic science laboratory. In their study, the authors examined the usefulness of these additional data in order to determine whether such fingermarks would have been worth submitting to the fingermark processing workflow. Taking these ideas as a starting point, this communication places the fingermark in the context of a case brought before a court and examines the question of processing or not processing a fingermark from a decision-theoretic point of view. The decision-theoretic framework presented answers this question in the form of a quantified expression of the expected value of information (EVOI) associated with the processed fingermark, which can then be compared with the cost of processing the mark.
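A minimal sketch of the resulting comparison, with toy utilities and probabilities standing in for the case-specific quantities the paper elicits:

```python
# Process the mark iff the expected value of information exceeds its cost.
p_id = 0.3                        # prob. processing yields a usable identification
u_with_id, u_no_id = 100.0, 0.0   # utilities of deciding with/without an ID (assumed)
u_unprocessed = 10.0              # utility of deciding without the mark at all
cost = 20.0                       # cost of processing the mark

evoi = (p_id * u_with_id + (1 - p_id) * u_no_id) - u_unprocessed
print("process the mark" if evoi > cost else "do not process")
```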
- Published
- 2013
31. Random Sampling of Beef Cattle for Genetic Testing: Optimal Sample Size Determination
- Author
-
Thompson, Nathanael M., Brorsen, B. Wade, DeVuyst, Eric A., and Lusk, Jayson L.
- Subjects
Production Economics ,Bayesian decision theory ,sample size determination ,Livestock Production/Industries ,random sampling ,Farm Management ,beef cattle genetics - Abstract
Sample size is often dictated by budget and acceptable error bounds. However, there are many economic problems where sample size directly affects a benefit or loss function, and in these cases, sample size is an endogenous variable. We introduce an economic approach to sample size determination utilizing a Bayesian decision-theoretic framework that balances the expected costs and benefits of sampling using a Bayesian prior distribution for the unknown parameters. To demonstrate the method for a relevant applied economics problem, we turn to randomly sampling beef cattle for genetic testing. A theoretical model is developed, and several simplifying assumptions are made to solve the problem analytically. Data from 101 pens (2,796 animals) of commercially fed cattle are then used to evaluate this solution empirically. Results indicate that at the baseline parameter values an optimal sample size of n^*=10 out of 100 animals generates returns from sampling of nearly $10/head, or a return-on-investment of 250%. Therefore, a large portion of the additional value for higher-quality cattle can be captured by testing a relatively small percentage of the lot. These results vary depending on the actual quality (or profitability) of a particular pen of cattle, the homogeneity within the pen, the variance of the buyer's subjective prior distribution of expected profit, and the per-head cost of genetic testing. Nonetheless, results suggest that random sampling has the potential to provide a context in which the benefits of genetic testing outweigh the costs, which has not generally been the case in previous research.
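A minimal sketch of the trade-off under a stylized normal-normal model (all constants assumed, not the paper's structural model), in which the optimal n maximizes the value of posterior-variance reduction net of testing costs:

```python
# Choose n to maximize expected net benefit of sampling.
import numpy as np

sigma2_prior = 25.0      # buyer's prior variance on pen mean quality, ($/head)^2
sigma2_within = 100.0    # within-pen variance
cost_per_test = 0.5      # genetic test cost per head ($, assumed)
value_per_var = 4.0      # $ value per unit reduction in posterior variance (assumed)

def net_benefit(n):
    post_var = 1.0 / (1.0 / sigma2_prior + n / sigma2_within)
    return value_per_var * (sigma2_prior - post_var) - cost_per_test * n

ns = np.arange(0, 101)
best = ns[np.argmax([net_benefit(n) for n in ns])]
print("optimal sample size:", int(best))
```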
- Published
- 2016
- Full Text
- View/download PDF
32. A BAYESIAN DECISION THEORETIC APPROACH TO FIXED SAMPLE SIZE DETERMINATION AND BLINDED SAMPLE SIZE RE-ESTIMATION FOR HYPOTHESIS TESTING
- Subjects
Blinded Sample Size Re-estimation ,Statistics ,FOS: Mathematics ,Intrinsic Loss Function ,Biostatistics ,Bayesian Decision Theory - Abstract
This thesis considers two related problems that have applications in the field of experimental design for clinical trials: (i) fixed sample size determination for parallel-arm, double-blind survival data analysis to test the hypothesis of no difference in survival functions, and (ii) blinded sample size re-estimation for the same. For the first problem, a method is developed for hypothesis testing in general and then applied to survival analysis in particular; for the second, a method is developed specifically for survival analysis. In both problems, the exponential survival model is assumed. The approach we propose for sample size determination is Bayesian decision-theoretic, using an explicit loss function and prior distribution. The loss function used is the intrinsic discrepancy loss function introduced by Bernardo and Rueda (2002) and further expounded upon in Bernardo (2011). We use a conjugate prior and investigate the sensitivity of the calculated sample sizes to the specification of the hyper-parameters. For the second problem, we use prior predictive distributions to facilitate calculating the interim test statistic in a blinded manner while controlling the Type I error; determining the test statistic in a blinded manner remains a nettling problem for researchers. The first problem is typical of traditional experimental designs, while the second extends into the realm of adaptive designs. To the best of our knowledge, the approaches we suggest for both problems have not been tried hitherto and extend the current research on both topics. The advantages of our approach, as we see it, are the unity and coherence of the statistical procedures, the systematic and methodical incorporation of prior knowledge, and ease of calculation and interpretation.
- Published
- 2016
- Full Text
- View/download PDF
33. The database search problem: A question of rational decision making
- Author
-
Franco Taroni, Alex Biedermann, Silvia Bozza, and Simone Gittelson
- Subjects
Database search ,Bayesian decision theory ,Decision engineering ,Computer science ,Management science ,Decision tree ,Evidential reasoning approach ,Information Storage and Retrieval ,Decision field theory ,Decision rule ,Forensic Medicine ,Evidential decision theory ,Pathology and Forensic Medicine ,Evidential value ,Influence diagrams ,Decision Theory ,Databases as Topic ,Humans ,Influence diagram ,Law ,Probability ,Decision analysis - Abstract
This paper applies probability and decision theory in the graphical interface of an influence diagram to study the formal requirements of rationality which justify the individualization of a person found through a database search. The decision-theoretic part of the analysis studies the parameters that a rational decision maker would use to individualize the selected person. The modeling part (in the form of an influence diagram) clarifies the relationships between this decision and the ingredients that make up the database search problem, i.e., the results of the database search and the different pairs of propositions describing whether an individual is at the source of the crime stain. These analyses evaluate the desirability associated with the decision of 'individualizing' (and 'not individualizing'). They point out that this decision is a function of (i) the probability that the individual in question is, in fact, at the source of the crime stain (i.e., the state of nature), and (ii) the decision maker's preferences among the possible consequences of the decision (i.e., the decision maker's loss function). We discuss the relevance and argumentative implications of these insights with respect to recent comments in specialized literature, which suggest points of view that are opposed to the results of our study.
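A minimal sketch of the threshold form such an analysis yields under a two-consequence loss table (values assumed): individualize exactly when the posterior probability that the selected person is the source exceeds a ratio fixed by the losses:

```python
# Decision rule: minimize expected loss over {individualize, do not}.
L_false_individualization = 100.0   # loss of individualizing a non-source (assumed)
L_missed_individualization = 1.0    # loss of not individualizing the source (assumed)

def decide(p_source):
    # Individualize iff (1 - p) * L_FI < p * L_MI, i.e. p > L_FI / (L_FI + L_MI).
    threshold = L_false_individualization / (L_false_individualization
                                             + L_missed_individualization)
    return "individualize" if p_source > threshold else "do not individualize"

print(decide(0.999), "|", decide(0.95))
```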
- Published
- 2012
34. Decision-theoretic models of visual perception and action
- Author
-
Hang Zhang and Laurence T. Maloney
- Subjects
Signal Detection, Psychological ,Statistical decision theory ,Visual perception ,Bayesian decision theory ,Bayesian probability ,Ideal observer models ,Likelihood ,Poison control ,Prior ,Bayesian inference ,Gain function ,Decision Theory ,Statistics ,Prior probability ,Humans ,Likelihood Functions ,Bayes estimator ,Models, Statistical ,Ideal (set theory) ,business.industry ,Bayes Theorem ,Loss function ,Sensory Systems ,Bayesian statistics ,Ophthalmology ,Action ,Visual Perception ,Perception ,Artificial intelligence ,business ,Psychology ,Psychomotor Performance - Abstract
Statistical decision theory (SDT) and Bayesian decision theory (BDT) are closely related mathematical frameworks used to model ideal performance in a wide range of visual and motor tasks. Their elements (gain function, likelihood, prior) are readily interpretable in terms of information available to the observer. We briefly describe SDT and BDT and then review recent work employing them as models of biological perception or action. We emphasize work that employs gain functions and priors as independent or dependent variables. At one extreme, Bayesian decision theory allows the experimenter to compute ideal performance in specific tasks and compare human performance to ideal (Geisler, 1989); no claim is made that visual processing is in any sense "Bayesian". At the other extreme, researchers have proposed Bayesian decision theory as a process model of "perception as Bayesian inference" (Knill & Richards, 1996). We end by discussing how possible ideal models are related to imperfect, actual observers and how the "Bayesian hypothesis" can be tested experimentally.
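A minimal sketch of a BDT observer built from the three named elements (prior, likelihood, gain function), with an illustrative discrete observation space:

```python
# Bayesian-decision-theory observer: prior x likelihood -> posterior,
# then the action maximizing expected gain. All numbers illustrative.
import numpy as np

prior = np.array([0.7, 0.3])             # P(state): e.g. signal absent / present
likelihood = np.array([[0.6, 0.3, 0.1],  # P(observation | state = 0)
                       [0.1, 0.3, 0.6]]) # P(observation | state = 1)
gain = np.array([[1.0, 0.0],             # gain[action, state]: reward correct
                 [0.0, 1.0]])            # responses, nothing otherwise

x = 2                                    # the observation actually made
posterior = prior * likelihood[:, x]
posterior /= posterior.sum()
expected_gain = gain @ posterior         # expected gain of each action
print(int(np.argmax(expected_gain)), posterior.round(3))
```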
- Published
- 2010
35. Optimal Stopping Policy for Multivariate Sequences; a Generalized Best Choice Problem
- Author
-
M. Modarres and M.J. Samieenia
- Subjects
Asset allocation ,Bayesian decision theory ,lcsh:T ,Stochastic dynamic programming ,Mixed models ,lcsh:Technology ,Optimal stopping rule ,Best choice problem - Abstract
In the classical versions of the "Best Choice Problem", the sequence of offers is a random sample from a single known distribution. We present an extension in which the sequential offers are random variables drawn from multiple independent distributions, each distribution function representing a class of investment or offers. Offers appear without any specified order. The objective is to accept the best offer. After observing each offer, the decision maker must accept or reject it; rejected offers cannot be recalled. We consider both the case of known and of unknown parameters of the distribution function of the next offer's class. Two optimality criteria are considered: maximizing the expected value of the accepted offer, or maximizing the probability of obtaining the best offer. We develop stochastic dynamic programming models for several possible problems, depending on the assumptions, and prove an optimal policy for the monotone case under both criteria. We also show that the optimal policy for a mixed sequence is similar to the one in which offers come from a single density.
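A minimal backward-induction sketch for the expected-value criterion with known class distributions; the discretized offer distributions and class schedule are illustrative assumptions:

```python
# Backward induction: accept the offer at position t iff it is at least the
# expected value of continuing optimally from position t + 1.
import numpy as np

# Two classes of offers with discretized supports and probabilities.
supports = [np.array([1.0, 2.0, 3.0]), np.array([0.0, 2.5, 5.0])]
probs    = [np.array([0.2, 0.5, 0.3]), np.array([0.4, 0.4, 0.2])]
classes  = [0, 1, 0, 1]          # known class of each sequential offer

V = 0.0                          # value of holding nothing after the last offer
thresholds = []
for t in reversed(range(len(classes))):
    c = classes[t]
    thresholds.append(V)         # accept offer x at position t iff x >= V
    V = float(np.sum(probs[c] * np.maximum(supports[c], V)))
thresholds.reverse()
print("accept-if-at-least thresholds:", [round(v, 2) for v in thresholds])
```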
- Published
- 2010
36. The Role of ICT in Improving Sequential Decisions for Water Management in Agriculture
- Author
-
Francesco Galioto, Meri Raggi, Davide Viaggi, Francesco Cavazza, Cavazza, Francesco, Galioto, Francesco, Raggi, Meri, and Viaggi, Davide
- Subjects
lcsh:Hydraulic engineering ,Bayesian decision theory ,010504 meteorology & atmospheric sciences ,Geography, Planning and Development ,Control (management) ,010501 environmental sciences ,Aquatic Science ,01 natural sciences ,Biochemistry ,lcsh:Water supply for domestic and industrial purposes ,lcsh:TC1-978 ,water management ,0105 earth and related environmental sciences ,Water Science and Technology ,2. Zero hunger ,lcsh:TD201-500 ,business.industry ,Specific-information ,15. Life on land ,Environmental economics ,Irrigated agriculture ,6. Clean water ,climate change ,13. Climate action ,Agriculture ,Information and Communications Technology ,ICT ,irrigated agriculture ,Strategic management ,ICTS ,Business ,Decision process - Abstract
Numerous Information and Communication Technologies (ICTs) applications have been developed in irrigated agriculture. While there are studies focusing on ICTs impacts at the farm level, no research deals with this issue at the Water Authority (WA) level, where ICTs can support strategic decisions on land and water allocation. The present study aims to design a theoretical model to estimate economic benefits from the ICT-informed decision process of water management in agriculture. Specifically, the study analyzes the motivations driving a case study WA using ICTs to support strategic management decisions involving risky choices. Results show that the WA under investigation has the potential to save water and to implement adaptation strategies to climate change. Higher benefits from ICTs are attainable in areas with limited water availability and where the WA can effectively manage land allocation and control water delivery volumes. The study concludes that ICTs might have a disruptive potential in fulfilling the WA's specific information needs, but there is still a need to improve their accuracy due to the risk surrounding the decisions at stake.
- Published
- 2018
37. A structural design of clinical decision support system for chronic diseases risk management
- Author
-
Chuen-Sheng Cheng and Chi-Chang Chang
- Subjects
Bayes estimator ,Management science ,business.industry ,Evidential reasoning approach ,Decision tree ,General Medicine ,Bayesian inference ,Clinical decision support system ,clinical decision support system (cdss) ,(nhpp) chronic diseases risk management ,Risk analysis (engineering) ,bayesian decision theory ,nonhomogeneous poisson process ,Medicine ,business ,Risk management ,Optimal decision ,Decision analysis - Abstract
In clinical decision making, the event of primary interest is recurrent, so that for a given unit the event could be observed more than once during the study. In general, the successive times between failures of human physiological systems are not necessarily identically distributed. However, if any critical deterioration is detected, then the decision of when to take the intervention, given the costs of diagnosis and therapeutics, is of fundamental importance. This paper develops a possible structural design of a clinical decision support system (CDSS) by considering sensitivity analysis as well as the optimal prior and posterior decisions for chronic diseases risk management. Bayesian inference for a nonhomogeneous Poisson process with three different failure models (linear, exponential, and power law) was considered, and the effects of the scale factor and the aging rate of these models were investigated. In addition, we illustrate our method with an analysis of data from a trial of immunotherapy in the treatment of chronic granulomatous disease. The proposed structural design of the CDSS facilitates the effective use of the computing capability of computers and provides a systematic way to integrate experts' opinions and sampling information, furnishing decision makers with valuable support for quality clinical decision making.
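A minimal sketch of the intervention-timing logic for the power-law intensity model, using the standard NHPP mean function m(t) = alpha * t**beta and an assumed cost structure (a stand-in for the paper's full prior/posterior analysis):

```python
# Intervene when the long-run average cost rate of periodic restoration
# is minimized, for intensity lambda(t) = alpha * beta * t**(beta - 1).
import numpy as np

alpha, beta = 0.5, 1.8            # scale factor and aging rate (assumed)
c_event, c_intervene = 5.0, 40.0  # cost per deterioration event / per intervention

ts = np.linspace(0.5, 20.0, 400)
# Expected events by time t is m(t) = alpha * t**beta; restore every t units.
avg_cost = (c_event * alpha * ts ** beta + c_intervene) / ts
print("intervene roughly every t =", round(float(ts[np.argmin(avg_cost)]), 2))
```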
- Published
- 2007
38. Optimal intervention for an epidemic model under parameter uncertainty
- Author
-
Nathan Green and Damian Clancy
- Subjects
Statistics and Probability ,Mathematical optimization ,Bayesian decision theory ,Bayesian probability ,Population ,Dynamic programming ,Communicable Diseases ,Mass Vaccination ,Models, Biological ,General Biochemistry, Genetics and Molecular Biology ,Disease Outbreaks ,Patient Isolation ,Bayes' theorem ,Econometrics ,Humans ,education ,Ecology, Evolution, Behavior and Systematics ,Mathematics ,Stochastic Processes ,Bayes estimator ,education.field_of_study ,Agricultural and Biological Sciences(all) ,General Immunology and Microbiology ,Markov chain ,Stochastic process ,Applied Mathematics ,Bayes Theorem ,General Medicine ,Markov Chains ,Immunisation policies ,Modeling and Simulation ,Communicable Disease Control ,General stochastic epidemic ,General Agricultural and Biological Sciences ,Epidemic model ,Algorithms - Abstract
We will be concerned with optimal intervention policies for a continuous-time stochastic SIR (susceptible → infective → removed) model for the spread of infection through a closed population. In previous work on such optimal policies, it is common to assume that model parameter values are known; in reality, uncertainty over parameter values exists. We shall consider the effect upon the optimal policy of changes in parameter estimates, and of explicitly taking into account parameter uncertainty via a Bayesian decision-theoretic framework. We consider policies allowing for (i) the isolation of any number of infectives, or (ii) the immunisation of all susceptibles (total immunisation). Numerical examples are given to illustrate our results.
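A minimal sketch of the Bayesian decision-theoretic comparison, averaging each policy's cost over prior draws of the infection rate rather than plugging in a point estimate; the discrete-time SIR simulator, the policy pair, and all costs are toy assumptions:

```python
# Compare policies by expected cost over a prior on the infection rate beta.
import numpy as np

rng = np.random.default_rng(2)
N, I0 = 200, 2

def final_size(beta, gamma=1.0):
    # Crude discrete-time stochastic SIR; returns the total number infected.
    s, i = N - I0, I0
    while i > 0:
        new_inf = rng.binomial(s, 1 - np.exp(-beta * i / N))
        new_rec = rng.binomial(i, 1 - np.exp(-gamma))
        s, i = s - new_inf, i + new_inf - new_rec
    return N - s

c_case, c_vacc = 10.0, 1.0
betas = rng.gamma(shape=4.0, scale=0.5, size=300)   # prior draws of beta

cost_do_nothing = np.mean([c_case * final_size(b) for b in betas])
cost_immunise = c_vacc * (N - I0)                   # immunise every susceptible now
print(round(float(cost_do_nothing), 1), cost_immunise)
```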
- Published
- 2007
39. Statistical contributions to code calibration and validation
- Author
-
Damblin, Guillaume, Mathématiques et Informatique Appliquées (MIA-Paris), AgroParisTech-Institut National de la Recherche Agronomique (INRA), AgroParisTech, and Éric Parent
- Subjects
[MATH.MATH-PR]Mathematics [math]/Probability [math.PR] ,Bayesian decision theory ,Bayesian calibration ,Bayesian model selection ,Validation of a computer code ,Gaussian process emulation
Code validation aims at assessing the uncertainty affecting the predictions of a physical system by using both the outputs of a computer code that attempts to reproduce it and the available field measurements. On the one hand, the code may not be a perfect representation of reality. On the other hand, some code parameters can be uncertain and need to be estimated: this issue is referred to as code calibration. After providing a unified view of the main procedures of code validation, we propose several contributions to these two problems when the code is treated as a costly black-box function. First, we develop a Bayesian testing procedure to detect whether or not a discrepancy function, called code discrepancy, has to be taken into account between the code outputs and the physical system. Second, we present new algorithms for building sequential designs of experiments in order to reduce the error occurring in a calibration process based on a Gaussian process emulator. Lastly, a validation procedure for a thermal code is conducted as the preliminary step of a decision problem in which an energy supplier must commit to an overall energy consumption forecast for its customers. Based on Bayesian decision theory, optimal plug-in estimators are computed.
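A minimal grid sketch of the calibration step for a single uncertain code parameter under a Gaussian error model; the thesis's Gaussian-process emulation and model-selection machinery are not reproduced:

```python
# Grid-based Bayesian calibration of one code parameter theta.
import numpy as np

def code(x, theta):               # stand-in for the expensive computer code
    return theta * x

x_obs = np.array([1.0, 2.0, 3.0])
y_obs = np.array([2.1, 3.9, 6.2])        # field measurements (assumed)
sigma = 0.2                              # measurement error std (assumed)

thetas = np.linspace(0.0, 4.0, 401)      # flat prior over a grid
loglik = np.array([-np.sum((y_obs - code(x_obs, t)) ** 2) / (2 * sigma ** 2)
                   for t in thetas])
post = np.exp(loglik - loglik.max())
post /= post.sum()
print("posterior mean of theta:", round(float(thetas @ post), 3))
```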
- Published
- 2015
41. Sequential search strategies based on kriging
- Author
-
Vazquez, Emmanuel, Méthodes d'Analyse Stochastique des Codes et Traitements Numériques (GdR MASCOT-NUM), Centre National de la Recherche Scientifique (CNRS), Laboratoire des signaux et systèmes (L2S), Université Paris-Sud - Paris 11 (UP11)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS), Université Paris-Sud, and Sebag, Michele
- Subjects
Bayesian decision theory ,[STAT.ML]Statistics [stat]/Machine Learning [stat.ML] ,[MATH.MATH-ST]Mathematics [math]/Statistics [math.ST] ,ComputingMethodologies_DOCUMENTANDTEXTPROCESSING ,Gaussian processes ,Sequential planning ,Bayesian optimization ,[STAT.CO]Statistics [stat]/Computation [stat.CO]
This manuscript has been written to obtain the French Habilitation à Diriger des Recherches. It is not intended to provide new academic results nor should it be considered as a reference textbook. Instead, this manuscript is a brief (and incomplete) summary of my teaching and research activities. You will find in this manuscript a compilation of some articles in which I had a significant contribution, together with some introductory paragraphs about sequential search strategies based on kriging.
- Published
- 2015
42. Building Domain Specific Sentiment Lexicons Combining Information from Many Sentiment Lexicons and a Domain Specific Corpus
- Author
-
Aleksander Bai, Anis Yazidi, Paal E. Engelstad, and Hugo Lewi Hammer
- Subjects
Bayesian decision theory ,Sentiment classification ,business.industry ,Computer science ,media_common.quotation_subject ,Sentiment analysis ,Cross-domain ,Matematikk og Naturvitenskap: 400::Matematikk: 410::Anvendt matematikk: 413 [VDP] ,Teknologi: 500::Informasjons- og kommunikasjonsteknologi: 550::Annen informasjonsteknologi: 559 [VDP] ,computer.software_genre ,Lexicon ,Domain (software engineering) ,Sentiment lexicon ,Order (business) ,Quality (business) ,Artificial intelligence ,business ,Random variable ,computer ,Assignment problem ,Natural language processing ,Word (computer architecture) ,media_common - Abstract
Most approaches to sentiment analysis require a sentiment lexicon in order to automatically predict sentiment or opinion in a text. The lexicon is generated by selecting words and assigning them scores, and the performance of the sentiment analysis depends on the quality of the assigned scores. This paper addresses an aspect of sentiment lexicon generation that has been overlooked so far, namely that the most appropriate score for a word in the lexicon depends on the domain. The common practice, on the contrary, is to use the same lexicon without adjustment across different domains, ignoring the fact that the scores are normally highly sensitive to the domain. Consequently, the same lexicon might perform well on one domain while performing poorly on another, unless some score adjustment is performed. In this paper, we advocate that a sentiment lexicon needs further adjustment in order to perform well in a specific domain. To cope with these domain-specific adjustments, we adopt a stochastic formulation of the sentiment score assignment problem instead of the classical deterministic formulation: viewing a sentiment score as a stochastic variable permits us to accommodate domain-specific adjustments. Experimental results demonstrate the feasibility of our approach and its superiority to generic lexicons without domain adjustments.
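A minimal sketch of the stochastic-score idea as normal-normal shrinkage of a generic lexicon score toward domain evidence; the estimator and all numbers are assumptions, not the authors' method:

```python
# Treat a word's sentiment score as a random variable: combine the generic
# lexicon score (prior) with evidence estimated from the domain corpus.
generic_score, generic_var = 1.5, 1.0        # prior from generic lexicons
domain_mean, domain_var, n = -0.4, 2.0, 25   # evidence from the domain corpus

post_prec = 1.0 / generic_var + n / domain_var
post_score = (generic_score / generic_var + n * domain_mean / domain_var) / post_prec
print("domain-adjusted score:", round(post_score, 3))
```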
- Published
- 2015
43. Bayesian Dose-Response Modeling in Sparse Data
- Author
-
Kim, Steven
- Subjects
Hormesis ,Bayesian decision theory ,Sequential decisions ,Statistics ,Phase I clinical trials ,Dose-response modeling ,Adaptive designs - Abstract
This book discusses Bayesian dose-response modeling in small samples applied to two different settings. The first setting is early phase clinical trials, and the second is toxicology studies in cancer risk assessment. In early phase clinical trials, experimental units are humans who are actual patients. Prior to a clinical trial, opinions from multiple subject area experts are generally more informative than the opinion of a single expert, but we may face a dilemma when their prior opinions disagree. In this regard, we consider compromising the disagreement and compare two different approaches for making a decision. In addition to combining multiple opinions, we also address balancing two levels of ethics in early phase clinical trials. The first level is individual-level ethics, which reflects the perspective of trial participants; the second is population-level ethics, which reflects the perspective of future patients. We extensively compare two existing statistical methods, each of which focuses on one perspective, and propose a new method that balances the two conflicting perspectives. In toxicology studies, experimental units are living animals. Here we focus on a potentially non-monotonic dose-response relationship known as hormesis. Briefly, hormesis is a phenomenon characterized by a beneficial effect at low doses and a harmful effect at high doses. In cancer risk assessments, the estimation of a parameter known as a benchmark dose can be highly sensitive to a class of assumptions, monotonicity or hormesis. In this regard, we propose a robust approach that considers both monotonicity and hormesis as possibilities. In addition, we discuss statistical hypothesis testing for hormesis and consider various experimental designs for detecting hormesis based on Bayesian decision theory. Past experiments have not been optimally designed for testing for hormesis, and some Bayesian optimal designs may not be optimal under a wrong parametric assumption. In this regard, we consider a robust experimental design that does not require any parametric assumption.
- Published
- 2015
44. Whither Broad or Spatially Specific Fertilizer Recommendations?
- Author
-
Mkondiwa, Maxwell Gibson
- Subjects
Productivity Analysis ,Malawi ,Production Economics ,Bayesian decision theory ,spatial heterogeneity ,Crop Production/Industries ,spatial scale stochastic dominance - Abstract
Are spatially specific agricultural input use recommendations more profitable to smallholder farmers than broad recommendations? This paper provides a theoretical and empirical modeling procedure for determining the optimal spatial scale at which agricultural researchers can make soil fertility recommendations. Theoretically, the use of Bayesian decision theory in the spatial economic optimization model allows the complete characterization of the posterior distribution functions of profits, thereby taking spatial heterogeneity and uncertainty into account in the decision-making process. By applying first-order spatial-scale stochastic dominance and Jensen's inequality, both theoretically and empirically, this paper makes the case that spatially specific agricultural input use recommendations will always stochastically dominate broad recommendations for all non-decreasing profit functions, ignoring the quasi-fixed cost differentials in the decision itself. These findings are consistent with many economic studies that find precision agriculture technologies to be more profitable than conventional fertilizer application approaches based on regional or national recommendations. The modeling approach used in this study, however, provides an elegant theoretical justification for such results. In addition, seasonal heterogeneity in maize responses was evident in our results, demonstrating that broad recommendations may be wrong not only spatially but also seasonally. Further research on the empirical aspects of spatio-temporal instability of crop responses to fertilizer application, using multi-location and multi-season data, is needed to fully address the question posed initially. The decision-making theory developed here can, however, be extended to incorporate spatio-temporal heterogeneity and alternative risk preferences.
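The core inequality behind the dominance claim can be stated compactly: writing pi(x, s) for the profit from input rate x at site s, choosing x site-by-site can never do worse in expectation than any single broad rate (gross of the quasi-fixed costs the paper sets aside),

```latex
\[
\mathbb{E}_{s}\!\left[ \max_{x} \pi(x, s) \right]
\;\ge\;
\max_{x}\, \mathbb{E}_{s}\!\left[ \pi(x, s) \right].
\]
```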
- Published
- 2015
- Full Text
- View/download PDF
45. Estimating the Expected Value of Sample Information Using the Probabilistic Sensitivity Analysis Sample: A Fast, Nonparametric Regression-Based Method
- Author
-
Penny Breeze, Jeremy E. Oakley, Alan Brennan, and Mark Strong
- Subjects
Mathematical optimization ,Bayesian decision theory ,Monte Carlo method ,Posterior probability ,Sample (statistics) ,generalized additive model ,Statistics, Nonparametric ,Decision Support Techniques ,computational methods ,economic evaluation model ,Joint probability distribution ,Expected value of sample information ,Statistics ,Humans ,Mathematics ,Probability ,Bayes estimator ,Health Policy ,Decision Trees ,Monte Carlo methods ,Bayes Theorem ,Original Articles ,Nonparametric regression ,nonparametric regression ,Regression Analysis ,Decision model ,Monte Carlo Method ,expected value of sample information - Abstract
Health economic decision-analytic models are used to estimate the expected net benefits of competing decision options. The true values of the input parameters of such models are rarely known with certainty, and it is often useful to quantify the value to the decision maker of reducing uncertainty through collecting new data. In the context of a particular decision problem, the value of a proposed research design can be quantified by its expected value of sample information (EVSI). EVSI is commonly estimated via a 2-level Monte Carlo procedure in which plausible data sets are generated in an outer loop, and then, conditional on these, the parameters of the decision model are updated via Bayes rule and sampled in an inner loop. At each iteration of the inner loop, the decision model is evaluated. This is computationally demanding and may be difficult if the posterior distribution of the model parameters conditional on sampled data is hard to sample from. We describe a fast nonparametric regression-based method for estimating per-patient EVSI that requires only the probabilistic sensitivity analysis sample (i.e., the set of samples drawn from the joint distribution of the parameters and the corresponding net benefits). The method avoids the need to sample from the posterior distributions of the parameters and avoids the need to rerun the model. The only requirement is that sample data sets can be generated. The method is applicable with a model of any complexity and with any specification of model parameter distribution. We demonstrate in a case study the superior efficiency of the regression method over the 2-level Monte Carlo method.
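A minimal sketch of the estimator on a stylized two-option problem, with an ordinary polynomial fit standing in for the paper's generalized additive model:

```python
# Regression-based EVSI: regress each option's net benefit on a summary
# statistic of the simulated data, then compare max-of-fitted to fitted-of-max.
import numpy as np

rng = np.random.default_rng(3)
K = 5000
theta = rng.normal(0.1, 0.2, K)          # PSA draws of the uncertain parameter
nb = np.stack([np.zeros(K),              # net benefit of option 0
               1000 * theta], axis=1)    # net benefit of option 1

n = 50                                   # proposed study size per PSA draw
summary = rng.normal(theta, 1.0 / np.sqrt(n))   # summary statistic of each data set

fitted = np.stack([np.polyval(np.polyfit(summary, nb[:, d], 2), summary)
                   for d in range(2)], axis=1)

evsi = fitted.max(axis=1).mean() - nb.mean(axis=0).max()
print("EVSI per patient:", round(float(evsi), 2))
```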
- Published
- 2014
46. Toward a New Application of Real-Time Electrophysiology: Online Optimization of Cognitive Neurosciences Hypothesis Testing
- Author
-
Jérémie Mattout, Olivier F. Bertrand, Emmanuel Maby, Gaëtan Sanchez, Aline Elisabeth Dominque Bompas, Jean Daunizeau, Centre de Recherche de l'Institut du Cerveau et de la Moelle épinière (CRICM), Centre National de la Recherche Scientifique (CNRS)-Institut National de la Santé et de la Recherche Médicale (INSERM)-Université Pierre et Marie Curie - Paris 6 (UPMC), Centre de recherche en neurosciences de Lyon (CRNL), Université Claude Bernard Lyon 1 (UCBL), Université de Lyon-Université de Lyon-Université Jean Monnet [Saint-Étienne] (UJM)-Institut National de la Santé et de la Recherche Médicale (INSERM)-Centre National de la Recherche Scientifique (CNRS), School of Psychology, Université Pierre et Marie Curie - Paris 6 (UPMC)-Institut National de la Santé et de la Recherche Médicale (INSERM)-Centre National de la Recherche Scientifique (CNRS), Centre de recherche en neurosciences de Lyon - Lyon Neuroscience Research Center (CRNL), and Université de Lyon-Université de Lyon-Université Jean Monnet - Saint-Étienne (UJM)-Institut National de la Santé et de la Recherche Médicale (INSERM)-Centre National de la Recherche Scientifique (CNRS)
- Subjects
Computer science ,Bayesian probability ,Cognitive neuroscience ,Bayesian inference ,050105 experimental psychology ,Article ,lcsh:RC321-571 ,03 medical and health sciences ,brain-computer interfaces ,cognitive neuroscience ,0302 clinical medicine ,adaptive design optimization ,hypothesis testing ,real-time electrophysiology ,Bayesian model comparison ,Bayesian Decision Theory ,generative models of brain functions ,0501 psychology and cognitive sciences ,lcsh:Neurosciences. Biological psychiatry. Neuropsychiatry ,Statistical hypothesis testing ,Brain–computer interface ,business.industry ,General Neuroscience ,Model selection ,[SCCO.NEUR]Cognitive science/Neuroscience ,05 social sciences ,Cognition ,Statistical model ,Artificial intelligence ,business ,030217 neurology & neurosurgery - Abstract
Brain-computer interfaces (BCIs) mostly rely on electrophysiological brain signals. Methodological and technical progress has largely solved the challenge of processing these signals online. The main issue that remains, however, is the identification of a reliable mapping between electrophysiological measures and relevant states of mind. This is why BCIs are highly dependent upon advances in cognitive neuroscience and neuroimaging research. Recently, psychological theories became more biologically plausible, leading to more realistic generative models of psychophysiological observations. Such complex interpretations of empirical data call for efficient and robust computational approaches that can deal with statistical model comparison, such as approximate Bayesian inference schemes. Importantly, the latter enable the optimization of a model selection error rate with respect to experimental control variables, yielding maximally powerful designs. In this paper, we use a Bayesian decision theoretic approach to cast model comparison in an online adaptive design optimization procedure. We show how to maximize design efficiency for individual healthy subjects or patients. Using simulated data, we demonstrate the face- and construct-validity of this approach and illustrate its extension to electrophysiology and multiple hypothesis testing based on recent psychophysiological models of perception. Finally, we discuss its implications for basic neuroscience and BCI itself.
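A minimal sketch of scoring candidate designs by simulated model-recovery rate, a toy stand-in for the Bayesian design-efficiency criterion described above (two toy models, Gaussian noise; all names are assumptions):

```python
# Score each candidate design by how often simulated data recover the
# generating model; pick the design with the highest recovery rate.
import numpy as np

rng = np.random.default_rng(4)

def predict(model, design):          # each model's predicted mean response
    return design if model == 0 else design ** 2

def design_utility(design, n_sim=2000, sigma=1.0):
    correct = 0.0
    for m in (0, 1):                 # simulate data under each model in turn
        y = predict(m, design) + sigma * rng.normal(size=n_sim)
        ll0 = -(y - predict(0, design)) ** 2
        ll1 = -(y - predict(1, design)) ** 2
        correct += np.mean((ll1 > ll0) == (m == 1))
    return correct / 2               # average model-recovery rate

for d in (0.5, 1.0, 2.0):
    print(d, round(design_utility(d), 3))   # designs that separate models win
```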
- Published
- 2014
- Full Text
- View/download PDF
47. Bayesian networks for probabilistic inference and decision analysis in forensic science
- Author
-
Colin Aitken, Silvia Bozza, Alex Biedermann, Franco Taroni, and Paolo Garbolino
- Subjects
Bayesian decision theory ,business.industry ,Computer science ,Bayesian networks ,Forensic science ,Bayesian network ,Probabilistic inference ,Machine learning ,computer.software_genre ,Bayesian statistics ,Frequentist inference ,Fiducial inference ,Influence diagram ,Artificial intelligence ,business ,computer ,Decision analysis - Published
- 2014
48. Decision analysis for the genotype designation in low-template-DNA profiles
- Author
-
Franco Taroni, Simone Gittelson, Silvia Bozza, and Alex Biedermann
- Subjects
Bayes estimator ,Genotype ,Models, Genetic ,Bayesian decision theory ,Bayes Theorem ,Influence diagram ,Replicate ,DNA ,Decision problem ,DNA Fingerprinting ,Pathology and Forensic Medicine ,Decision Support Techniques ,Electropherogram ,Low-level-DNA threshold ,Statistics ,Genetics ,Probability distribution ,Humans ,Algorithm ,Expected loss ,Alleles ,Decision analysis ,Mathematics - Abstract
What genotype should the scientist specify for conducting a database search to try to find the source of a low-template-DNA (lt-DNA) trace? When the scientist answers this question, he or she makes a decision. Here, we approach this decision problem from a normative point of view by defining a decision-theoretic framework for answering this question for one locus. This framework combines the probability distribution describing the uncertainty over the trace's donor's possible genotypes with a loss function describing the scientist's preferences concerning false exclusions and false inclusions that may result from the database search. According to this approach, the scientist should choose the genotype designation that minimizes the expected loss. To illustrate the results produced by this approach, we apply it to two hypothetical cases: (1) the case of observing one peak for allele x_i on a single electropherogram, and (2) the case of observing one peak for allele x_i on one replicate, and a pair of peaks for alleles x_i and x_j, i ≠ j, on a second replicate. Given that the probabilities of allele drop-out are defined as functions of the observed peak heights, the threshold values marking the turning points when the scientist should switch from one designation to another are derived in terms of the observed peak heights. For each case, sensitivity analyses show the impact of the model's parameters on these threshold values. The results support the conclusion that the procedure should not focus on a single threshold value for making this decision for all alleles, all loci and in all laboratories.
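A minimal sketch of the core rule for one locus, with an assumed posterior over two candidate genotypes and an illustrative asymmetric loss table (not the paper's peak-height model):

```python
# Choose the designation minimizing posterior expected loss (one locus).
posterior = {"x_i,x_i": 0.55, "x_i,Q": 0.45}   # Q: another allele masked by drop-out

losses = {                                      # losses[designation][true genotype]
    "x_i,x_i": {"x_i,x_i": 0.0, "x_i,Q": 10.0},  # risks a false exclusion if wrong
    "x_i,Q":   {"x_i,x_i": 1.0, "x_i,Q": 0.0},   # risks a false inclusion if wrong
}

expected = {d: sum(posterior[g] * L[g] for g in posterior)
            for d, L in losses.items()}
print(min(expected, key=expected.get), expected)
```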
- Published
- 2014
49. The relatively small decline in orientation acuity as stimulus size decreases
- Author
-
J. Andrew Henrie and Robert Shapley
- Subjects
Adult ,Male ,Visual acuity ,Psychometrics ,Bayesian decision theory ,genetic structures ,Sinusoidal gratings ,media_common.quotation_subject ,Visual Acuity ,Differential Threshold ,Stimulus (physiology) ,Bayes' theorem ,Discrimination, Psychological ,Optics ,Gabor filter ,Orientation ,Perception ,Statistics ,medicine ,Humans ,Orientation acuity ,Size Perception ,Visual Cortex ,media_common ,Bayes estimator ,business.industry ,Bayes Theorem ,Middle Aged ,Sensory Systems ,Ophthalmology ,Logistic Models ,Visual cortex ,medicine.anatomical_structure ,medicine.symptom ,business ,Psychology - Abstract
Orientation acuity was measured with circular patches of sinusoidal gratings of various sizes. Threshold estimates were lowest (acuity highest) for the largest patch and increased as stimulus size was reduced, consistent with the results of many researchers using line stimuli. These results are compared with the predictions of a simple and widely accepted model of spatial vision in which the outputs of independent feed-forward filters are combined to produce threshold estimates. Specifically, the rectified outputs of a number of independent filters (i.e., Gabors) spanning the stimulus space (i.e., orientation) are combined via Bayesian decision theory. This model cannot quantitatively account for the relatively low thresholds estimated for the small stimuli compared to the thresholds measured with larger patches. A comparable analysis, with preliminary measurements of neuronal responses from primary visual cortex replacing the rectified Gabor filters' responses, provides a more reasonable account of behavioral acuity. This indicates a fundamental inadequacy of the feed-forward filter model in accounting for V1 neurons' role in perception.
- Published
- 2001
50. Applications of Bayesian Decision Theory to Sequential Mastery Testing
- Author
-
Hans J. Vos and Faculty of Behavioural, Management and Social Sciences
- Subjects
Bayesian decision theory ,Bayesian probability ,Dynamic programming ,Machine learning ,computer.software_genre ,01 natural sciences ,Education ,010104 statistics & probability ,0504 sociology ,Econometrics ,0101 mathematics ,Beta distribution ,Mathematics ,Structure (mathematical logic) ,Bayes estimator ,business.industry ,05 social sciences ,050401 social sciences methods ,Beta-binomial model ,Binomial distribution ,Bayesian statistics ,Monotonicity conditions ,Beta-binomial distribution ,Sequential mastery testing ,Artificial intelligence ,business ,computer ,Social Sciences (miscellaneous) - Abstract
The purpose of this paper is to formulate optimal sequential rules for mastery tests. The framework for the approach is derived from Bayesian sequential decision theory. Both threshold and linear loss structures are considered. The binomial probability distribution is adopted as the psychometric model. Conditions sufficient for sequentially setting optimal cutting scores are presented. Optimal sequential rules are derived for the case of a subjective beta distribution representing the prior for true level of functioning. An empirical example of sequential mastery testing for concept learning in medicine concludes the paper.
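A minimal sketch of one decision step of such a rule under threshold loss and a beta prior, with a crude one-step look-ahead standing in for the paper's full dynamic-programming solution (all constants illustrative):

```python
# One decision step of a sequential mastery rule (beta prior, threshold loss).
from scipy.stats import beta

a0, b0 = 2.0, 2.0             # beta prior on the true level of functioning
cutoff = 0.7                  # mastery cut-off on the proportion-correct scale
L_pass, L_fail = 10.0, 10.0   # losses: passing a non-master / failing a master
c_item = 1.0                  # cost of administering one more item (same scale)

def step(x, n):
    # Posterior is Beta(a0 + x, b0 + n - x) after x correct out of n items.
    p_master = 1.0 - beta.cdf(cutoff, a0 + x, b0 + n - x)
    risk_pass = (1.0 - p_master) * L_pass
    risk_fail = p_master * L_fail
    # Crude look-ahead: stop only when one terminal risk is already cheaper
    # than testing further.
    if min(risk_pass, risk_fail) > c_item:
        return "continue"
    return "pass" if risk_pass < risk_fail else "fail"

print(step(10, 10), step(2, 10), step(8, 10))   # e.g. pass, fail, continue
```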
- Published
- 1999