SYSTEMS engineering, SYSTEMS development, PRODUCTION engineering, ARCHITECTURAL designs
Abstract
During the review process for the paper "Architecting Portfolios of Systems" in this issue, one of the reviewers suggested that part of the reviewer‐author dialog be published because it clarified some issues beyond the paper itself and illustrated some different perspectives. This paper elaborates on three major topics: the categorization of different types of collections of systems; the concept of requisite modeling, that is, how we choose to model systems under development across a spectrum of rigor; and how to judge the success of research in systems engineering processes and methods. [ABSTRACT FROM AUTHOR]
Bolam, Friederike C., Grainger, Matthew J., Mengersen, Kerrie L., Stewart, Gavin B., Sutherland, William J., Runge, Michael C., and McGowan, Philip J. K.
Subjects
BIODIVERSITY conservation, ADAPTIVE natural resource management, DECISION making in science, ENVIRONMENTAL sciences, PUBLIC health
Abstract
Conservation decisions are challenging, not only because they often involve difficult conflicts among outcomes that people value, but because our understanding of the natural world and our effects on it is fraught with uncertainty. Value of Information (VoI) methods provide an approach for understanding and managing uncertainty from the standpoint of the decision maker. These methods are commonly used in other fields (e.g. economics, public health) and are increasingly used in biodiversity conservation. This decision‐analytical approach can identify the best management alternative to select where the effectiveness of interventions is uncertain, and can help to decide when to act and when to delay action until after further research. We review the use of VoI in the environmental domain, reflect on the need for greater uptake of VoI, particularly for strategic conservation planning, and suggest promising areas for new research. We also suggest common reporting standards as a means of increasing the leverage of this powerful tool. The environmental science, ecology and biodiversity categories of the Web of Knowledge were searched using the terms 'Value of Information,' 'Expected Value of Perfect Information,' and the abbreviation 'EVPI.' Google Scholar was searched with the same terms, and additionally the terms decision and biology, biodiversity conservation, fish, or ecology. We identified 1225 papers from these searches. Included studies were limited to those that showed an application of VoI in biodiversity conservation rather than simply describing the method. All examples of the use of VoI were summarised regarding the application of VoI, the management objectives, the uncertainties, the models used, how the objectives were measured, and the type of VoI.
While the use of VoI appears to be on the increase in biodiversity conservation, the reporting of results is highly variable, which can make it difficult to understand the decision context and which uncertainties were considered. Moreover, it was unclear if, and how, the papers informed management and policy interventions, which is why we suggest a range of reporting standards that would aid the use of VoI. The use of VoI in conservation settings is at an early stage. There are opportunities for broader applications, not only for species‐focussed management problems, but also for setting local or global research priorities for biodiversity conservation, making funding decisions, or designing or improving protected area networks and management. The long‐term benefits of applying VoI methods to biodiversity conservation include a more structured and decision‐focused allocation of resources to research. [ABSTRACT FROM AUTHOR]
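The central VoI quantity in studies like these, the expected value of perfect information (EVPI), can be sketched in a few lines. The actions, states, payoffs, and probabilities below are purely hypothetical illustrations, not data from the reviewed papers.

```python
import numpy as np

# Hypothetical payoff table: rows = management actions, columns = states of nature.
payoffs = np.array([
    [10.0, 2.0],   # intervene now
    [4.0, 8.0],    # delay and monitor
])
p = np.array([0.5, 0.5])  # prior probabilities of the states

# Value under current uncertainty: choose the action with the best expected payoff.
ev_prior = (payoffs @ p).max()

# Value with perfect information: learn the state first, then choose the best action.
ev_perfect = (payoffs.max(axis=0) * p).sum()

# EVPI bounds what resolving the uncertainty (e.g. via further research) is worth.
evpi = ev_perfect - ev_prior
```

Here `ev_prior` is 6.0 and `ev_perfect` is 9.0, so a decision maker should pay at most 3.0 (in payoff units) for research that fully resolves the uncertainty; this is the act-versus-delay trade-off the abstract describes.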
HEALTH care industry, DEBATE, ATTITUDE (Psychology), MEDICAL care, EVIDENCE-based medicine, PATIENT-centered care, MEDICAL personnel, CONFLICT (Psychology), DECISION making in clinical medicine
Abstract
Evidence‐based medicine (EBM), one of the most important movements in health care, has been a lightning rod for controversy. Conflicts about the meaning and value of EBM are owing in part to lack of clarity about basic questions regarding its development, the importance of expertise and intuition, and the role of evidence in clinical decision making. These issues have persisted in part because of unclarity at the outset, but also because of how EBM evolved, why it was introduced when it was, and how it was modified following its introduction. This paper traces the evolution of EBM from clinical epidemiology (CE) and the internal dispute that precipitated the developers to establish EBM as a distinct approach to clinical practice. The paper proposes that health care industrialization also had a significant role in EBM's emergence and that industrialization influenced the decision to merge EBM with the method of normative decision making known as decision analysis (DA). The paper discusses the impact of this merger, in particular how it led to EBM's identification with managed care and has added momentum to the effort at forging a connection between a normative decision model and clinical judgement. This effort would turn clinical decision making into a conduit for bringing administrative rules and regulations into the consulting room and would result in expertise becoming a surplus skill. The paper closes by discussing a challenge yet unmet by EBM's advocates and critics—to chronicle the dangers that EBM in the framework of DA during the current era of industrialization poses to health and health care, and discover ways of unhinging the relationship between model and judgement. [ABSTRACT FROM AUTHOR]
Some e‐platforms such as Amazon and JD.com are increasingly active in the financing business, but theoretical research on this business remains limited. This paper uses a game‐theoretic model to compare the equilibria of e‐platform financing (EPF) and bank financing (BF) in terms of pricing, quality, suppliers' competition, participants' profits, and consumer welfare. We find that a monopolistic supplier can always enjoy a lower loan interest rate and borrow more under EPF to improve production quantity or quality, which benefits both the participants and consumers. Interestingly, under some conditions the e‐platform provides free loans and profits only from the commission fee. When two suppliers compete for market share, the absolute advantage of EPF changes. If the supply chain or market environment is bad (production cost or commission rate is high; the potential market or initial budget is low), the e‐platform will offer lower interest rates than the bank to reduce the strength gap between the suppliers; thus, the capital‐constrained supplier prefers EPF, which shows that the EPF business is an effective means of regulating upstream competition. However, when there are more competing suppliers, the preferences of the suppliers with less initial capital are the same as in the two‐supplier scenario, while the preference of the supplier with more initial capital is just the opposite. Finally, if the commission rate is endogenous, we demonstrate that the e‐platform will offer free loans but charge a relatively high commission fee. [ABSTRACT FROM AUTHOR]
BUDGET management, SYSTEM of systems, SYSTEMS development, ARCHITECTURAL designs, DECISION making, SATELLITE meteorology
Abstract
A "portfolio‐of‐systems" is a collection of systems related by common management and budgeting, and at least some relationship to mission or missions. A portfolio‐of‐systems describes the not uncommon situation where an office is responsible for multiple systems in multiple phases of development but those systems are not related by common design, common manufacturing, or necessarily interconnected operationally. The relationship is fundamentally one of management, although other linkages might be present or designed if such additional linkages are deemed beneficial. The portfolio‐of‐systems concept differs from systems‐of‐systems and product‐line or family‐of‐systems constructs. This paper describes generalizations of architecting techniques for portfolio situations, particularly recognizing the presence of many largely independent components in multiple stages of development with an ongoing or episodic process. The paper presents two case studies of actual portfolios: One a portfolio of operational demonstration systems and the other a portfolio of weather satellites. [ABSTRACT FROM AUTHOR]
In commemorating the 40th anniversary of Risk Analysis, this article takes a retrospective look at some of the ways in which decision analysis (as a "sibling field") has contributed to the development both of the journal, and of risk analysis as a field. I begin with some early foundational papers from the first decade of the journal's history. I then review a number of papers that have applied decision analysis to risk problems over the years, including applications of related methods such as influence diagrams, multicriteria decision analysis, and risk matrices. The article then reviews some recent trends, from roughly the last five years, and concludes with observations about the parallel evolution of risk analysis and decision analysis over the decades—especially with regard to the importance of representing multiple stakeholder perspectives, and the importance of behavioral realism in decision models. Overall, the extensive literature surveyed here supports the view that the incorporation of decision‐analytic perspectives has improved the practice of risk analysis. [ABSTRACT FROM AUTHOR]
KNAPSACK problems, LINEAR programming, BACKPACKS, GREEDY algorithms, WATER-pipes
Abstract
Lead pipe remediation budgets are limited and ought to maximize public health impact. This goal implies a nontrivial optimization problem; lead service lines connect water mains to individual houses, but any realistic replacement strategy must batch replacements at a larger scale. Additionally, planners typically lack a principled method for comparing the relative public health value of potential interventions and often plan projects based on nonhealth factors. This paper describes a simple process for estimating child health impact at a parcel level by cleaning and synthesizing municipal datasets that are commonly available but seldom joined due to data quality issues. Using geocoding as the core record linkage mechanism, parcel‐level toxicity data can be combined with school enrollment records to indicate where young children and lead lines coexist. A harm metric of estimated exposure‐years is described at the parcel level, which can then be aggregated to the project level and minimized globally by posing project selection as a 0/1 knapsack problem. Simplifying for use by nonexperts, the implied linear programming relaxation is solved with the greedy algorithm; ordering projects by benefit cost ratio produces a priority list that planners can then consider holistically alongside harder to quantify factors. A case study demonstrates the successful application of this framework to a small U.S. city's existing data to prioritize federal infrastructure funding. [ABSTRACT FROM AUTHOR]
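The prioritization step described above, solving the linear programming relaxation of the 0/1 knapsack by ranking candidate projects on benefit-cost ratio, can be sketched as follows. The project names, harm estimates, and costs are hypothetical.

```python
# Hypothetical candidate projects: name -> (estimated exposure-years averted, cost).
projects = {
    "Elm St":  (120.0, 300_000),
    "Oak Ave": (60.0,  100_000),
    "Main St": (200.0, 800_000),
}
budget = 500_000

# Greedy solution to the LP relaxation of the 0/1 knapsack:
# rank by benefit-cost ratio, then fund down the list while the budget allows.
ranked = sorted(projects.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)

funded, spent = [], 0
for name, (benefit, cost) in ranked:
    if spent + cost <= budget:
        funded.append(name)
        spent += cost
```

The ranked list itself is the planner-facing artifact: projects are presented in priority order so that harder-to-quantify factors can still override the ordering holistically.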
Abstract
This paper presents a network-based approach for analyzing team coordination and shared cognition in engineering design teams. The research setting is an Integrated Concurrent Engineering (ICE) laboratory in which teams of approximately 20 engineers are collocated to conduct rapid conceptual design of scientific spacecraft. A design structure matrix (DSM) of expected interactions is constructed from technical information flow data, and DSM representations of reported interactions are created using survey data from 10 ICE design teams. A comparative analysis of expected and reported interactions is used to calculate a metric of team coordination called socio-technical congruence (STC). To examine shared cognition, pairwise shared mental models (SMMs) are measured using pre- and post-session surveys on system design drivers. Shared knowledge networks (SMMs as edges) are constructed, and team learning is measured as the change in network structure over time. Analysis reveals statistically significant correlations between team learning and each of three technical attributes (system development time, launch mass, and mission concept maturity) and between team learning and team coordination. These results indicate that team members learn most from each other when working on difficult or unfamiliar problems and when expected and reported interactions are aligned. The paper concludes that team coordination and the design product in ICE are not necessarily directly related to each other but that both are related to shared cognition. Although this study focuses on conceptual design, it lays the foundation for future work examining the role of team coordination and shared cognition in full-scale system development programs. [ABSTRACT FROM AUTHOR]
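One common formalization of socio-technical congruence is the fraction of expected interactions that are actually reported; the paper's exact definition may differ in detail, and the DSMs below are illustrative toy data for a four-person team.

```python
import numpy as np

# Toy DSMs for a 4-person team (True = interaction between members i and j).
# "expected" comes from technical information flow; "reported" from surveys.
expected = np.array([
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 0],
    [0, 1, 0, 0],
], dtype=bool)
reported = np.array([
    [0, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 0, 0],
], dtype=bool)

# Socio-technical congruence: share of expected interactions that were reported.
stc = (expected & reported).sum() / expected.sum()
```

A value of 1.0 would mean every interaction implied by the technical information flow actually occurred; lower values flag coordination gaps.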
Ekin, Tahir, Ieva, Francesca, Ruggeri, Fabrizio, and Soyer, Refik
Subjects
MEDICAL care costs, FRAUD, DECISION making, REGIONAL medical programs, BAYESIAN analysis
Abstract
Summary: Health care expenditures constitute a significant portion of governmental budgets. The percentage of fraud, waste and abuse within that spending has increased over the years. This paper introduces the emerging area of statistical medical fraud assessment, which is becoming crucial for handling the increasing size and complexity of medical programmes. An overview of fraud types and detection is followed by a description of medical claims data. The utilisation of sampling, overpayment estimation and data mining methods in medical fraud assessment is presented. Recent unsupervised methods are illustrated with real‐world data. Finally, the paper introduces potential future research areas, such as integrated decision making approaches and Bayesian methods, and concludes with an overall discussion. The main goal of this exposition is to increase awareness of this important area among a broader audience of statisticians. [ABSTRACT FROM AUTHOR]
Abstract: Due to increasingly strict environmental regulation of marine transportation, vessel operators and other stakeholders are required to evaluate feasible compliance measures in the face of multiple criteria and with attention to uncertainties and risks. Several methods and models within operations research have been applied to explore such decision contexts, but little is reported on the problem structure itself and the key values, concerns, and uncertainties that apply to them. The objective of the paper is to present a problem structure for acquisition of marine emission reduction technologies in the Norwegian ferry fleet drawing on methods from decision science and systems engineering. To attain this objective, we utilize the SPADE methodology, which details five problem‐solving activities covering stakeholders, problem formulation, alternatives, decisions, and continuous evaluation. Each activity is informed by data collected through stakeholder interviews and literature analysis to establish an initial representation of acquisition decision issues. To keep a consistent and traceable problem structure, we provide a stakeholder diagram, value network, systemigram, and decision hierarchy centered around stakeholders and their values. These models may serve to inform decision‐makers in the development and appraisal of emission reduction technologies and strategies. The paper demonstrates the application of systems engineering as a problem‐structuring framework for complex, multidimensional marine technology acquisition decisions. [ABSTRACT FROM AUTHOR]
DRUG laws, DECISION making, DECISION theory, QUANTITATIVE research
Abstract
The recent benefit–risk framework (BRF) developed by the Food and Drug Administration (FDA) is intended to improve clarity and consistency in communicating the reasoning behind the FDA's decisions, and represents an important advancement in US drug regulation. In the PDUFA VI implementation plan, the FDA states that it will continue to explore more structured or quantitative decision analysis approaches; however, it restricts their use within the current BRF, which is purely qualitative. By contrast, European regulators and researchers have long been exploring the use of quantitative decision analysis approaches for evaluating drug benefit–risk balance. In this paper, we show how quantitative modelling, backed by decision theory, could complement and extend the FDA's BRF to better support the appraisal of evidence and improve decision outcomes. After providing relevant scientific definitions for benefit–risk assessment and describing the FDA and European Medicines Agency (EMA) frameworks, we explain the components of and differences between qualitative and quantitative approaches. We present lessons learned from the EMA experience with the use of quantitative modelling and we provide evidence of its benefits, illustrated by a real case study that helped to resolve differences of judgement among EMA regulators. [ABSTRACT FROM AUTHOR]
Musalem, Andres, Montoya, Ricardo, Meißner, Martin, and Huber, Joel
Subjects
TASKS, DECISION making
Abstract
This paper identifies four attentional processes that increase efficiency and accuracy in repeated lexicographic tasks using an instructed strategy approach. We propose a framework to decompose attentional effort used to make a decision into four components: Orientation, Wrong Target, Duration, and Repetition. Orientation assesses attention to decision rules and the location of relevant information. Wrong Target measures wasted effort on unneeded information. Duration gauges time spent on each piece of needed information. Repetition measures the number of views on each relevant item. Greater Orientation is associated with lower effort in other components and increased accuracy. Repetition is most variable across individuals but generates the greatest improvement with practice. Duration is less affected by the other components and shows minimal improvement with experience. Finally, Wrong Target is similarly resistant to practice, but it is the only component strongly and positively associated with making errors. [ABSTRACT FROM AUTHOR]
Yu, Michael, Darton, Thomas C., and Kimmelman, Jonathan
Subjects
INFECTION prevention, INFECTION risk factors, BIOETHICS, DECISION making, INFECTION, EVALUATION of medical care, NEW product development, RISK assessment, VACCINES
Abstract
Risks and benefit evaluation for controlled human infection studies, where healthy volunteers are deliberately exposed to infectious agents to evaluate vaccine efficacy, should be explicit, systematic, thorough, and non‐arbitrary. Decision analysis promotes these qualities using four steps: (1) determining explicit criteria and measures for evaluation, (2) identifying alternatives to the study, (3) defining the models used to estimate the measures for each alternative, and (4) running the models to produce the estimates and compare the alternatives. In this paper, we describe how decision analysis might be applied by funders and regulators, as well as by others contemplating the use of novel controlled human infection studies for vaccine development and evaluation. [ABSTRACT FROM AUTHOR]
Costa, Ana Sara, Rui Figueira, José, Vieira, Carlos Rodrigues, and Vieira, Isabel Viegas
Subjects
FINANCIAL crises, TAX & expenditure limitations, ECONOMIC impact, PUBLIC welfare
Abstract
In the 2010 Toronto summit, the leaders of the G‐20 countries agreed on the implementation of urgent fiscal consolidation plans, following the expansionary policies adopted to curb the recessionary effects of the financial crisis. Unprecedented cuts in public expenditure have taken place, particularly in the European Union periphery, reviving the discussion on the optimal size of the public sector. Supporters of fiscal restraint argue that bigger governments tend to be more inefficient, while their opponents assume that government size determines the effectiveness of its performance. However, the social and economic impacts of contractionary fiscal policies ultimately depend on the level of public sector efficiency: relatively inefficient governments have more scope to consolidate without compromising social welfare. In this paper, we adopt a multiple criteria decision aiding approach, not previously employed for the assessment of complex macroeconomic performance, and employ the ELECTRE TRI‐C outranking method to categorize OECD countries on a set of criteria representing the quality of their public sectors. We then compare the obtained classifications with the share of each government's expenditure in GDP to identify distinct levels of efficiency. Our analysis suggests that various countries exhibit a margin for efficiency gains, attenuating the social and economic effects of fiscal consolidation policies. [ABSTRACT FROM AUTHOR]
Lund, Jay R., Brekke, Levi, Yu, Winston, Reed, Patrick M., Hall, Jim, Brown, Casey M., Cai, Ximing, Zagona, Edith A., Ostfeld, Avi, and Characklis, Gregory W.
Subjects
WATER supply, WATER supply research, WATER management, SUSTAINABILITY, CLIMATE change, HISTORY
Abstract
This paper presents a short history of water resources systems analysis from its beginnings in the Harvard Water Program, through its continuing evolution toward a general field of water resources systems science. Current systems analysis practice is widespread and addresses the most challenging water issues of our times, including water scarcity and drought, climate change, providing water for food and energy production, decision making amid competing objectives, and bringing economic incentives to bear on water use. The emergence of public recognition and concern for the state of water resources provides an opportune moment for the field to reorient to meet the complex, interdependent, interdisciplinary, and global nature of today's water challenges. At present, water resources systems analysis is limited by low scientific and academic visibility relative to its influence in practice and bridled by localized findings that are difficult to generalize. The evident success of water resource systems analysis in practice (which is set out in this paper) needs in future to be strengthened by substantiating the field as the science of water resources that seeks to predict the water resources variables and outcomes that are important to governments, industries, and the public the world over. Doing so promotes the scientific credibility of the field, provides understanding of the state of water resources and furnishes the basis for predicting the impacts of our water choices. [ABSTRACT FROM AUTHOR]
Borgomeo, Edoardo, Mortazavi‐Naeini, Mohammad, Hall, Jim W., and Guillod, Benoit P.
Subjects
CLIMATE change, WATER supply management, ENVIRONMENTAL management
Abstract
Risk‐based water resources planning is based on the premise that water managers should invest up to the point where the marginal benefit of risk reduction equals the marginal cost of achieving that benefit. However, this cost‐benefit approach may not guarantee robustness under uncertain future conditions, for instance under climatic changes. In this paper, we expand risk‐based decision analysis to explore possible ways of enhancing robustness in engineered water resources systems under different risk attitudes. Risk is measured as the expected annual cost of water use restrictions, while robustness is interpreted in the decision‐theoretic sense as the ability of a water resource system to maintain performance—expressed as a tolerable risk of water use restrictions—under a wide range of possible future conditions. Linking risk attitudes with robustness allows stakeholders to explicitly trade off incremental increases in robustness against investment costs for a given level of risk. We illustrate the framework through a case study of London's water supply system using state‐of‐the‐art regional climate simulations to inform the estimation of risk and robustness. [ABSTRACT FROM AUTHOR]
Yousaf, Salman, Ali, Yousaf, Sabir, Muhammad, and Masood, Muhammad Tahir
Subjects
PRODUCTION planning, MULTIPLE criteria decision making, PRODUCT management, LINEAR programming, PHYSICAL distribution of goods
Abstract
The main purpose of this paper is to improve the production planning of Pakistan Tobacco Company by selecting the most preferred brand and subsequently generating maximum profit from it. As the company produces a variety of products, the technique of multi‐criteria decision making is used to select the most preferred brand. To generate the maximum output from the preferred brand, different methods of quantitative managerial analysis are used, which include decision analysis to decide 'why and where' the manufacturing should be carried out, a transportation model to minimize the logistics cost while meeting demand, and linear programming to maximize the profit generated in 2014-2015. The result obtained from the analytic hierarchy process shows that the most preferred brand of the company with respect to price, quality, and comfort is John Player Gold Leaf. The decision analysis indicates that this brand should be manufactured in the Jhelum factory of the company, as it is more cost-effective to produce there and resources are highly available. The transportation model minimizes the logistics cost of this brand from the 2 factories while meeting the demand at each province's central warehouse. Linear programming contributes to generating a profit of 32.738 billion PKR, an amount 0.35 million PKR more than the company's current profit for the year. These results will allow the top management of the company to make corrective decisions in good time, gain a core competency in cost reduction, and make the supply chain process more efficient. [ABSTRACT FROM AUTHOR]
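The transportation step can be illustrated with a minimum-cost greedy heuristic. The factory and warehouse names, costs, supplies, and demands below are notional, not company data, and the greedy rule happens to be optimal on this instance but is not guaranteed to be in general (the full model would be solved as a linear program).

```python
# Notional transportation model: two factories shipping one brand to two
# provincial warehouses. All figures are illustrative, not company data.
cost = {("F1", "W1"): 4, ("F1", "W2"): 6,
        ("F2", "W1"): 5, ("F2", "W2"): 3}   # cost per unit on each route
supply = {"F1": 80, "F2": 70}               # factory capacities
demand = {"W1": 60, "W2": 70}               # warehouse requirements

# Minimum-cost greedy heuristic: ship as much as possible on the cheapest
# remaining route, then move to the next cheapest.
shipped, total_cost = {}, 0
for (f, w), c in sorted(cost.items(), key=lambda kv: kv[1]):
    qty = min(supply[f], demand[w])
    if qty > 0:
        shipped[(f, w)] = qty
        supply[f] -= qty
        demand[w] -= qty
        total_cost += c * qty
```

On this toy instance the cheapest route (F2 to W2) absorbs all of F2's capacity, and F1 covers the remaining demand, giving a total logistics cost of 450.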
A predicted difficult airway is sometimes considered a contra-indication to rapid sequence induction of general anaesthesia, even in an urgent case such as a category-1 caesarean section for fetal distress. However, formally assessing the risk is difficult because of the rarity and urgency of such cases. We have used decision analysis to quantify the time taken to establish anaesthesia, and probability of failure, of three possible anaesthetic methods, based on a systematic review of the literature. We considered rapid sequence induction of general anaesthesia with videolaryngoscopy, awake fibreoptic intubation and rapid spinal anaesthesia. Our results show a shorter mean (95% CI) time to induction of 100 (87-114) s using rapid sequence induction compared with 9 (7-11) min for awake fibreoptic intubation (p < 0.0001) and 6.3 (5.4-7.2) min for spinal anaesthesia (p < 0.0001). We calculate the risk of ultimate failed airway control after rapid sequence induction to be 21 (0-53) per 100,000 cases, and postulate that some mothers may accept such a risk in order to reduce potential fetal harm from an extended time interval until delivery. Although rapid sequence induction may not be the anaesthetic technique of choice for all cases in the circumstance of a category-1 caesarean section for fetal distress with a predicted difficult airway, we suggest that it is an acceptable option. [ABSTRACT FROM AUTHOR]
LINGUISTICS, GROUP decision making, DECISION making, ITERATIVE methods (Mathematics), ALGORITHMS
Abstract
This paper proposes an optimal consensus model to derive weights for linguistic preference relations (LPRs). Two indexes, an individual-to-group consensus index (ICI) and a collective consensus index (CCI), are introduced. An iterative algorithm is presented to describe the consensus reaching process. By changing the weights and modifying the pair of an individual's comparison judgments with the largest deviation from the group judgments, the consensus reaching process terminates with both the ICI and CCI controlled within predefined thresholds. The algorithm aims to preserve the decision makers' original information as much as possible. The model and algorithm are then extended to handle uncertain additive LPRs. Finally, two examples are given to show the effectiveness of the proposed methods. [ABSTRACT FROM AUTHOR]
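A consensus reaching process of this general kind can be sketched as below. The numeric encodings of the linguistic judgments, the threshold, and the halfway-adjustment rule are all assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

# Toy numeric encodings of three decision makers' judgments over three item pairs
# (illustrative stand-ins for encoded linguistic preference values).
judgments = np.array([
    [0.9, 0.2, 0.6],
    [0.5, 0.3, 0.7],
    [0.4, 0.4, 0.5],
])
weights = np.full(3, 1 / 3)   # decision makers' weights (assumed equal)
threshold = 0.15              # illustrative consensus threshold

for _ in range(100):
    group = weights @ judgments          # collective judgment per item pair
    dev = np.abs(judgments - group)      # deviation of each judgment from group
    if dev.max() <= threshold:           # ICI/CCI-style stopping test
        break
    # Modify only the single most deviant judgment, moving it halfway toward
    # the group value, to preserve as much original information as possible.
    i, j = np.unravel_index(dev.argmax(), dev.shape)
    judgments[i, j] = 0.5 * (judgments[i, j] + group[j])
```

Because each step shrinks the largest deviation while leaving all other judgments untouched, the loop reaches the threshold after only a few adjustments on this data.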
Gaudencio, Lucia Maria A. L., de Oliveira, Rui, and Curi, Wilson F.
Subjects
NATURAL gas in submerged lands, PETROLEUM industry, SUSTAINABILITY, PRODUCTION increases, BASE oils
Abstract
The increasing production of oil and gas in the marine environment and the growing participation of companies of different nationalities and sizes require the use of tools to support the sustainable management of offshore oil and gas production units. This paper presents the results of the application of a sustainability indicator system, developed from the identification of the economic, environmental, social, and operational impacts of the activities of these production units. The sustainability performances of 3 oil and gas production units operating in the Brazilian marine environment were compared to the performance of one considered ideal, through the application of the PROMETHEE II and ordinal COPELAND multicriteria methods. The indicator system applied favored the analysis of the sustainability management of offshore oil and gas production activity in a multidimensional approach, considering the points of view of experts from various areas of knowledge, and proved to be a reliable tool to support the sustainable management of these offshore production units. [ABSTRACT FROM AUTHOR]
The formal mathematical structure for decision making under uncertainty was first expressed in Savage's axioms over 60 years ago. But while the underlying normative concepts for decision making under uncertainty remain constant, the practice of applying these concepts in real‐world settings, as conducted by decision analysis (DA) specialists working with agencies and interested parties, has seen a major transformation in recent decades. The purpose of this article is to provide perspectives that characterize and interpret how DA practice for societal risk management questions has grown and is being transformed over the last 40 years. It addresses a series of themes for parsing changes in how DA has evolved toward more flexible approaches, moving beyond strict theoretical assumptions and constrained settings, and addresses multiple interested parties to provide insights rather than a single correct answer. The article clarifies the path from the initial DA formulation as a set of normative axioms, through gradual change into what is now the most flexible and least restrictive form of policy analysis. The article shows how the practice of DA for societal risks has become more attuned to a wide array of interests and perspectives, more behaviorally informed, more creative, and more informative for governance process. It addresses the following themes: the evolution in the basic orientation of DA, the increasingly important role of stakeholders in DA practice, the importance and value of key problem‐structuring techniques, and evolution in approaches for eliciting values and technical judgments. [ABSTRACT FROM AUTHOR]
Rose, Lucy E., Hemming, Victoria, Hanea, Anca M., Wintle, Brendan A., and Chee, Yung En
Subjects
SPECIES distribution, AQUATIC exercises, BIODIVERSITY conservation, WETLAND management, FORECASTING, DECISION making
Abstract
Effective biodiversity conservation requires robust and transparent prioritization of management actions. However, this is often hampered by a lack of spatially explicit data on habitat variables and empirical data on the effect of management actions. Although approaches exist that integrate structured expert elicitation (SEE) with species distribution models (SDMs) to encode species responses across habitat gradients, difficulties remain in predicting management outcomes under different settings at a region‐wide scale when key habitat covariates are not spatially explicit. Therefore, we developed an approach to integrate SDMs with SEE to capture expert understanding of the likely outcomes of management actions for individual frog species, and used this to spatially predict the effect of management actions. We demonstrate our approach across approximately 4000 wetlands in greater Melbourne, Victoria, Australia. As a measure of management effectiveness, we used the change in predicted probability of occurrence of seven frog species at wetlands 10 years after conservation actions are implemented (or not implemented). Management effect was elicited from experts under six scenarios. Individual expert estimates were aggregated using generalized linear models, which were then used to spatially predict expected management effects, and a measure of uncertainty in the prediction, at all wetlands. Predicted management effect was strongly influenced by a species' initial probability of occurrence, with enhancement of aquatic and surrounding vegetation being an effective action for most species. We discuss practical challenges and recommend solutions in the integration of SDMs and SEE for the spatial prediction of management effect. [ABSTRACT FROM AUTHOR]
Hilliard, Holly, Parnell, Gregory S., and Pohl, Edward A.
Subjects
NUCLEAR counters, DECISION making, MONTE Carlo method
Abstract
The Domestic Nuclear Detection Office (DNDO) of the Department of Homeland Security was created to increase the United States' ability to detect radiological and nuclear (RN) material that could be obtained and then used by terrorists. The office coordinates the Global Nuclear Detection Architecture (GNDA), an international and interagency strategy for detecting, analyzing, and reporting of RN materials outside of regulatory control. In 2012, the Government Accountability Office expressed concern about the prioritization of GNDA resources as well as the documentation of GNDA improvements over time. As a result, the DNDO asked the National Research Council (NRC) for advice on how to develop performance measures and metrics to quantitatively assess the GNDA's effectiveness. The result of the NRC study was a report titled "Performance Metrics for the Global Nuclear Detection Architecture." In the report, the committee created a notional strategic planning framework for evaluating the performance of the GNDA. Using the data from the public report, multiobjective decision analysis techniques, and notional data from our research, this paper expands the NRC framework to a complete value model and demonstrates that it is possible to evaluate the potential performance of the GNDA over time and use the model to evaluate the cost effectiveness of potential improvements. [ABSTRACT FROM AUTHOR]
Environmental decision-making issues in the Atchafalaya River Basin (ARB), Louisiana require innovative approaches that combine scientific understanding and local stakeholder values. Management of the ARB has evolved from strong federal control to establish the ARB as a primary floodway of the Mississippi River and Tributaries Project to a state and federal collaboration to accommodate fish and wildlife resource promotion, recreational opportunities, and economic development. The management policy has expanded to include a growing number of stakeholders, but the decision-making process has not kept pace. Current conflicts among many local stakeholder groups, due in part to their lack of involvement in the decision-making process, impede restoration efforts. The absence of a long-term collective vision for the ARB by both local stakeholder groups and management agencies further confounds these efforts. This paper proposes a process to apply a structured decision-making framework, a values-based approach that explicitly defines objectives, to promote stakeholder-driven restoration efforts in the ARB and to better prepare for and manage long-term environmental issues. The goals of this approach are: (1) to create a process founded on stakeholder values and supported by rigorous scientific assessment to meet management agency mandates and (2) to establish a transparent process for restoration planning in the ARB that incorporates current and future non-governmental stakeholders into the decision-making process. Similar frameworks have been successful in other river basins; we feel the structure of current restoration efforts in the ARB is well-suited to adopt a values-focused management framework. [ABSTRACT FROM AUTHOR]
Feral horse (Equus caballus) population management is a challenging problem around the world because populations often exhibit density‐independent growth, can exert negative ecological effects on ecosystems, and are costly to manage. However, strong value‐based connections between people and horses cause contention around management decisions. To help make informed decisions, natural resource managers might benefit from more detailed understanding of how horse management alternatives, including combinations of removals and fertility control methods, could achieve objectives of sustainable, multiple‐use ecosystems while minimizing overall horse handling and fiscal costs. Here, we describe a modeling tool that simulates horse management alternatives and estimates trade‐offs in predicted metrics related to population size, animal handling, and direct costs of management. The model considers six management actions for populations (removals for adoption or long‐term holding; fertility control treatment with three vaccines, intrauterine devices, and mare sterilization), used alone or in combination. We simulated 19 alternative management scenarios at 2‐, 3‐, and 4‐year management return intervals and identified efficiency frontiers among alternatives for trade‐offs between predicted population size and six management metrics. Our analysis identified multiple alternatives that could maintain populations within target population size ranges, but some alternatives (e.g., removal and mare sterilization, removal and GonaCon treatment) performed better at minimizing overall animal handling requirements and management costs. Cost savings increased under alternatives with more effective, longer lasting fertility control techniques over longer management intervals compared with alternatives with less‐effective, shorter lasting fertility control techniques. 
We built a user‐friendly website application, PopEquus, that decision makers and interested individuals can use to simulate management alternatives and evaluate trade‐offs among management and cost metrics. Our results and website application provide quantitative trade‐off tools for horse population management decisions and can help support value‐based management decisions for wild or feral horse populations and ecosystems at local and regional scales around the world. [ABSTRACT FROM AUTHOR]
Djulbegovic, Benjamin, Hozo, Iztok, Mayrhofer, Thomas, Ende, Jef, and Guyatt, Gordon
Subjects
DECISION making, DISEASES, MATHEMATICAL models, PROGNOSIS, EVIDENCE-based medicine, DECISION making in clinical medicine, THEORY, SYMPTOMS, TREATMENT effectiveness
Abstract
Background: The threshold model represents one of the most significant advances in the field of medical decision‐making, yet it often does not apply to the most common class of clinical problems, which include health outcomes as part of the definition of disease. In addition, the original threshold model did not take a decision‐maker's values and preferences explicitly into account. Methods: We reformulated the threshold model by (1) applying it to those clinical scenarios that define disease according to outcomes that treatment is designed to affect, and (2) taking into account a decision‐maker's values. Results: We showed that when outcomes (eg, morbidity) are an integral part of the definition of disease, the classic threshold model does not apply (as this leads to double counting of outcomes in the probabilities and utilities branches of the model). To avoid double counting, the model can be appropriately analysed by assuming diagnosis is certain (P = 1). This results in deriving a different threshold—the threshold for outcome of disease (Mt) instead of threshold for probability of disease (Pt) above which benefits of treatment outweigh its harms. We found that Mt ≤ Pt, which may explain differences between normative models and actual behaviour in practice. When a decision‐maker values outcomes related to benefit and harms differently, the new threshold model generates decision thresholds that could be descriptively more accurate. Conclusions: Calculation of the threshold depends on careful disease versus utility definitions and a decision‐maker's values and preferences. [ABSTRACT FROM AUTHOR]
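For readers unfamiliar with the threshold model, the classic treatment threshold Pt referenced above has a simple closed form. This is the standard textbook (Pauker–Kassirer) formulation, not necessarily the authors' exact notation:

```latex
% Classic treatment threshold: treat when the probability of disease P
% exceeds P_t, where B is the net benefit of treating a diseased patient
% and H is the net harm of treating a patient without the disease.
P_t = \frac{H}{B + H}
```

Intuitively, the greater the harm of unnecessary treatment relative to its benefit, the more diagnostic certainty is required before treating.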
Dealing with weather extremes is a major challenge for farmers and often comes at high costs for public budgets. Therefore, we investigate the influence of specific simplified decision rules, so‐called heuristics, on farmers' willingness to pay (WTP) for protecting themselves against low‐probability and high‐consequence weather shocks. To this end, we conducted a framed field experiment with 237 farmers in Germany, using incentivized lottery‐based multiple price lists. We explored the effects of different heuristics within the prospect theory framework. Our results indicate that, on average, farmers exhibit risk‐loving behavior towards monetary losses, leading to a low WTP for risk mitigation. The results also suggest that the imitation heuristic, shock experience heuristics, and the threshold of concern heuristic influence farmers' WTP. Farmers specifically imitate successful farmers when these are risk‐loving. The lack of personal experience with low‐probability events induces farmers to assign less weight to low‐probability shocks, which lowers their WTP. Farmers also systematically assign less weight to low‐probability shocks that they consider "too rare to be concerned about." Accounting for the use of these heuristics can help design improved risk management instruments and policies. [ABSTRACT FROM AUTHOR]
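The probability‐weighting behaviour discussed above comes from prospect theory. A minimal sketch of the standard one‐parameter weighting function (Tversky and Kahneman's 1992 form; the parameter value below is their published median estimate, not one fitted in this study):

```python
def weight(p: float, gamma: float = 0.61) -> float:
    """Tversky-Kahneman (1992) one-parameter probability weighting
    function. gamma = 0.61 is their median estimate for gains; the
    abstract does not report fitted parameters, so this value is
    purely illustrative."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

# With gamma < 1 the function overweights rare events relative to p;
# the heuristics studied in the paper (e.g., "too rare to be concerned
# about") push farmers' effective weights in the opposite direction.
for p in (0.01, 0.10, 0.50, 0.90):
    print(f"p={p:.2f} -> w(p)={weight(p):.3f}")
```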
Siebert, Johannes Ulrich, Becker, Maxi, and Oeser, Nadine
Subjects
HIGH school students, SELF-efficacy, DECISION making
Abstract
At the end of high school, teenagers must deal with the first life‐changing decision of determining what to do after graduation. For these decisions, adolescents need to be able to make good choices. However, most schools have not yet implemented decision trainings into their curricula. A new intervention called "KLUGentscheiden!" was developed to train complex decision‐making in high school students to close this gap. The intervention targets three key components of good decision‐making: envisioning one's objectives, identifying relevant alternatives, and comparing the identified alternatives by a weighted evaluation. We assumed that successfully training those decision‐analytical steps should enhance self‐perceived proactive decision‐making skills. In addition, the training should also enhance self‐assessed career choice self‐efficacy. The intervention was evaluated in a pseudorandomized control study including 193 high school students. Compared to a control group, the intervention group significantly increased proactive decision‐making skills and career choice self‐efficacy. Although different long‐term evaluations are still pending, the KLUGentscheiden! intervention provides an important tool to train complex decision‐making in high‐school students. It also has the potential to apply to other career choices of young individuals, such as choosing majors, a final thesis, a job, or a field of work. [ABSTRACT FROM AUTHOR]
This black box study assessed the performance of forensic firearms examiners in the United States. It involved three different types of firearms and 173 volunteers who performed a total of 8640 comparisons of both bullets and cartridge cases. The overall false‐positive error rate was estimated as 0.656% and 0.933% for bullets and cartridge cases, respectively, while the rate of false negatives was estimated as 2.87% and 1.87% for bullets and cartridge cases, respectively. The majority of errors were made by a limited number of examiners. Because chi‐square tests of independence strongly suggest that error probabilities are not the same for each examiner, these are maximum‐likelihood estimates based on the beta‐binomial probability model and do not depend on an assumption of equal examiner‐specific error rates. Corresponding 95% confidence intervals are (0.305%, 1.42%) and (0.548%, 1.57%) for false positives for bullets and cartridge cases, respectively, and (1.89%, 4.26%) and (1.16%, 2.99%) for false negatives for bullets and cartridge cases, respectively. The results of this study are consistent with prior studies, despite its comprehensive design and challenging specimens. [ABSTRACT FROM AUTHOR]
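The beta‐binomial estimation described above can be sketched in a few lines: each examiner's error count is binomial with an examiner‐specific rate drawn from a beta distribution, and the beta parameters are fit by maximum likelihood. The data below are simulated for illustration; they are not the study's counts.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import betaln

def beta_binom_negll(params, k, n):
    """Negative log-likelihood of beta-binomial error counts k out of n
    (one (k_i, n_i) pair per examiner), omitting the constant binomial
    coefficient term. Parameters are optimized on the log scale so the
    beta parameters stay positive."""
    a, b = np.exp(params)
    return -np.sum(betaln(k + a, n - k + b) - betaln(a, b))

# Simulated per-examiner false-positive data (NOT the study's counts):
rng = np.random.default_rng(1)
n = rng.integers(20, 60, size=40)      # comparisons per examiner
p = rng.beta(0.5, 75.0, size=40)       # heterogeneous examiner error rates
k = rng.binomial(n, p)                 # observed errors

res = minimize(beta_binom_negll, x0=[0.0, 3.0], args=(k, n), method="Nelder-Mead")
a_hat, b_hat = np.exp(res.x)
print(f"estimated overall error rate: {a_hat / (a_hat + b_hat):.4f}")
```

Allowing examiner‐specific rates this way is what frees the estimate from the equal‐error‐rate assumption the chi‐square tests reject.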
An emerging risk is characterized by scant published data, rapidly changing information, and an absence of existing models that can be directly used for prediction. Analysis may be further complicated by quickly evolving decision‐maker priorities and the potential need to make decisions quickly as new information comes available. To provide a forum to discuss these challenges, a virtual conference, "Decision Making for Emerging Risks," was held on June 22–23, 2021, sponsored jointly by the Decision Analysis Society of the Institute for Operations Research and the Management Sciences and the Decision Analysis and Risk specialty group in the Society for Risk Analysis. Speakers reflected on the work to support decision‐makers related to the COVID‐19 pandemic as well as experiences in emerging risks across domains from cybersecurity, infrastructure, transportation, energy, food safety, national security, and climate change. Here, we distill the key findings to propose a set of best practice principles for a "decision‐first" approach for emerging risks. These discussions underscore the importance of scoping the decision context and the shared responsibility for the development and implementation of the analysis between the analyst and the decision‐maker when the context can evolve rapidly. Emerging risks may also favor simpler analytical approaches that increase transparency, ease of explanation, and ability to conduct new analyses quickly. Continued dialogue by the decision and risk analysis communities on the use and development of models for emerging risks will enhance the credibility and usefulness of these approaches. [ABSTRACT FROM AUTHOR]
Robinson, Alexander, Keller, L. Robin, and del Campo, Cristina
Subjects
GROUP problem solving, FALSE positive error, ANALYTICAL skills, COVID-19 pandemic, TEACHING methods, DECISION making
Abstract
COVID‐19 pandemic policies requiring disease testing provide a rich context to build insights on true positives versus false positives. Our main contribution to the pedagogy of data analytics and statistics is to propose a method for teaching updating of probabilities using Bayes' rule reasoning to build understanding that true positives and false positives depend on the prior probability. Our instructional approach has three parts. First, we show how to construct and interpret raw frequency data tables, instead of using probabilities. Second, we use dynamic visual displays to develop insights and help overcome calculation avoidance or errors. Third, we look at graphs of positive predictive values and negative predictive values for different priors. The learning activities we use include lectures, in‐class discussions and exercises, breakout group problem solving sessions, and homework. Our research offers teaching methods to help students understand that the veracity of test results depends on the prior probability as well as helps students develop fundamental skills in understanding probabilistic uncertainty alongside higher‐level analytical and evaluative skills. Beyond learning to update the probability of having the disease given a positive test result, our material covers naïve estimates of the positive predictive value, the common mistake of ignoring the disease's base rate, debating the relative harm from a false positive versus a false negative, and creating a new disease test. [ABSTRACT FROM AUTHOR]
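The frequency‐framing approach the authors describe can be made concrete with a short sketch: push a hypothetical cohort through a test and read PPV and NPV off the resulting counts. Sensitivity, specificity, and the priors below are illustrative, not taken from the article.

```python
def ppv_npv(prior: float, sensitivity: float, specificity: float):
    """Positive and negative predictive values via Bayes' rule, framed
    as raw counts in a hypothetical cohort of 100,000 people (the
    frequency-table framing described in the abstract)."""
    n = 100_000
    diseased = prior * n
    healthy = n - diseased
    true_pos = sensitivity * diseased
    false_pos = (1 - specificity) * healthy
    true_neg = specificity * healthy
    false_neg = (1 - sensitivity) * diseased
    return true_pos / (true_pos + false_pos), true_neg / (true_neg + false_neg)

# The same test yields very different PPVs at different priors:
for prior in (0.001, 0.01, 0.1):
    ppv, npv = ppv_npv(prior, sensitivity=0.95, specificity=0.98)
    print(f"prior={prior:.3f}  PPV={ppv:.3f}  NPV={npv:.4f}")
```

At a 0.1% prior, most positives are false positives even with a highly specific test, which is exactly the base‐rate insight the teaching materials target.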
Pepin, Kim M., Davis, Amy J., Epanchin‐Niell, Rebecca S., Gormley, Andrew M., Moore, Joslin L., Smyser, Timothy J., Shaffer, H. Bradley, Kendall, William L., Shea, Katriona, Runge, Michael C., and McKee, Sophie
Dispersal drives invasion dynamics of nonnative species and pathogens. Applying knowledge of dispersal to optimize the management of invasions can mean the difference between a failed and a successful control program and dramatically improve the return on investment of control efforts. A common approach to identifying optimal management solutions for invasions is to optimize dynamic spatial models that incorporate dispersal. Optimizing these spatial models can be very challenging because the interaction of time, space, and uncertainty rapidly amplifies the number of dimensions being considered. Addressing such problems requires advances in and the integration of techniques from multiple fields, including ecology, decision analysis, bioeconomics, natural resource management, and optimization. By synthesizing recent advances from these diverse fields, we provide a workflow for applying ecological theory to advance optimal management science and highlight priorities for optimizing the control of invasions. One of the striking gaps we identify is the extremely limited consideration of dispersal uncertainty in optimal management frameworks, even though dispersal estimates are highly uncertain and greatly influence invasion outcomes. In addition, optimization frameworks rarely consider multiple types of uncertainty (we describe five major types) and their interrelationships. Thus, feedbacks from management or other sources that could magnify uncertainty in dispersal are rarely considered. Incorporating uncertainty is crucial for improving transparency in decision risks and identifying optimal management strategies. We discuss gaps and solutions to the challenges of optimization using dynamic spatial models to increase the practical application of these important tools and improve the consistency and robustness of management recommendations for invasions. [ABSTRACT FROM AUTHOR]
Resource managers frequently are tasked with mitigating or reversing adverse effects of invasive species through management policies and actions. In Lake Superior, of the Laurentian Great Lakes, invasive sea lamprey populations are suppressed to protect valuable fish stocks. However, the relationship between choice of long‐term control strategy and the future chance of achieving the suppression target is unclear. Using a 60+ year time series of suppression effort and monitoring data from 50 assessment sites located on Lake Superior tributaries, we developed a Bayesian state‐space model to forecast the probability of suppressing lamprey below the suppression target. With annual application of lampricide (i.e. lamprey‐specific pesticide) at historical mean levels, we forecasted a 15% chance of achieving the Lake Superior sea lamprey suppression target in 2040. Increasing lampricide effort and/or supplementing lampricide control with age‐1 recruitment reduction increased suppression chance. Annual application of the maximum historical lampricide effort resulted in a 50% predicted chance of achieving the target, annual application of the mean historic lampricide effort plus a 40% reduction in recruitment resulted in a 54% chance, and the maximum amount of effort considered (maximum historic lampricide and 60% reduction in recruitment) resulted in a 94% chance. Policy implications. We developed a simulation model from a robust, long‐term monitoring dataset that improves understanding of why long‐term sea lamprey suppression objectives have been difficult to achieve in Lake Superior. Furthermore, the model provides a means to gauge efficacy of sea lamprey control policy and action scenarios based on forecasted chance of achieving the suppression target. 
Creating processes for iteratively refining our forecasting model with stakeholder and technical‐expert input and integration with a decision analysis framework could strengthen the link between ecological knowledge obtained from long‐term monitoring and invasive sea lamprey management. [ABSTRACT FROM AUTHOR]
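The kind of forecast reported above, the probability of ending below a suppression target under a given control effort, can be illustrated with a toy Monte Carlo simulation. This stands in for, and greatly simplifies, the authors' Bayesian state‐space model; every parameter here is invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def suppression_chance(effort: float, n_sim: int = 5000, years: int = 20,
                       target: float = 50.0) -> float:
    """Toy Monte Carlo analogue of forecasting the chance that a pest
    population ends below a suppression target. A stochastic Ricker
    model with an effort-dependent kill rate stands in for the paper's
    Bayesian state-space model (all parameters are made up)."""
    N = np.full(n_sim, 200.0)                   # initial abundance
    for _ in range(years):
        survival = np.exp(-0.8 * effort)        # lampricide-style mortality
        growth = np.exp(1.0 * (1 - N / 300.0))  # density-dependent growth
        noise = rng.lognormal(mean=0.0, sigma=0.3, size=n_sim)
        N = N * survival * growth * noise
    return float(np.mean(N < target))

for effort in (0.5, 1.0, 2.0):
    print(f"effort={effort}: P(below target) = {suppression_chance(effort):.2f}")
```

Even in this toy version, the qualitative pattern in the abstract emerges: suppression chance rises steeply with sustained control effort.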
Lawson, Abigail J., Kalasz, Kevin, Runge, Michael C., Schwarzer, Amy C., Stantial, Michelle L., Woodrey, Mark, and Lyons, James E.
Subjects
BLACK people, NATURAL resources management, CONCEPTUAL models
Abstract
Natural resource management decisions are often made in the face of uncertainty. The question for the decision maker is whether the uncertainty is an impediment to the decision and, if so, whether it is worth reducing uncertainty before or while implementing actions. Value of information (VoI) methods are decision analytical tools to evaluate the benefit to the decision maker of resolving uncertainty. These methods, however, require quantitative predictions of the outcomes as a function of management alternatives and uncertainty; such predictions may not be available at early stages of decision prototyping. Here we describe the first participatory application of a new qualitative approach to VoI in an adaptive management workshop for Atlantic Coast eastern black rail populations. The eastern black rail is a small, cryptic marsh bird that was recently listed as federally threatened, with extremely little demographic data available. Workshop participants developed conceptual models and nine hypotheses related to the effects of habitat management alternatives on black rail demography. Here, we describe the qualitative VoI framework, how it was implemented in the workshop, and the analysis outcomes, and describe the benefits of qualitative VoI in the context of adaptive management and co‐production of conservation science. [ABSTRACT FROM AUTHOR]
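Quantitative VoI, which the qualitative approach described above approximates, reduces to a simple computation once a payoff table exists: the expected value of perfect information (EVPI) is the gap between deciding after uncertainty is resolved and deciding now. A sketch with illustrative numbers (not values elicited in the black rail workshop):

```python
import numpy as np

# Expected management outcome for each action under each competing
# hypothesis, with prior belief weights. All numbers are illustrative.
payoff = np.array([
    #  H1    H2    H3
    [0.90, 0.40, 0.20],   # e.g., manage water levels
    [0.30, 0.85, 0.25],   # e.g., prescribed fire
    [0.50, 0.45, 0.60],   # e.g., vegetation management
])
prior = np.array([0.5, 0.3, 0.2])   # belief weight on each hypothesis

ev_now = (payoff @ prior).max()                  # best action under current uncertainty
ev_perfect = (payoff.max(axis=0) * prior).sum()  # best action per hypothesis, averaged
evpi = ev_perfect - ev_now
print(f"EV now = {ev_now:.3f}, EV with perfect info = {ev_perfect:.3f}, EVPI = {evpi:.3f}")
```

If EVPI is small relative to the cost of research, acting now beats delaying for more data, which is the core question VoI puts to the decision maker.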
Kivunike, Florence Nameere, Ekenberg, Love, Danielson, Mats, and Tusubira, F. F.
Abstract
Using a case from the healthcare delivery sector, we demonstrate how a structured evaluation approach can facilitate the measurement of actual ICT contributions in various contexts. Typically, such measurements are intricate due to the complexities inherent in these environments, making it difficult to evaluate, to a reasonable degree, the relationship between ICT and the benefits it is intended to achieve. The approach suggested in this paper tries to partly remedy some of these complications by facilitating qualitative data elicitation, aggregation, analysis and evaluation. To make this computationally meaningful, a decision support tool for handling numerically imprecise information is used for the data analysis and evaluation. The results indicate that such an approach provides meaningful input for practitioners and policymakers. In comparison to in‐depth qualitative approaches, this approach facilitates a one‐point‐in‐time assessment, which is less resource intensive but provides prompt and substantial insight into the development performance of ICT4D initiatives. A similar approach would also be applicable to different sectors, and can utilize a broader scope of criteria, as well as incorporate views from several categories of stakeholders. [ABSTRACT FROM AUTHOR]
Kuula, Laura S.M., Backman, Janne T., and Blom, Marja L.
Subjects
MEDICAL care costs, DRUG side effects, MEDICAL prescriptions, CLOSTRIDIOIDES difficile, PHARMACEUTICAL services insurance, AORTIC rupture
Abstract
The aim of this study was to estimate healthcare costs and mortality associated with serious fluoroquinolone‐related adverse reactions in Finland from 2008 to 2019. Serious adverse reaction types were identified from the Finnish Pharmaceutical Insurance Pool's pharmaceutical injury claims and the Finnish Medicines Agency's Adverse Reaction Register. A decision tree model was built to predict costs and mortality associated with serious adverse drug reactions (ADRs). Severe Clostridioides difficile infections, severe cutaneous adverse reactions, tendon ruptures, aortic ruptures, and liver injuries were included as serious adverse drug reactions in the model. Direct healthcare costs of a serious ADR were based on the number of reimbursed fluoroquinolone prescriptions from the Social Insurance Institution of Finland's database. Sensitivity analyses were conducted to address parameter uncertainty. A total of 1 831 537 fluoroquinolone prescriptions were filled between 2008 and 2019 in Finland, with prescription numbers declining 40% in recent years. Serious ADRs associated with fluoroquinolones led to estimated direct healthcare costs of 501 938 402 €, including 11 405 ADRs and 3884 deaths between 2008 and 2019. The average mortality risk associated with the use of fluoroquinolones was 0.21%. Severe Clostridioides difficile infections were the most frequent, fatal, and costly serious ADRs associated with the use of fluoroquinolones. Although fluoroquinolones continue to be generally well‐tolerated antimicrobials, serious adverse reactions cause long‐term impairment to patients and high healthcare costs. Therefore, the risks and benefits should be weighed carefully in antibiotic prescription policies, as well as with individual patients. [ABSTRACT FROM AUTHOR]
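A decision‐tree cost model of this kind ultimately rolls back to an expected cost per prescription: each serious ADR branch contributes probability times cost. A minimal sketch with placeholder probabilities and costs (only the prescription count is taken from the abstract):

```python
# Minimal expected-cost roll-back of a decision-tree branch: each serious
# ADR type has a per-prescription probability and a treatment cost.
# All probabilities and costs below are placeholders, not the study's inputs.
adr_branches = {
    "C. difficile infection": (0.0030, 12_000),
    "cutaneous reaction":     (0.0005, 20_000),
    "tendon rupture":         (0.0015,  9_000),
    "aortic rupture":         (0.0002, 45_000),
    "liver injury":           (0.0010, 15_000),
}

expected_cost_per_rx = sum(p * c for p, c in adr_branches.values())
n_prescriptions = 1_831_537   # filled prescriptions 2008-2019 (from the abstract)
print(f"expected ADR cost per prescription: {expected_cost_per_rx:.2f} EUR")
print(f"modelled total: {expected_cost_per_rx * n_prescriptions:,.0f} EUR")
```

The study's sensitivity analyses correspond to varying these branch probabilities and costs and observing how the total shifts.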
Hemming, Victoria, Camaclang, Abbey E., Adams, Megan S., Burgman, Mark, Carbeck, Katherine, Carwardine, Josie, Chadès, Iadine, Chalifour, Lia, Converse, Sarah J., Davidson, Lindsay N. K., Garrard, Georgia E., Finn, Riley, Fleri, Jesse R., Huard, Jacqueline, Mayfield, Helen J., Madden, Eve McDonald, Naujokaitis‐Lewis, Ilona, Possingham, Hugh P., Rumpff, Libby, and Runge, Michael C.
Subjects
ADAPTIVE natural resource management, ECOSYSTEM services, DECISION theory, CONSERVATION of natural resources, DECISION making, SOCIAL scientists, SCIENTIFIC literature, BIODIVERSITY
Abstract
They can be vitally important to the decision, and it is necessary to consider them alongside more easily quantified objectives, such as species abundance and cost.
Scarce resources
Resources (e.g., time, staff capacity, money, space) available for conservation are often limited, requiring consideration of how to best allocate resources to achieve objectives.
Complex alternatives
In complex ecological decisions, the range of possible alternative actions is often very large and multifaceted.
Irreversible consequences and tipping points
Conservation decisions sometimes involve tipping points between different system states or irreversible outcomes to be avoided. In our experience, decision analysis and the broader field of decision science provide a useful lens through which to address conservation problems, a greater awareness of the role of values and science in these decisions, and a process for identifying alternatives that are more likely to achieve multiple important values in a timely manner.
APPLYING DECISION ANALYSIS
Improving the chance of good outcomes for difficult conservation decisions (challenges listed in Table 1) arises from first knowing how to think through decisions with the foundational concepts of decision theory (Keeney, 2004; Raiffa, 2002; Smith, 2020a). Very few, typically the most complex decisions (~50 [0.5%]), will require a full decision analysis and would benefit from more time and resources. Decisions for questions like these are difficult because they involve multiple value judgments, considerable uncertainty, potentially irreversible consequences, and other challenging characteristics common to conservation decisions (Table 1). [Extracted from the article]
Alvarez, Gustavo B., de Almeida, Rafael G., Hernández, Cecilia T., and de Sousa, Patrícia A. P.
Subjects
ANALYTIC hierarchy process, MATHEMATICAL analysis, LINEAR systems, LINEAR equations, DECISION making, SENSITIVITY analysis
Abstract
The Analytic Hierarchy Process (AHP) is a decision making method, which has as its greatest criticism the rank reversal effect. Here a new mathematical analysis of this method is performed, and three new results are highlighted. First, the method is formulated as a linear system of equations, where it is possible to assign a geometric interpretation, determine the number of possible solutions, and perform a sensitivity analysis based on the condition number of the matrix. Second, the causes of rank reversal can be encompassed by two mathematical aspects related to the properties of the matrix of AHP: a high condition number and rank deficiency. When the matrix is rank deficient, it is possible to obtain a condensed formulation of the AHP with a new full rank matrix. This guarantees greater stability to the method. Third, some mathematical results can be used as a robustness test for the matrix of AHP. [ABSTRACT FROM AUTHOR]
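The classical AHP computation alongside the paper's conditioning diagnostic can be sketched as follows: priorities come from the principal eigenvector of the pairwise comparison matrix, and the condition number flags ill‐conditioning. The matrix below is an illustrative example, not one from the paper.

```python
import numpy as np

# Illustrative 3-criteria pairwise comparison matrix (Saaty 1-9 scale);
# not an example taken from the paper.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# Classical AHP priorities: normalized principal right eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# The diagnostic the paper proposes: a high condition number marks an
# ill-conditioned matrix, one of the two causes of rank reversal it
# identifies (the other being rank deficiency).
cond = np.linalg.cond(A)
print("priorities:", np.round(w, 3))
print(f"condition number: {cond:.2f}")
```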
Moore, Jennifer F., Udell, Bradley J., Martin, Julien, Turikunkiko, Ezechiel, and Masozera, Michel K.
Subjects
POACHING, LAW enforcement, INTEGER programming, DECISION making, MATHEMATICAL optimization, LINEAR programming
Abstract
Poaching is a global problem causing the decline of species worldwide. Optimizing the efficiency of ranger patrols to deter poaching activity at the lowest possible cost is crucial for protecting species with limited resources. We applied decision analysis and spatial optimization algorithms to allocate efforts of ranger patrols throughout a national park. Our objective was to mitigate poaching activity at or below management risk targets for the lowest monetary cost. We examined this trade‐off by constructing a Pareto efficiency frontier using integer linear programming. We used data from a ranger‐based monitoring program in Nyungwe National Park, Rwanda. Our measure of poaching risk is based on dynamic occupancy models that account for imperfect detection of poaching activities. We found that in order to achieve a 5% reduction in poaching risk, 622 ranger patrol events (each corresponding to patrolling 1‐km2 sites) were needed within a year at a cost of US$49,760. In order to attain a 60% reduction in poaching risk, 15,560 patrol events were needed at a cost of US$1,244,800. We evaluated the trade‐off between patrol cost and poaching risk based on our model by constructing a Pareto efficiency frontier and park managers found the solution for a 50% risk reduction to be a practical trade‐off based on funding constraints (comparable to recent years) and the diminishing returns between risk mitigation and cost. This expected reduction in risk required 8,558 patrol events per year at a cost of US$684,640. Our results suggest that optimal solutions could increase efficiency compared to the actual effort allocations from 2006 to 2016 in Nyungwe National Park (e.g., risk reductions of ~30% under recent budgets compared to ~50% reduction in risk under the optimal strategy). 
The modeling framework in this study took into account imperfect detection of poaching risk as well as the directional and conditional nature of ranger patrol events given the spatial adjacency relationships of neighboring sites and access points. Our analyses can help to improve the efficiency of ranger patrols, and the modeling framework can be broadly applied to other spatial conservation planning problems with conditional, multilevel, site selection. [ABSTRACT FROM AUTHOR]
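The Pareto efficiency frontier idea used above is easy to illustrate at toy scale: enumerate candidate patrol plans, then keep only those that no other plan dominates on both risk reduction and cost. The study solves this with integer linear programming over thousands of sites; the brute‐force sketch below uses six invented site bundles.

```python
from itertools import combinations

# Toy candidate patrol bundles: (poaching risk reduction %, cost in $1000s).
# Invented numbers; the study optimizes over ~1-km2 sites with ILP.
bundles = [(5, 50), (8, 90), (12, 160), (6, 55), (10, 120), (4, 35)]

# Every possible patrol plan is a subset of bundles.
plans = []
for r in range(len(bundles) + 1):
    for combo in combinations(bundles, r):
        plans.append((sum(b[0] for b in combo), sum(b[1] for b in combo)))

# A plan is dominated if another plan achieves at least as much risk
# reduction for strictly less money, or strictly more for no more money.
def dominated(s):
    return any(
        (o[0] >= s[0] and o[1] < s[1]) or (o[0] > s[0] and o[1] <= s[1])
        for o in plans
    )

frontier = sorted(set(s for s in plans if not dominated(s)))
print(frontier)
```

Reading the frontier left to right reproduces the diminishing‐returns pattern the managers weighed: each extra percent of risk reduction costs progressively more.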
Forner, David, Hoit, Graeme, Noel, Christopher W., Eskander, Antoine, de Almeida, John R., Rigby, Matthew H., and Naimark, David
Abstract
Decision making in health care is complex, and substantial uncertainty can be involved. Structured, systematic approaches to the integration of available evidence, assessment of uncertainty, and determination of choice are of significant benefit in an era of "value-based care." This is especially true for otolaryngology-head and neck surgery, where technological advancements are frequent and applicable to an array of subspecialties. Decision analysis aims to achieve these goals through various modeling techniques, including (1) decision trees, (2) Markov processes, (3) microsimulation, and (4) discrete event simulation. While decision models have been used for decades, many clinicians and researchers continue to have difficulty deciphering them. In this review, we present an overview of various decision analysis modeling techniques, their purposes, how they can be interpreted, and commonly used syntax to promote understanding and use of these approaches. Throughout, we provide a sample research question to facilitate discussion of the advantages and disadvantages of each technique. [ABSTRACT FROM AUTHOR]
Camaclang, Abbey E., Currie, Jessica, Giles, Emily, Forbes, Graham J., Edge, Christopher B., Monk, Wendy A., Nocera, Joseph J., Stewart‐Robertson, Graeme, Browne, Constance, O'Malley, Zoe G., Snider, James, and Martin, Tara G.
The need to manage threats to biodiversity, and to do so cost‐effectively, is urgent. Cross‐realm conservation management is recognized as a cost‐effective approach, but it requires collaboration between agencies and jurisdictions, and local knowledge of anthropogenic threats to biodiversity. With its emphasis on stakeholder engagement and use of structured expert elicitation, Priority Threat Management (PTM) facilitates rapid, cross‐realm planning at the regional scale. We used PTM to identify cost‐effective management strategies with the aim of securing nine ecological groups, comprised of 45 species and one ecological community of conservation concern, across terrestrial and freshwater realms within the Wolastoq|Saint John River watershed in Canada. Under business‐as‐usual, four of nine groups are expected to have >50% probability of persistence over the next 25 years. Investment of $141 million over 25 years in three management strategies could secure seven groups across both realms with >50% probability of persistence. Achieving higher levels of persistence comes at a cost—securing six groups with >60% probability of persistence requires investing $218 million over 25 years in seven strategies. Through a structured, iterative process, whereby stakeholders cooperate to clarify objectives, devise management strategies, and collate data, PTM can support timely and cost‐effective management across multiple realms. [ABSTRACT FROM AUTHOR]
HEMOPHILIA, PRODUCT costing, DECISION making, COST, PREVENTIVE medicine
Abstract
Introduction: Increased usage of emicizumab in the United States will affect standard half‐life (SHL) and extended half‐life (EHL) product usage and cost. Aim: To model the usage and cost of SHL and EHL products and emicizumab to treat haemophilia A (HA) in the 13 Western States Region IX haemophilia treatment centres (HTCs) (California, Nevada, Hawaii and Guam). Methods: We modelled product usage and cost using decision analysis methods. Model variables included epidemiology/demographics, treatment and product cost. Data were from the US Western States Region IX, US Centers for Disease Control and Prevention, American Thrombosis and Hemostasis Network and the literature. Results: Prior to EHL products and emicizumab, usage of SHL products was ~300 million international units (IUs), or 6.8 IUs/capita, at a cost of $430 million. With the uptake of EHL products and emicizumab, the estimated 2025 usage was 270 million IUs of factor (SHL and EHL; 5.4 IUs/capita) and 1,993 grams of emicizumab (40 micrograms/capita), at a cost of $532 million. As the number of HA patients in the region increases by 59%, factor usage increases by 20%, emicizumab usage increases by 26%, and cost increases to $650 million. Conclusion: The entrance of emicizumab into the market may radically change the use of SHL and EHL products. Our model suggests that emicizumab use will likely increase total product costs. While our estimates are most useful for the United States, the effect of emicizumab on factor use will likely be similar in other parts of the world. [ABSTRACT FROM AUTHOR]
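The projection described in this abstract reduces to two multiplications per product: usage = patients × mean annual consumption, and cost = usage × unit price, summed over products and scenarios. A hypothetical sketch of that structure (the patient counts, doses, and prices below are illustrative placeholders, not the study's Region IX inputs):

```python
# Hypothetical product usage/cost projection in the style of the abstract.
# usage = patients * mean annual consumption; cost = usage * unit price.
# All inputs are illustrative placeholders, not the study's data.

def project(patients, per_patient_use, unit_price):
    usage = patients * per_patient_use
    return usage, usage * unit_price

# Baseline scenario: all patients on SHL factor (usage in IUs, price in $/IU).
shl_usage, shl_cost = project(patients=1_000, per_patient_use=300_000,
                              unit_price=1.40)

# Uptake scenario: 40% of patients switch to emicizumab (grams, $/gram),
# the remainder stay on factor.
emi_usage, emi_cost = project(patients=400, per_patient_use=1.4,
                              unit_price=110_000)
factor_usage, factor_cost = project(patients=600, per_patient_use=300_000,
                                    unit_price=1.40)

total_uptake_cost = emi_cost + factor_cost
print(f"baseline cost: ${shl_cost:,.0f}; uptake cost: ${total_uptake_cost:,.0f}")
```

Sensitivity analysis in such a model amounts to re-running the projection over ranges of these inputs (prevalence growth, uptake share, unit prices).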
RISK assessment, PROCESS control systems, DECISION making, ARTIFICIAL intelligence, INTELLIGENT agents
Abstract
Decision analysis and risk analysis have grown up around a set of organizing questions: what might go wrong, how likely is it to do so, how bad might the consequences be, what should be done to maximize expected utility and minimize expected loss or regret, and how large are the remaining risks? In probabilistic causal models capable of representing unpredictable and novel events, probabilities for what will happen, and even what is possible, cannot necessarily be determined in advance. Standard decision and risk analysis questions become inherently unanswerable ("undecidable") for realistically complex causal systems with "open‐world" uncertainties about what exists, what can happen, what other agents know, and how they will act. Recent artificial intelligence (AI) techniques enable agents (e.g., robots, drone swarms, and automatic controllers) to learn, plan, and act effectively despite open‐world uncertainties in a host of practical applications, from robotics and autonomous vehicles to industrial engineering, transportation and logistics automation, and industrial process control. This article offers an AI/machine learning perspective on recent ideas for making decision and risk analysis (even) more useful. It reviews undecidability results and recent principles and methods for enabling intelligent agents to learn what works and how to complete useful tasks, adjust plans as needed, and achieve multiple goals safely and reasonably efficiently when possible, despite open‐world uncertainties and unpredictable events. In the near future, these principles could contribute to the formulation and effective implementation of more effective plans and policies in business, regulation, and public policy, as well as in engineering, disaster management, and military and civil defense operations. 
These principles can extend traditional decision and risk analysis to deal more successfully with open‐world novelty and unpredictable events in large‐scale real‐world planning, policymaking, and risk management. [ABSTRACT FROM AUTHOR]
OVERTREATMENT of cancer, CERVICAL cancer, EARLY detection of cancer, INAPPROPRIATE prescribing (Medicine), GENITAL warts, MARKOV processes
Abstract
A general concern exists that cervical cancer screening using human papillomavirus (HPV) testing may lead to considerable overtreatment. We evaluated the trade‐off between benefits and overtreatment among different screening strategies differing by primary tests (cytology, p16/Ki‐67, HPV alone or in combinations), interval, age and diagnostic follow‐up algorithms. A Markov state‐transition model calibrated to the Austrian epidemiological context was used to predict cervical cancer cases, deaths, overtreatments and incremental harm–benefit ratios (IHBR) for each strategy. When considering the same screening interval, HPV‐based screening strategies were more effective than cytology or p16/Ki‐67 testing (e.g., relative reduction in cervical cancer with biennial screening: 67.7% for HPV + Pap cotesting, 57.3% for cytology and 65.5% for p16/Ki‐67), but were associated with increased overtreatment (e.g., 19.8% more conizations with biennial HPV + Pap cotesting vs. biennial cytology). The IHBRs measured in unnecessary conizations per additional prevented cancer‐related death were 31 (quinquennial Pap + p16/Ki‐67‐triage), 49 (triennial Pap + p16/Ki‐67‐triage), 58 (triennial HPV + Pap cotesting), 66 (biennial HPV + Pap cotesting), 189 (annual Pap + p16/Ki‐67‐triage) and 401 (annual p16/Ki‐67 testing alone). The IHBRs increased significantly with increasing screening adherence rates and slightly with lower age at screening initiation, with a reduction in HPV incidence or with lower Pap‐test sensitivity. Depending on the accepted IHBR threshold, biennial or triennial HPV‐based screening in women from age 30 onwards and biennial cytology in younger women may be considered in opportunistic screening settings with low or moderate adherence such as in Austria. In organized settings with high screening adherence and in postvaccination settings with lower HPV prevalence, the interval may be prolonged. What's new?
While human papillomavirus (HPV)‐based testing is effective for cervical cancer detection, trade‐offs between benefits and harms, particularly overtreatment associated with HPV testing, remain poorly defined. This study, using model simulations, suggests that over the same screening interval, HPV testing is more effective than conventional Pap cytology and p16/Ki‐67 staining but carries a risk of overtreatment, particularly in settings with short screening intervals. Relative to annual cytology, HPV and Pap cotesting results in similar benefit but fewer unnecessary treatments. The findings indicate that harm‐benefit ratios for different screening methods vary depending on screening setting, interval, adherence, and HPV‐vaccination status. [ABSTRACT FROM AUTHOR]
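The incremental harm-benefit ratio reported in this abstract is the number of extra unnecessary conizations incurred per additional cancer-related death prevented when stepping from one strategy to the next more effective one. A small sketch of that calculation, using made-up strategy outputs rather than the Austrian model's results:

```python
# Incremental harm-benefit ratio (IHBR) sketch: strategies are ordered by
# increasing effectiveness; IHBR = delta harms / delta benefits between
# consecutive strategies. Numbers below are made up for illustration.

def incremental_hbr(strategies):
    """strategies: (name, unnecessary_conizations, deaths_prevented),
    ordered from least to most effective."""
    ratios = []
    for (_, h0, b0), (name, h1, b1) in zip(strategies, strategies[1:]):
        ratios.append((name, (h1 - h0) / (b1 - b0)))
    return ratios

strategies = [
    ("quinquennial screening", 100, 10.0),
    ("triennial screening",    180, 12.0),
    ("biennial screening",     300, 14.0),
]

for name, r in incremental_hbr(strategies):
    print(f"{name}: {r:.0f} extra conizations per extra death prevented")
```

Choosing a strategy then means picking the most effective option whose IHBR still falls below the decision maker's accepted harm-per-benefit threshold.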
HEALTH policy, RESEARCH, LIFE expectancy, RESEARCH methodology, MEDICAL cooperation, EVALUATION research, NATIONAL health services, COMPARATIVE studies, COST effectiveness, RESEARCH funding, HEALTH equity
Abstract
Alternative strategies can reduce road vehicle emissions, with differential effects on exposure across population groups. We compare alternative strategies in West Yorkshire using a framework for economic evaluation that considers multiple perspectives and that takes account of the distribution of health outcomes. Exposure to pollutants by area is converted, via dose-response relationships, into disease averted. Health benefits and National Health Service costs from diseases are estimated conditional on population demographics and index of multiple deprivation. The net health benefits from alternative strategies are expressed as distributions of quality-adjusted life expectancy (QALE), which are compared using dominance criteria and societal aversion to health inequality. Net production is estimated from intervention costs and the effects of health improvement on production and consumption. Social care outcomes are estimated from health improvement among care recipients and changes in care expenditure. A switch to less polluting private vehicles is dominant in terms of the distribution of QALE and social care outcomes but not consumption. Inclusion of health inequality aversion alters the rank order compared with prioritisation on health maximisation. The results were sensitive to the magnitude of health opportunity costs, the level of inequality aversion, and the proportion of intervention cost that generates health opportunity cost. [ABSTRACT FROM AUTHOR]
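The framework above ranks strategies on the distribution of net health benefit across subgroups: each subgroup's QALE gain minus the health forgone when the strategy's costs displace other NHS spending. A toy sketch under an assumed opportunity-cost threshold (the threshold value, subgroup labels, and all numbers are hypothetical, not West Yorkshire estimates):

```python
# Net health benefit (NHB) per subgroup, as in a distributional economic
# evaluation: NHB = QALE gain - NHS cost / k, where k is the assumed health
# opportunity-cost threshold (cost per unit of QALE displaced elsewhere).
# All inputs are hypothetical.

K = 15_000.0  # assumed cost per unit of QALE displaced

def net_health_benefit(qale_gain, nhs_cost):
    return qale_gain - nhs_cost / K

subgroups = {
    "least deprived areas": net_health_benefit(qale_gain=120.0, nhs_cost=900_000.0),
    "most deprived areas":  net_health_benefit(qale_gain=300.0, nhs_cost=900_000.0),
}

total = sum(subgroups.values())
print(subgroups, "total NHB:", total)
```

With inequality aversion, the subgroup gains would be weighted (e.g., giving greater weight to the most deprived areas) before summing, which is how the rank order of strategies can differ from simple health maximisation.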