24 results on '"Cox, Louis Anthony (Tony)"'
Search Results
2. Overcoming Learning Aversion in Evaluating and Managing Uncertain Risks.
- Author
- Cox LA Jr
- Subjects
- Humans, Avoidance Learning, Risk Management, Uncertainty
- Abstract
Decision biases can distort cost-benefit evaluations of uncertain risks, leading to risk management policy decisions with predictably high retrospective regret. We argue that well-documented decision biases encourage learning aversion, or predictably suboptimal learning and premature decision making in the face of high uncertainty about the costs, risks, and benefits of proposed changes. Biases such as narrow framing, overconfidence, confirmation bias, optimism bias, ambiguity aversion, and hyperbolic discounting of the immediate costs and delayed benefits of learning, contribute to deficient individual and group learning, avoidance of information seeking, underestimation of the value of further information, and hence needlessly inaccurate risk-cost-benefit estimates and suboptimal risk management decisions. In practice, such biases can create predictable regret in selection of potential risk-reducing regulations. Low-regret learning strategies based on computational reinforcement learning models can potentially overcome some of these suboptimal decision processes by replacing aversion to uncertain probabilities with actions calculated to balance exploration (deliberate experimentation and uncertainty reduction) and exploitation (taking actions to maximize the sum of expected immediate reward, expected discounted future reward, and value of information). We discuss the proposed framework for understanding and overcoming learning aversion and for implementing low-regret learning strategies using regulation of air pollutants with uncertain health effects as an example., (© 2015 Society for Risk Analysis.)
- Published
- 2015
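The balance between exploration and exploitation invoked in this abstract can be illustrated with a minimal epsilon-greedy bandit sketch; the two success probabilities and the exploration rate below are invented for illustration and are not from the article:

```python
import random

def average_reward(eps, steps=2000, seed=1):
    """Two candidate policies with unknown success rates; eps is the
    fraction of decisions spent deliberately exploring (hypothetical setup)."""
    rng = random.Random(seed)
    true_means = [0.5, 0.7]          # policy 1 is actually better
    counts = [0, 0]
    estimates = [0.0, 0.0]
    total = 0.0
    for _ in range(steps):
        if rng.random() < eps:
            a = rng.randrange(2)     # explore: deliberate experimentation
        else:                        # exploit: act on current estimates only
            a = max(range(2), key=lambda i: estimates[i])
        r = 1.0 if rng.random() < true_means[a] else 0.0
        counts[a] += 1
        estimates[a] += (r - estimates[a]) / counts[a]   # incremental mean
        total += r
    return total / steps

# A purely exploitative ("learning-averse") agent can lock onto the worse
# policy forever; a little exploration earns a higher long-run reward.
print(average_reward(0.0), average_reward(0.1))
```

The low-regret strategies discussed in the abstract are more sophisticated than epsilon-greedy, but the failure mode is the same: refusing to pay the immediate cost of experimentation forfeits the larger delayed benefit of learning.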
3. Quantitative assessment of human MRSA risks from swine.
- Author
- Cox LA Jr and Popken DA
- Subjects
- Animals, Humans, Staphylococcal Infections microbiology, Swine, Methicillin-Resistant Staphylococcus aureus isolation & purification, Staphylococcal Infections transmission, Zoonoses transmission
- Abstract
The public health community, news media, and members of the general public have expressed significant concern that methicillin-resistant Staphylococcus aureus (MRSA) transmitted from pigs to humans may harm human health. Studies of the prevalence and dynamics of swine-associated (ST398) MRSA have sampled MRSA at discrete points in the presumed causative chain leading from swine to human patients, including sampling bacteria from live pigs, retail meats, farm workers, and hospital patients. Nonzero prevalence is generally interpreted as indicating a potential human health hazard from MRSA infections, but quantitative assessments of resulting risks are not usually provided. This article integrates available data from several sources to construct a conservative (plausible upper bound) probability estimate for the actual human health harm (MRSA infections and fatalities) arising from ST398-MRSA from pigs. The model provides plausible upper bounds of approximately one excess human infection per year among all U.S. pig farm workers, and one human infection per 31 years among the remaining total population of the United States. These results assume the possibility of transmission events not yet observed, so additional data collection may reduce these estimates further., (© 2014 Society for Risk Analysis.)
- Published
- 2014
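The "plausible upper bound" style of calculation described here multiplies conservative bounds on each link of the presumed causal chain. A schematic version, with placeholder factor values that are not the paper's, looks like:

```python
# Each factor bounds one link in the presumed chain from swine to human
# infection; all values below are illustrative placeholders only.
factors = {
    "P(pig carries ST398-MRSA)":                0.1,
    "P(transfer to a worker, per year)":        0.01,
    "P(colonization given transfer)":           0.05,
    "P(clinical infection given colonization)": 0.01,
}

upper_bound = 1.0
for step, p in factors.items():
    upper_bound *= p               # conservative bounds multiply through

workers = 100_000                  # hypothetical exposed population size
expected_infections = workers * upper_bound
print(f"<= {expected_infections:.3f} excess infections/year")
```

Because every factor is a bound rather than a point estimate, the product bounds the true risk from above; observing any link to be rarer than assumed only lowers the result, which is why the abstract notes that additional data may reduce the estimates further.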
4. Improving causal inferences in risk analysis.
- Author
- Cox LA Jr
- Subjects
- Models, Statistical, Causality, Risk Assessment
- Abstract
Recent headlines and scientific articles projecting significant human health benefits from changes in exposures too often depend on unvalidated subjective expert judgments and modeling assumptions, especially about the causal interpretation of statistical associations. Some of these assessments are demonstrably biased toward false positives and inflated effects estimates. More objective, data-driven methods of causal analysis are available to risk analysts. These can help to reduce bias and increase the credibility and realism of health effects risk assessments and causal claims. For example, quasi-experimental designs and analysis allow alternative (noncausal) explanations for associations to be tested, and refuted if appropriate. Panel data studies examine empirical relations between changes in hypothesized causes and effects. Intervention and change-point analyses identify effects (e.g., significant changes in health effects time series) and estimate their sizes. Granger causality tests, conditional independence tests, and counterfactual causality models test whether a hypothesized cause helps to predict its presumed effects, and quantify exposure-specific contributions to response rates in differently exposed groups, even in the presence of confounders. Causal graph models let causal mechanistic hypotheses be tested and refined using biomarker data. These methods can potentially revolutionize the study of exposure-induced health effects, helping to overcome pervasive false-positive biases and move the health risk assessment scientific community toward more accurate assessments of the impacts of exposures and interventions on public health., (© 2013 Society for Risk Analysis.)
- Published
- 2013
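One of the methods named in this abstract, a Granger-style test of whether a hypothesized cause improves prediction of its presumed effect beyond the effect's own history, can be sketched as follows (the data-generating coefficients are invented for illustration):

```python
import random

def corr(u, v):
    """Pearson correlation, computed from scratch for self-containment."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

def incremental_signal(x_drives_y, n=5000, seed=7):
    """Correlate lagged exposure with what remains of the response after
    removing the response's own past; clearly nonzero only if x helps
    predict y beyond y's history."""
    rng = random.Random(seed)
    x, y = [rng.gauss(0, 1)], [0.0]
    for _ in range(n):
        x.append(rng.gauss(0, 1))
        effect = 0.8 * x[-2] if x_drives_y else 0.0
        y.append(0.5 * y[-1] + effect + rng.gauss(0, 1))
    yt, ylag, xlag = y[2:], y[1:-1], x[1:-1]
    m_t, m_l = sum(yt) / len(yt), sum(ylag) / len(ylag)
    slope = (sum((a - m_t) * (b - m_l) for a, b in zip(yt, ylag))
             / sum((b - m_l) ** 2 for b in ylag))
    resid = [a - slope * b for a, b in zip(yt, ylag)]
    return corr(resid, xlag)

print(incremental_signal(True), incremental_signal(False))
```

When the exposure truly drives the response, the residual correlation is large; when it does not, the correlation hovers near zero, which is the operational sense in which such tests let noncausal explanations be "tested, and refuted if appropriate."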
5. Introduction to the special issue: insights and applications from financial risk analysis.
- Author
- Novosyolov A and Cox LA Jr
- Published
- 2012
6. Evaluating and improving risk formulas for allocating limited budgets to expensive risk-reduction opportunities.
- Author
- Cox LA Jr
- Abstract
Simple risk formulas, such as risk = probability × impact, or risk = exposure × probability × consequence, or risk = threat × vulnerability × consequence, are built into many commercial risk management software products deployed in public and private organizations. These formulas, which we call risk indices, together with risk matrices, "heat maps," and other displays based on them, are widely used in applications such as enterprise risk management (ERM), terrorism risk analysis, and occupational safety. But, how well do they serve to guide allocation of limited risk management resources? This article evaluates and compares different risk indices under simplifying conditions favorable to their use (statistically independent, uniformly distributed values of their components; and noninteracting risk-reduction opportunities). Compared to an optimal (nonindex) approach, simple indices produce inferior resource allocations that for a given cost may reduce risk by as little as 60% of what the optimal decisions would provide, at least in our simple simulations. This article suggests a better risk reduction per unit cost index that achieves 98-100% of the maximum possible risk reduction on these problems for all budget levels except the smallest, which allow very few risks to be addressed. Substantial gains in risk reduction achieved for resources spent can be obtained on our test problems by using this improved index instead of simpler ones that focus only on relative sizes of risk (or of components of risk) in informing risk management priorities and allocating limited risk management resources. This work suggests the need for risk management tools to explicitly consider costs in prioritization activities, particularly in situations where budget restrictions make careful allocation of resources essential for achieving close-to-maximum risk-reduction benefits., (© 2011 Society for Risk Analysis.)
- Published
- 2012
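The advantage of a risk-reduction-per-unit-cost index over ranking by risk size alone shows up already in a four-item toy budget problem (all costs and reductions invented):

```python
def allocate(items, budget, score):
    """Fund items in decreasing score order until the budget is exhausted."""
    spent = achieved = 0.0
    for it in sorted(items, key=score, reverse=True):
        if spent + it["cost"] <= budget:
            spent += it["cost"]
            achieved += it["reduction"]
    return achieved

items = [
    {"name": "A", "cost": 10, "reduction": 10},
    {"name": "B", "cost": 2,  "reduction": 6},
    {"name": "C", "cost": 2,  "reduction": 5},
    {"name": "D", "cost": 2,  "reduction": 4},
]
budget = 10

by_size  = allocate(items, budget, lambda it: it["reduction"])
by_ratio = allocate(items, budget, lambda it: it["reduction"] / it["cost"])
print(by_size, by_ratio)  # size ranking funds only A (10); ratio funds B, C, D (15)
```

Ranking by risk size spends the whole budget on the single largest risk; ranking by reduction per unit cost funds three cheaper items and achieves half again as much total reduction, illustrating why the article argues that risk management tools must consider costs explicitly.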
7. Clarifying types of uncertainty: when are models accurate, and uncertainties small?
- Author
- Cox LA Jr
- Subjects
- Risk Management organization & administration, Uncertainty
- Abstract
Professor Aven has recently noted the importance of clarifying the meaning of terms such as "scientific uncertainty" for use in risk management and policy decisions, such as when to trigger application of the precautionary principle. This comment examines some fundamental conceptual challenges for efforts to define "accurate" models and "small" input uncertainties by showing that increasing uncertainty in model inputs may reduce uncertainty in model outputs; that even correct models with "small" input uncertainties need not yield accurate or useful predictions for quantities of interest in risk management (such as the duration of an epidemic); and that accurate predictive models need not be accurate causal models., (© 2011 Society for Risk Analysis.)
- Published
- 2011
8. An exposure-response threshold for lung diseases and lung cancer caused by crystalline silica.
- Author
- Cox LA Jr
- Subjects
- Crystallization, Dose-Response Relationship, Drug, Humans, Lung metabolism, Lung Diseases metabolism, Reactive Nitrogen Species metabolism, Reactive Oxygen Species, Silicon Dioxide chemistry, Tumor Necrosis Factor-alpha metabolism, Lung Diseases chemically induced, Occupational Exposure, Silicon Dioxide toxicity
- Abstract
Whether crystalline silica (CS) exposure increases risk of lung cancer in humans without silicosis, and, if so, whether the exposure-response relation has a threshold, have been much debated. Epidemiological evidence is ambiguous and conflicting. Experimental data show that high levels of CS cause lung cancer in rats, although not in other species, including mice, guinea pigs, or hamsters; but the relevance of such animal data to humans has been uncertain. This article applies recent insights into the toxicology of lung diseases caused by poorly soluble particles (PSPs), and by CS in particular, to model the exposure-response relation between CS and risk of lung pathologies such as chronic inflammation, silicosis, fibrosis, and lung cancer. An inflammatory mode of action is described, having substantial empirical support, in which exposure increases alveolar macrophages and neutrophils in the alveolar epithelium, leading to increased reactive oxygen species (ROS) and nitrogen species (RNS), pro-inflammatory mediators such as TNF-alpha, and eventual damage to lung tissue and epithelial hyperplasia, resulting in fibrosis and increased lung cancer risk among silicotics. This mode of action involves several positive feedback loops. Exposures that increase the gain factors around such loops can create a disease state with elevated levels of ROS, TNF-alpha, TGF-beta, alveolar macrophages, and neutrophils. This mechanism implies a "tipping point" threshold for the exposure-response relation. Applying this new model to epidemiological data, we conclude that current permissible exposure levels, on the order of 0.1 mg/m³, are probably below the threshold for triggering lung diseases in humans., (© 2011 Society for Risk Analysis.)
- Published
- 2011
9. How probabilistic risk assessment can mislead terrorism risk analysts.
- Author
- Brown GG and Cox LA Jr
- Subjects
- Decision Support Techniques, Probability, Risk Assessment, Terrorism
- Abstract
Traditional probabilistic risk assessment (PRA), of the type originally developed for engineered systems, is still proposed for terrorism risk analysis. We show that such PRA applications are unjustified in general. The capacity of terrorists to seek and use information and to actively research different attack options before deciding what to do raises unique features of terrorism risk assessment that are not adequately addressed by conventional PRA for natural and engineered systems-in part because decisions based on such PRA estimates do not adequately hedge against the different probabilities that attackers may eventually act upon. These probabilities may differ from the defender's (even if the defender's experts are thoroughly trained, well calibrated, unbiased probability assessors) because they may be conditioned on different information. We illustrate the fundamental differences between PRA and terrorism risk analysis, and suggest use of robust decision analysis for risk management when attackers may know more about some attack options than we do., (© 2010 Society for Risk Analysis.)
- Published
- 2011
10. A causal model of chronic obstructive pulmonary disease (COPD) risk.
- Author
- Cox LA Jr
- Subjects
- Animals, Apoptosis, Autoimmunity, Disease Progression, Feedback, Physiological, Humans, Macrophages physiology, Mathematical Concepts, Mice, Neutrophils physiology, Oxidants metabolism, Peptide Hydrolases metabolism, Pulmonary Alveoli immunology, Pulmonary Alveoli pathology, Pulmonary Alveoli physiopathology, Pulmonary Disease, Chronic Obstructive immunology, Pulmonary Disease, Chronic Obstructive pathology, Pulmonary Disease, Chronic Obstructive physiopathology, Risk Factors, Smoking adverse effects, Smoking Cessation, Models, Biological, Pulmonary Disease, Chronic Obstructive etiology
- Abstract
Research on the etiology of chronic obstructive pulmonary disease (COPD), an irreversible degenerative lung disease affecting 15% to 20% of smokers, has blossomed over the past half-century. Profound new insights have emerged from a combination of in vitro and -omics studies on affected lung cell populations (including cytotoxic CD8(+) T lymphocytes, regulatory CD4(+) helper T cells, dendritic cells, alveolar macrophages and neutrophils, alveolar and bronchiolar epithelial cells, goblet cells, and fibroblasts) and extracellular matrix components (especially, elastin and collagen fibers); in vivo studies on wild-type and genetically engineered mice and other rodents; clinical investigation of cell- and molecular-level changes in asymptomatic smokers and COPD patients; genetic studies of susceptible and rapidly-progressing phenotypes (both human and animal); biomarker studies of enzyme and protein degradation products in induced sputum, bronchiolar lavage, urine, and blood; and epidemiological and clinical investigations of the time course of disease progression. To this rich mix of data, we add a relatively simple in silico computational model that incorporates recent insights into COPD disease causation and progression. Our model explains irreversible degeneration of lung tissue as resulting from a cascade of positive feedback loops: a macrophage inflammation loop, a neutrophil inflammation loop, and an alveolar epithelial cell apoptosis loop. Unrepaired damage results in clinical symptoms. The resulting model illustrates how to simplify and make more understandable the main aspects of the very complex dynamics of COPD initiation and progression, as well as how to predict the effects on risk of interventions that affect specific biological responses., (© 2010 Society for Risk Analysis.)
- Published
- 2011
11. Assessing potential human health hazards and benefits from subtherapeutic antibiotics in the United States: tetracyclines as a case study.
- Author
- Cox LA Jr and Popken DA
- Subjects
- Animal Feed analysis, Animals, Anti-Bacterial Agents therapeutic use, Europe epidemiology, Hazardous Substances, Humans, Politics, Public Policy, Risk, Risk Assessment, Risk Management, Tetracycline, Tetracyclines, United States epidemiology, Anti-Bacterial Agents analysis
- Abstract
Many scientists, activists, regulators, and politicians have expressed urgent concern that using antibiotics in food animals selects for resistant strains of bacteria that harm human health and bring nearer a "postantibiotic era" of multidrug resistant "super-bugs." Proposed political solutions, such as the Preservation of Antibiotics for Medical Treatment Act (PAMTA), would ban entire classes of subtherapeutic antibiotics (STAs) now used for disease prevention and growth promotion in food animals. The proposed bans are not driven by formal quantitative risk assessment (QRA), but by a perceived need for immediate action to prevent potential catastrophe. Similar fears led to STA phase-outs in Europe a decade ago. However, QRA and empirical data indicate that continued use of STAs in the United States has not harmed human health, and bans in Europe have not helped human health. The fears motivating PAMTA contrast with QRA estimates of vanishingly small risks. As a case study, examining specific tetracycline uses and resistance patterns suggests that there is no significant human health hazard from continued use of tetracycline in food animals. Simple hypothetical calculations suggest an unobservably small risk (between 0 and 1.75E-11 excess lifetime risk of a tetracycline-resistant infection), based on the long history of tetracycline use in the United States without resistance-related treatment failures. QRAs for other STA uses in food animals also find that human health risks are vanishingly small. Whether such QRA calculations will guide risk management policy for animal antibiotics in the United States remains to be seen.
- Published
- 2010
12. Why reduced-form regression models of health effects versus exposures should not replace QRA: livestock production and infant mortality as an example.
- Author
- Cox LA Jr
- Subjects
- Animal Husbandry statistics & numerical data, Animals, Cattle, Causality, Humans, Infant, Infant, Newborn, Models, Biological, Models, Statistical, Poultry, Regression Analysis, Risk Management statistics & numerical data, Sus scrofa, Animals, Domestic, Infant Mortality, Risk Assessment statistics & numerical data
- Abstract
Do pollution emissions from livestock operations increase infant mortality rate (IMR)? A recent regression analysis of changes in IMR against changes in aggregate "animal units" (a weighted sum of cattle, pig, and poultry numbers) over time, for counties throughout the United States, suggested the provocative conclusion that they do: "[A] doubling of production leads to a 7.4% increase in infant mortality." Yet, we find that regressing IMR changes against changes in specific components of "animal units" (cattle, pigs, and broilers) at the state level reveals statistically significant negative associations between changes in livestock production (especially, cattle production) and changes in IMR. We conclude that statistical associations between livestock variables and IMR variables are very sensitive to modeling choices (e.g., selection of explanatory variables, and use of specific animal types vs. aggregate "animal units"). Such associations, whether positive or negative, do not warrant causal interpretation. We suggest that standard methods of quantitative risk assessment (QRA), including emissions release (source) models, fate and transport modeling, exposure assessment, and dose-response modeling, really are important, and indeed perhaps essential, for drawing valid causal inferences about health effects of exposures to guide sound, well-informed public health risk management policy. Reduced-form regression models, which skip most or all of these steps, can only quantify statistical associations (which may be due to model specification, variable selection, residual confounding, or other noncausal factors). Sound risk management requires the extra work needed to identify and model valid causal relations.
- Published
- 2009
13. What's wrong with hazard-ranking systems? An expository note.
- Author
- Cox LA Jr
- Subjects
- Probability, Risk Factors, Risk Management, United States, United States Environmental Protection Agency, Environmental Monitoring, Environmental Pollutants adverse effects, Risk Assessment methods, Risk Reduction Behavior
- Abstract
Two commonly recommended principles for allocating risk management resources to remediate uncertain hazards are: (1) select a subset to maximize risk-reduction benefits (e.g., maximize the von Neumann-Morgenstern expected utility of the selected risk-reducing activities), and (2) assign priorities to risk-reducing opportunities and then select activities from the top of the priority list down until no more can be afforded. When different activities create uncertain but correlated risk reductions, as is often the case in practice, then these principles are inconsistent: priority scoring and ranking fails to maximize risk-reduction benefits. Real-world risk priority scoring systems used in homeland security and terrorism risk assessment, environmental risk management, information system vulnerability rating, business risk matrices, and many other important applications do not exploit correlations among risk-reducing opportunities or optimally diversify risk-reducing investments. As a result, they generally make suboptimal risk management recommendations. Applying portfolio optimization methods instead of risk prioritization ranking, rating, or scoring methods can achieve greater risk-reduction value for resources spent.
- Published
- 2009
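The inconsistency between priority ranking and benefit maximization appears as soon as the two top-ranked activities address the same underlying risk, a simple stand-in for correlated risk reductions. A minimal sketch with invented numbers:

```python
from itertools import combinations

# Each activity reduces one underlying risk; activities aimed at the same
# risk are redundant, so a portfolio's benefit counts only the best
# reduction per risk rather than the sum (a crude form of correlation).
activities = {"a1": ("R1", 9), "a2": ("R1", 8), "a3": ("R2", 4)}

def portfolio_benefit(chosen):
    best = {}
    for a in chosen:
        risk, reduction = activities[a]
        best[risk] = max(best.get(risk, 0), reduction)
    return sum(best.values())

# Priority ranking: fund the two highest-scoring activities.
ranked = sorted(activities, key=lambda a: activities[a][1], reverse=True)[:2]
# Portfolio optimization: brute-force the best pair instead.
optimal = max(combinations(activities, 2), key=portfolio_benefit)

print(ranked, portfolio_benefit(ranked))    # ['a1', 'a2'] -> 9 (redundant pair)
print(optimal, portfolio_benefit(optimal))  # ('a1', 'a3') -> 13
```

Ranking funds two activities that both address risk R1 and largely duplicate each other; the optimized portfolio diversifies across risks and delivers more total reduction for the same spend, which is the note's central point.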
14. Human health risk assessment of penicillin/aminopenicillin resistance in enterococci due to penicillin use in food animals.
- Author
- Cox LA Jr, Popken DA, and Mathers JJ
- Subjects
- Animals, Anti-Bacterial Agents adverse effects, Drug Resistance, Microbial, Gram-Positive Bacterial Infections microbiology, Gram-Positive Bacterial Infections mortality, Humans, Intensive Care Units, Penicillanic Acid adverse effects, Penicillanic Acid pharmacology, Penicillins adverse effects, Risk Assessment, Animals, Domestic, Anti-Bacterial Agents pharmacology, Enterococcus faecium drug effects, Penicillanic Acid analogs & derivatives, Penicillins pharmacology
- Abstract
Penicillin and ampicillin drugs are approved for use in food animals in the United States to treat, control, and prevent diseases, and penicillin is approved for use to improve growth rates in pigs and poultry. This article considers the possibility that such uses might increase the incidence of ampicillin-resistant Enterococcus faecium (AREF) of animal origin in human infections, leading to increased hospitalization and mortality due to reduced response to ampicillin or penicillin. We assess the risks from continued use of penicillin-based drugs in food animals in the United States, using several assumptions to overcome current scientific uncertainties and data gaps. Multiplying the total at-risk population of intensive care unit (ICU) patients by a series of estimated factors suggests that not more than 0.04 excess mortalities per year (under conservative assumptions) to 0.14 excess mortalities per year (under very conservative assumptions) might be prevented in the whole U.S. population if current use of penicillin drugs in food animals were discontinued and if this successfully reduced the prevalence of AREF infections among ICU patients. These calculations suggest that current penicillin usage in food animals in the United States presents very low (possibly zero) human health risks.
- Published
- 2009
15. A mathematical model of protease-antiprotease homeostasis failure in chronic obstructive pulmonary disease (COPD).
- Author
- Cox LA Jr
- Subjects
- Humans, Pulmonary Disease, Chronic Obstructive enzymology, Pulmonary Disease, Chronic Obstructive metabolism, Pulmonary Disease, Chronic Obstructive physiopathology, Homeostasis, Models, Theoretical, Peptide Hydrolases metabolism, Protease Inhibitors metabolism, Pulmonary Disease, Chronic Obstructive drug therapy
- Abstract
Chronic obstructive pulmonary disease (COPD), the fourth leading cause of death worldwide, has a puzzling etiology. Although it is a smoking-associated disease, only a minority of smokers develop it. Moreover, the disease continues to progress in COPD patients, even after smoking ceases. This article proposes a mathematical model of COPD that offers one possible explanation for both observations. Building on a conceptual model of COPD causation as resulting from protease-antiprotease imbalance in the lung, leading to ongoing proteolysis (digestion) of lung tissue by excess proteases, we formulate a system of seven ordinary differential equations (ODEs) with 18 parameters to describe the network of interacting homeostatic processes regulating the levels of key proteases (macrophage elastase (MMP-12) and neutrophil elastase (NE)) and antiproteases (alpha-1-antitrypsin and tissue inhibitor of metalloproteinase-1). We show that this system can be simplified to a single quadratic equation with only two parameters to predict the equilibrium behavior of the entire network. The model predicts two possible equilibrium behaviors: a unique stable "normal" (healthy) equilibrium or a "COPD" equilibrium with elevated levels of MMP-12 and NE (and of lung macrophages and neutrophils) and reduced levels of antiproteases. The COPD equilibrium is induced in the model only if cigarette smoking increases the average production of MMP-12 per alveolar macrophage above a certain threshold. Following smoking cessation, the predicted COPD equilibrium levels of MMP-12 and other disease markers decline, but do not return to their original (presmoking) levels. These and other predictions of the model are consistent with limited available human data.
- Published
- 2009
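The model's qualitative predictions (a healthy equilibrium, a disease equilibrium triggered only above a threshold, and disease markers that remain elevated after the trigger is removed) are characteristic of any positive-feedback switch. The one-variable caricature below illustrates this; it is not the paper's seven-ODE system, and every parameter is invented:

```python
def protease_level(stimulus_schedule, dt=0.01):
    """Euler-integrate dx/dt = s(t) + a*x^2/(1+x^2) - b*x, a toy
    positive-feedback loop (x ~ protease burden, s ~ smoking stimulus)."""
    a, b, x = 3.0, 1.0, 0.0
    for s in stimulus_schedule:            # one time unit per schedule entry
        for _ in range(int(1 / dt)):
            x += dt * (s + a * x * x / (1 + x * x) - b * x)
    return x

never_smoked = protease_level([0.0] * 60)
smoked_then_quit = protease_level([1.0] * 20 + [0.0] * 40)
print(never_smoked, smoked_then_quit)
```

Without the stimulus the system stays at the healthy equilibrium (x = 0); a sustained stimulus pushes it past the unstable threshold, and after the stimulus is withdrawn the state relaxes only to the elevated disease equilibrium, mirroring the paper's prediction that markers decline after smoking cessation but do not return to presmoking levels.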
16. Hormesis without cell killing.
- Author
- Cox LA Jr
- Subjects
- Animals, Cell Death drug effects, Cell Proliferation drug effects, Dose-Response Relationship, Drug, Humans, Poisson Distribution, Risk, Stochastic Processes, Cell Transformation, Neoplastic drug effects, Lung Neoplasms pathology, Models, Biological
- Abstract
Stochastic two-stage clonal expansion (TSCE) models of carcinogenesis offer the following clear theoretical explanation for U-shaped cancer dose-response relations. Low doses that kill initiated (premalignant) cells thereby create a protective effect. At higher doses, this effect is overwhelmed by an increase in the net number of initiated cells. The sum of these two effects, from cell killing and cell proliferation, gives a U-shaped or J-shaped dose-response relation. This article shows that exposures that do not kill, repair, or decrease cell populations, but that only hasten transitions that lead to cancer, can also generate U-shaped and J-shaped dose-response relations in a competing-risk (modified TSCE) framework where exposures disproportionately hasten transitions into carcinogenic pathways with relatively long times to tumor. Quantitative modeling of the competing effects of more transitions toward carcinogenesis (risk increasing) and a higher proportion of transitions into the slower pathway (risk reducing) shows that a J-shaped dose-response relation can occur even if exposure increases the number of initiated cells at every positive dose level. This suggests a possible new explanation for hormetic dose-response relations in response to carcinogenic exposures that do not have protective (cell-killing) effects. In addition, the examples presented emphasize the role of time in hormesis: exposures that monotonically increase risks at younger ages may nonetheless produce a U-shaped or J-shaped dose-response relation for lifetime risk of cancer.
- Published
- 2009
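The competing-risk mechanism can be made concrete with a toy calculation in which dose both increases total initiations and routes a growing share of them into a slow pathway that rarely yields a tumor within a lifetime. All functional forms and numbers below are illustrative, not the paper's calibrated model:

```python
def lifetime_risk(dose):
    """Toy competing-pathway model: dose raises the overall transition rate
    but also the fraction routed into the slow (long time-to-tumor) pathway."""
    rate = 1.0 + dose                              # initiations grow with dose
    slow_fraction = 0.2 + 0.7 * dose / (dose + 0.5)
    p_fast, p_slow = 0.9, 0.05                     # P(tumor within lifetime) per pathway
    return rate * ((1 - slow_fraction) * p_fast + slow_fraction * p_slow)

risks = {d: lifetime_risk(d) for d in (0.0, 0.5, 5.0)}
print(risks)  # dips below the zero-dose risk at low dose, exceeds it at high dose
```

Even though every positive dose produces more initiated transitions than zero dose, the shift toward the slow pathway dominates at low doses and the rate increase dominates at high doses, tracing out the J-shaped lifetime-risk curve the abstract describes.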
17. Some limitations of "Risk = Threat x Vulnerability x Consequence" for risk analysis of terrorist attacks.
- Author
- Cox LA Jr
- Abstract
Several important risk analysis methods now used in setting priorities for protecting U.S. infrastructures against terrorist attacks are based on the formula: Risk = Threat x Vulnerability x Consequence. This article identifies potential limitations in such methods that can undermine their ability to guide resource allocations to effectively optimize risk reductions. After considering specific examples for the Risk Analysis and Management for Critical Asset Protection (RAMCAP) framework used by the Department of Homeland Security, we address more fundamental limitations of the product formula. These include its failure to adjust for correlations among its components, nonadditivity of risks estimated using the formula, inability to use risk-scoring results to optimally allocate defensive resources, and intrinsic subjectivity and ambiguity of Threat, Vulnerability, and Consequence numbers. Trying to directly assess probabilities for the actions of intelligent antagonists instead of modeling how they adaptively pursue their goals in light of available information and experience can produce ambiguous or mistaken risk estimates. Recent work demonstrates that two-level (or few-level) hierarchical optimization models can provide a useful alternative to Risk = Threat x Vulnerability x Consequence scoring rules, and also to probabilistic risk assessment (PRA) techniques that ignore rational planning and adaptation. In such two-level optimization models, defender predicts attacker's best response to defender's own actions, and then chooses his or her own actions taking into account these best responses. Such models appear valuable as practical approaches to antiterrorism risk analysis.
- Published
- 2008
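The two-level idea sketched in this abstract, in which the defender anticipates the attacker's best response instead of assigning fixed attack probabilities, fits in a few lines for a toy three-target problem (target values and the mitigation factor are invented):

```python
TARGET_VALUES = {"A": 10, "B": 8, "C": 3}   # consequence if hit while undefended
MITIGATION = 0.8                            # fraction of consequence removed by defense

def damage(target, defended):
    v = TARGET_VALUES[target]
    return v * (1 - MITIGATION) if target == defended else v

def attacker_best_response(defended):
    # Inner level: an informed attacker hits the most damaging remaining target.
    return max(TARGET_VALUES, key=lambda t: damage(t, defended))

def worst_case_loss(defended):
    return damage(attacker_best_response(defended), defended)

# Outer level: the defender minimizes the attacker's best-response damage.
best_defense = min(TARGET_VALUES, key=worst_case_loss)
print(best_defense, worst_case_loss(best_defense))  # defend A; attacker shifts to B
```

Note what a static Threat x Vulnerability x Consequence score misses: once A is defended, the attack shifts to B, so the relevant loss is determined by the adversary's adaptation, not by fixed per-target probabilities.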
18. Overcoming confirmation bias in causal attribution: a case study of antibiotic resistance risks.
- Author
- Cox LA Jr and Popken DA
- Subjects
- Animals, Bayes Theorem, Bird Diseases drug therapy, Food Supply, Humans, Models, Statistical, Poultry microbiology, Risk Assessment methods, Virginiamycin therapeutic use, Bias, Causality, Drug Resistance, Microbial
- Abstract
When they do not use formal quantitative risk assessment methods, many scientists (like other people) make mistakes and exhibit biases in reasoning about causation, if-then relations, and evidence. Decision-related conclusions or causal explanations are reached prematurely based on narrative plausibility rather than adequate factual evidence. Then, confirming evidence is sought and emphasized, but disconfirming evidence is ignored or discounted. This tendency has serious implications for health-related public policy discussions and decisions. We provide examples occurring in antimicrobial health risk assessments, including a case study of a recently reported positive relation between virginiamycin (VM) use in poultry and risk of resistance to VM-like (streptogramin) antibiotics in humans. This finding has been used to argue that poultry consumption causes increased resistance risks, that serious health impacts may result, and therefore use of VM in poultry should be restricted. However, the original study compared healthy vegetarians to hospitalized poultry consumers. Our examination of the same data using conditional independence tests for potential causality reveals that poultry consumption acted as a surrogate for hospitalization in this study. After accounting for current hospitalization status, no evidence remains supporting a causal relationship between poultry consumption and increased streptogramin resistance. This example emphasizes both the importance and the practical possibility of analyzing and presenting quantitative risk information using data analysis techniques (such as Bayesian model averaging (BMA) and conditional independence tests) that are as free as possible from potential selection, confirmation, and modeling biases.
- Published
- 2008
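The core statistical point, an association that vanishes after conditioning on hospitalization status, is easy to reproduce in simulation. The sampling fractions below are invented to mimic the study design, not taken from the data:

```python
import random

rng = random.Random(42)
rows = []
for _ in range(20000):
    hospitalized = rng.random() < 0.5
    # Design artifact: poultry consumers were sampled among hospital patients,
    # controls among healthy vegetarians, so consumption tracks hospitalization.
    poultry = rng.random() < (0.9 if hospitalized else 0.1)
    # Resistance depends only on hospitalization, not on poultry consumption.
    resistant = rng.random() < (0.3 if hospitalized else 0.05)
    rows.append((hospitalized, poultry, resistant))

def rate(select):
    hits = [res for (hosp, p, res) in rows if select(hosp, p)]
    return sum(hits) / len(hits)

marginal_gap = rate(lambda h, p: p) - rate(lambda h, p: not p)
stratified_gap = rate(lambda h, p: h and p) - rate(lambda h, p: h and not p)
print(marginal_gap, stratified_gap)  # large apparent effect; near zero within stratum
```

The marginal comparison shows a strong poultry-resistance association even though resistance was generated with no poultry effect at all; conditioning on hospitalization (the confounder built into the study design) makes the association disappear, which is exactly the conditional independence pattern the authors report.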
19. Why risk is not variance: an expository note.
- Author
- Cox LA Jr
- Subjects
- Decision Support Techniques, Probability, Risk
- Abstract
Variance (or standard deviation) of return is widely used as a measure of risk in financial investment risk analysis applications, where mean-variance analysis is applied to calculate efficient frontiers and undominated portfolios. Why, then, do health, safety, and environmental (HS&E) and reliability engineering risk analysts insist on defining risk more flexibly, as being determined by probabilities and consequences, rather than simply by variances? This note suggests an answer by providing a simple proof that mean-variance decision making violates the principle that a rational decisionmaker should prefer higher to lower probabilities of receiving a fixed gain, all else being equal. Indeed, simply hypothesizing a continuous increasing indifference curve for mean-variance combinations at the origin is enough to imply that a decisionmaker must find unacceptable some prospects that offer a positive probability of gain and zero probability of loss. Unlike some previous analyses of limitations of variance as a risk metric, this expository note uses only simple mathematics and does not require the additional framework of von Neumann-Morgenstern utility theory.
- Published
- 2008
- Full Text
- View/download PDF
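The abstract's central claim has a direct numeric instance. Assuming a simple mean-variance score U = mean - lam * variance (the coefficient lam and the lottery numbers below are invented for illustration), a prospect offering only a possible gain and no possible loss can still score below the status quo:

```python
# Minimal counterexample: a no-loss lottery rejected by mean-variance scoring.

def mv_utility(p, gain, lam):
    """Mean-variance score of a lottery paying `gain` with probability p, else 0."""
    mean = p * gain
    var = p * (1 - p) * gain ** 2   # variance of a two-point lottery
    return mean - lam * var

status_quo = 0.0                        # receive nothing for certain
free_ticket = mv_utility(p=0.5, gain=10, lam=0.5)
print(free_ticket)                      # -7.5: scored worse than the status quo
```

The lottery strictly dominates doing nothing (it can only pay out), yet the mean-variance criterion rejects it, which is the violation of rational preference that the note proves in general.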
20. Does concern-driven risk management provide a viable alternative to QRA?
- Author
-
Cox LA Jr
- Subjects
- Bacterial Infections epidemiology, Decision Making, Drug Resistance, Bacterial, Humans, Models, Theoretical, Perception, Public Policy, Uncertainty, Risk Assessment, Risk Management methods
- Abstract
This article discusses a concept of concern-driven risk management, in which qualitative expert judgments about whether concerns warrant specified risk management interventions are used in preference to quantitative risk assessment (QRA) to guide risk management decisions. Where QRA emphasizes formal quantitative assessment of the probable consequences caused by the recommended actions, and comparison to the probable consequences of alternatives, including the status quo, concern-driven risk management instead emphasizes perceived urgency or severity of the situation motivating recommended interventions. In many instances, especially those involving applications of the precautionary principle, no formal quantification or comparison of probable consequences for alternative decisions is seen as being necessary (or, perhaps, possible or desirable) prior to implementation of risk management measures. Such concern-driven risk management has been recommended by critics of QRA in several areas of applied risk management. Based on case studies and psychological literature on the empirical performance of judgment-based approaches to decision making under risk and uncertainty, we conclude that, although concern-driven risk management has several important potential political and psychological advantages over QRA, it is not clear that it performs better than (or as well as) QRA in identifying risk management interventions that successfully protect human health or achieve other desired consequences. Therefore, those who advocate replacing QRA with concern-driven alternatives, such as expert judgment and consensus decision processes, should assess whether their recommended alternatives truly outperform QRA, by the criterion of producing preferred consequences, before rejecting the QRA paradigm for practical applications.
- Published
- 2007
- Full Text
- View/download PDF
21. Quantifying potential health impacts of cadmium in cigarettes on smoker risk of lung cancer: a portfolio-of-mechanisms approach.
- Author
-
Cox LA Jr
- Subjects
- Animals, Cell Proliferation drug effects, DNA Repair drug effects, Disease Models, Animal, Genetic Predisposition to Disease, Humans, Lung Neoplasms chemically induced, Lung Neoplasms etiology, Lung Neoplasms genetics, Polymorphism, Genetic, Rats, Reactive Oxygen Species, Risk, Risk Assessment methods, Stochastic Processes, Cadmium toxicity, Smoking adverse effects
- Abstract
This article introduces an approach to estimating the uncertain potential effects on lung cancer risk of removing a particular constituent, cadmium (Cd), from cigarette smoke, given the useful but incomplete scientific information available about its modes of action. The approach considers normal cell proliferation; DNA repair inhibition in normal cells affected by initiating events; proliferation, promotion, and progression of initiated cells; and death or sparing of initiated and malignant cells as they are further transformed to become fully tumorigenic. Rather than estimating unmeasured model parameters by curve fitting to epidemiological or animal experimental tumor data, we attempt rough estimates of parameters based on their biological interpretations and comparison to corresponding genetic polymorphism data. The resulting parameter estimates are admittedly uncertain and approximate, but they suggest a portfolio approach to estimating impacts of removing Cd that gives usefully robust conclusions. This approach views Cd as creating a portfolio of uncertain health impacts that can be expressed as biologically independent relative risk factors having clear mechanistic interpretations. Because Cd can act through many distinct biological mechanisms, it appears likely (subjective probability greater than 40%) that removing Cd from cigarette smoke would reduce smoker risks of lung cancer by at least 10%, although it is possible (consistent with what is known) that the true effect could be much larger or smaller. Conservative estimates and assumptions made in this calculation suggest that the true impact could be greater for some smokers. This conclusion appears to be robust to many scientific uncertainties about Cd and smoking effects.
- Published
- 2006
- Full Text
- View/download PDF
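The portfolio idea in the abstract above, treating Cd as contributing several biologically independent multiplicative relative-risk factors, can be sketched as follows. The individual RR values are invented placeholders, not the paper's estimates:

```python
# Portfolio-of-mechanisms sketch (illustrative RR values): each independent
# mechanism contributes a multiplicative relative-risk factor; removing Cd
# resets every factor to 1, so the risk reduction is 1 - 1/(product of RRs).

rr_factors = {
    "proliferation of normal cells": 1.05,
    "DNA repair inhibition":         1.04,
    "promotion of initiated cells":  1.03,
    "sparing of malignant cells":    1.02,
}

combined_rr = 1.0
for mechanism, rr in rr_factors.items():
    combined_rr *= rr

# Fractional lung-cancer risk reduction if Cd (hence every factor) is removed:
reduction = 1.0 - 1.0 / combined_rr
print(combined_rr, reduction)   # combined RR ~1.147, reduction ~12.8%
```

Because the mechanisms multiply, several individually modest factors compound into a non-trivial combined effect, which is how the paper can find a likely reduction of at least 10% despite large uncertainty about any single mechanism.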
22. Estimating preventable fractions of disease caused by a specified biological mechanism: PAHs in smoking lung cancers as an example.
- Author
-
Cox LA Jr and Sanders E
- Subjects
- 7,8-Dihydro-7,8-dihydroxybenzo(a)pyrene 9,10-oxide metabolism, DNA Adducts drug effects, DNA Adducts metabolism, Genes, p53 drug effects, Humans, Lung Neoplasms genetics, Lung Neoplasms metabolism, Models, Biological, Models, Statistical, Mutation, Risk Assessment, Lung Neoplasms etiology, Lung Neoplasms prevention & control, Polycyclic Aromatic Hydrocarbons toxicity, Smoking adverse effects
- Abstract
Epidemiology textbooks often interpret population attributable fractions based on 2 x 2 tables or logistic regression models of exposure-response associations as preventable fractions, i.e., as fractions of illnesses in a population that would be prevented if exposure were removed. In general, this causal interpretation is not correct, since statistical association need not indicate causation; moreover, it does not identify how much risk would be prevented by removing specific constituents of complex exposures. This article introduces and illustrates an approach to calculating useful bounds on preventable fractions, having valid causal interpretations, from the types of partial but useful molecular epidemiological and biological information often available in practice. The method applies probabilistic risk assessment concepts from systems reliability analysis, together with bounding constraints for the relationship between event probabilities and causation (such as that the probability that exposure X causes response Y cannot exceed the probability that exposure X precedes response Y, or the probability that both X and Y occur), to bound the contribution to causation from specific causal pathways. We illustrate the approach by estimating an upper bound on the contribution to lung cancer risk made by a specific, much-discussed causal pathway that links smoking to lung cancer through formation of polycyclic aromatic hydrocarbon (PAH) adducts (specifically, benzo(a)pyrene diol epoxide-DNA adducts) at hot spot codons of p53 in lung cells. The result is a surprisingly small preventable fraction (of perhaps 7% or less) for this pathway, suggesting that it will be important to consider other mechanisms and non-PAH constituents of tobacco smoke in designing less risky tobacco-based products.
- Published
- 2006
- Full Text
- View/download PDF
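The bounding logic described above can be sketched with a toy calculation. The step probabilities are invented for illustration (they are not the paper's data): since each step is necessary for the pathway to have caused a case, the preventable fraction attributable to the pathway cannot exceed the product of the step probabilities.

```python
# Illustrative upper bound on the preventable fraction for one causal pathway:
# P(pathway caused the case) <= product over necessary steps of P(step occurred).
# All numbers below are hypothetical placeholders.

pathway_steps = {
    "lung cancer carries a p53 mutation":           0.5,
    "p53 mutation is at a BPDE hot-spot codon":     0.3,
    "hot-spot mutation was in fact caused by BPDE": 0.45,
}

bound = 1.0
for step, prob in pathway_steps.items():
    bound *= prob   # each necessary step caps the pathway's possible contribution

print(bound)   # ~0.0675, i.e., an upper bound of roughly 7% for this pathway
```

The bound is conservative: it holds even if the steps are correlated in ways that data cannot resolve, which is what gives the resulting preventable fraction a valid causal interpretation.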
23. Some limitations of a proposed linear model for antimicrobial risk management.
- Author
-
Cox LA Jr
- Subjects
- Animals, Campylobacter drug effects, Campylobacter isolation & purification, Campylobacter pathogenicity, Campylobacter Infections etiology, Chickens microbiology, Drug Resistance, Bacterial, Fluoroquinolones pharmacology, Foodborne Diseases etiology, Humans, Linear Models, Meat microbiology, Risk Assessment, United States, United States Food and Drug Administration, Food Microbiology legislation & jurisprudence, Risk Management statistics & numerical data
- Abstract
The FDA Center for Veterinary Medicine (CVM) (Bartholomew et al., 2005) recently proposed an approach to risk management based on the linear modeling framework: Risk = K x Exposure. They suggest that, once K has been estimated from historical data, it can be used to predict how limiting future exposure will reduce future risk. They illustrate the approach for fluoroquinolone-resistant campylobacter in chicken. However, despite its appealing simplicity, the proposed approach confuses a possibly meaningless descriptive statistical ratio with a valid predictive causal relation. In general, the historical ratio K = (Risk/Exposure) may not predict how changing future exposures will affect future risks, and hence it does not necessarily provide an appropriate guide to current risk management actions. We identify several limitations of the proposed framework, including omission of frequency and severity of human health harm in quantifying "Risk" and omission of microbial load from "Exposure." Finally, we show that an extended linear modeling approach that considers impacts of changing animal antibiotic use on susceptible as well as on resistant bacteria is consistent with the conclusion that reducing "Exposure" can greatly increase "Risk."
- Published
- 2005
- Full Text
- View/download PDF
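The article's closing point, that an extended model counting susceptible as well as resistant infections can reverse the policy implication of Risk = K x Exposure, can be illustrated with a toy calculation. All parameters below are invented:

```python
# Toy extended linear model (invented parameters): human health risk counts
# harm from both resistant and susceptible infections. Animal antibiotic use
# lowers total microbial load but raises the resistant fraction, so withdrawing
# it can raise total harm even though the descriptive ratio K suggests otherwise.

def total_risk(load, resistant_frac, harm_resistant=2.0, harm_susceptible=1.0):
    """Expected harm = harm from resistant + harm from susceptible bacteria."""
    return load * (resistant_frac * harm_resistant
                   + (1 - resistant_frac) * harm_susceptible)

with_use    = total_risk(load=1.0, resistant_frac=0.20)  # antibiotics control load
without_use = total_risk(load=1.6, resistant_frac=0.05)  # load rebounds, resistance falls
print(with_use, without_use)   # ~1.2 vs ~1.68: withdrawal increases total harm here
```

The historical ratio K = Risk/Exposure fitted to the "with use" period says nothing about the load rebound, which is why the descriptive ratio fails as a predictive causal relation.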
24. Some limitations of qualitative risk rating systems.
- Author
-
Cox LA Jr, Babayev D, and Huber W
- Subjects
- Animals, Animals, Domestic, Anti-Bacterial Agents therapeutic use, Humans, Probability, Anti-Bacterial Agents adverse effects, Food Contamination, Risk Assessment
- Abstract
Qualitative systems for rating animal antimicrobial risks using ordered categorical labels such as "high," "medium," and "low" can potentially simplify risk assessment input requirements used to inform risk management decisions. But do they improve decisions? This article compares the results of qualitative and quantitative risk assessment systems and establishes some theoretical limitations on the extent to which they are compatible. In general, qualitative risk rating systems satisfying conditions that have been proposed as reasonable, and that are found in real-world rating systems and guidance documents, make two types of errors: (1) reversed rankings, i.e., assigning higher qualitative risk ratings to situations that have lower quantitative risks; and (2) uninformative ratings, e.g., frequently assigning the most severe qualitative risk label (such as "high") to situations with arbitrarily small quantitative risks and assigning the same ratings to risks that differ by many orders of magnitude. Therefore, despite their appealing consensus-building properties, flexibility, and appearance of thoughtful process in input requirements, qualitative rating systems as currently proposed often do not provide sufficient information to discriminate accurately between quantitatively small and quantitatively large risks. The value of information (VOI) that they provide for improving risk management decisions can be zero if most risks are small but a few are large, since qualitative ratings may then be unable to confidently distinguish the large risks from the small. These limitations suggest that it is important to continue to develop and apply practical quantitative risk assessment methods, since qualitative ones are often unreliable.
- Published
- 2005
- Full Text
- View/download PDF
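A reversed ranking of the kind described in the abstract above is easy to exhibit. The binning thresholds below are invented, but the rule (bin probability and severity separately, then take the worse bin as the overall rating) is of the kind the article critiques:

```python
# Toy qualitative rating scheme showing a reversed ranking: a situation rated
# "high" can have far lower quantitative risk (expected loss) than one rated
# "medium". Thresholds are hypothetical.

LEVELS = ["low", "medium", "high"]

def bin_prob(p):
    return "high" if p > 0.5 else "medium" if p >= 0.1 else "low"

def bin_severity(s):   # severity in arbitrary loss units
    return "high" if s > 50 else "medium" if s >= 5 else "low"

def qualitative(p, s):
    # Overall rating = the worse of the two bins, a common guidance-document rule.
    return max(bin_prob(p), bin_severity(s), key=LEVELS.index)

risk_a = (0.9, 1)    # expected loss 0.9  -> rated "high"
risk_b = (0.4, 40)   # expected loss 16.0 -> rated "medium"
print(qualitative(*risk_a), qualitative(*risk_b))   # high medium
```

Risk B carries more than seventeen times the expected loss of risk A yet receives the milder label, which is exactly the reversed-ranking error (1) that makes such schemes unreliable guides for prioritization.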