191 results for "Rubin DB"
Search Results
52. Randomization to randomization probability: Estimating treatment effects under actual conditions of use.
- Author
- George BJ, Li P, Lieberman HR, Pavela G, Brown AW, Fontaine KR, Jeansonne MM, Dutton GR, Idigo AJ, Parman MA, Rubin DB, and Allison DB
- Subjects
- Adult, Affect drug effects, Arousal drug effects, Caffeine pharmacology, Central Nervous System Stimulants pharmacology, Humans, Biomedical Research methods, Outcome Assessment, Health Care methods, Random Allocation, Randomized Controlled Trials as Topic methods, Research Design
- Abstract
Blinded randomized controlled trials (RCTs) require participants to be uncertain if they are receiving a treatment or placebo. Although uncertainty is ideal for isolating the treatment effect from all other potential effects, it is poorly suited for estimating the treatment effect under actual conditions of intended use, when individuals are certain that they are receiving a treatment. We propose an experimental design, randomization to randomization probabilities (R2R), which significantly improves estimates of treatment effects under actual conditions of use by manipulating participant expectations about receiving treatment. In the R2R design, participants are first randomized to a value, π, denoting their probability of receiving treatment (vs. placebo). Subjects are then told their value of π and randomized to either treatment or placebo with probabilities π and 1-π, respectively. Analysis of the treatment effect includes statistical controls for π (necessary for causal inference) and typically a π-by-treatment interaction. Random assignment of subjects to π and disclosure of its value to subjects manipulates subject expectations about receiving the treatment without deception. This method offers a better treatment effect estimate under actual conditions of use than does a conventional RCT. Design properties, guidelines for power analyses, and limitations of the approach are discussed. We illustrate the design by implementing an RCT of caffeine effects on mood and vigilance and show that some of the actual effects of caffeine differ by the expectation that one is receiving the active drug. (PsycINFO Database Record (c) 2018 APA, all rights reserved.)
- Published
- 2018
- Full Text
- View/download PDF
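The two-stage R2R assignment described in the abstract above can be sketched in a few lines. This is a minimal illustration of the design, not the authors' implementation; the function name `r2r_assign` and the particular set of disclosed probabilities are assumptions made for the example.

```python
import random

def r2r_assign(subjects, pis=(0.2, 0.5, 0.8), rng=None):
    """Two-stage assignment for an R2R design.

    Stage 1 randomizes each subject to a treatment probability pi,
    which is then disclosed to the subject; stage 2 randomizes the
    subject to treatment (vs. placebo) with that probability pi.
    """
    rng = rng or random.Random()
    assignments = []
    for s in subjects:
        pi = rng.choice(pis)           # stage 1: randomize, then disclose, pi
        treated = rng.random() < pi    # stage 2: treatment with probability pi
        assignments.append({"subject": s, "pi": pi, "treated": treated})
    return assignments
```

The analysis phase would then regress the outcome on treatment, π, and (typically) their interaction, with π included as a statistical control as the abstract describes.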
53. The Dynamical Regime of Sensory Cortex: Stable Dynamics around a Single Stimulus-Tuned Attractor Account for Patterns of Noise Variability.
- Author
- Hennequin G, Ahmadian Y, Rubin DB, Lengyel M, and Miller KD
- Subjects
- Animals, Macaca, Neural Inhibition physiology, Neural Networks, Computer, Nonlinear Dynamics, Occipital Lobe cytology, Occipital Lobe physiology, Visual Cortex cytology, Neurons physiology, Visual Cortex physiology
- Abstract
Correlated variability in cortical activity is ubiquitously quenched following stimulus onset, in a stimulus-dependent manner. These modulations have been attributed to circuit dynamics involving either multiple stable states ("attractors") or chaotic activity. Here we show that a qualitatively different dynamical regime, involving fluctuations about a single, stimulus-driven attractor in a loosely balanced excitatory-inhibitory network (the stochastic "stabilized supralinear network"), best explains these modulations. Given the supralinear input/output functions of cortical neurons, increased stimulus drive strengthens effective network connectivity. This shifts the balance from interactions that amplify variability to suppressive inhibitory feedback, quenching correlated variability around more strongly driven steady states. Comparing to previously published and original data analyses, we show that this mechanism, unlike previous proposals, uniquely accounts for the spatial patterns and fast temporal dynamics of variability suppression. Specifying the cortical operating regime is key to understanding the computations underlying perception., (Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.)
- Published
- 2018
- Full Text
- View/download PDF
54. Cerebrovascular Accidents During Mechanical Circulatory Support: New Predictors of Ischemic and Hemorrhagic Strokes and Outcome.
- Author
- Izzy S, Rubin DB, Ahmed FS, Akbik F, Renault S, Sylvester KW, Vaitkevicius H, Smallwood JA, Givertz MM, and Feske SK
- Subjects
- Aged, Anticoagulants therapeutic use, Aspirin therapeutic use, Female, Humans, Incidence, International Normalized Ratio, Intracranial Hemorrhages epidemiology, Male, Middle Aged, Platelet Aggregation Inhibitors therapeutic use, Quality of Life, Retrospective Studies, Risk Factors, Warfarin therapeutic use, Brain Ischemia epidemiology, Cerebral Hemorrhage epidemiology, Heart Failure therapy, Heart-Assist Devices, Stroke epidemiology
- Abstract
Background and Purpose: Left ventricular assist devices (LVADs) have emerged as an effective treatment for patients with advanced heart failure refractory to medical therapy. Post-LVAD strokes are an important cause of morbidity and reduced quality of life. Data on risks that distinguish between ischemic and hemorrhagic post-LVAD strokes are limited. The aim of this study was to determine the incidence of post-LVAD ischemic and hemorrhagic strokes, their association with stroke risk factors, and their effect on mortality., Methods: Data are collected prospectively on all patients with LVADs implanted at Brigham and Women's Hospital. We added retrospectively collected clinical data for these analyses., Results: From 2007 to 2016, 183 patients (median age, 57; 80% male) underwent implantation of HeartMate II LVAD as a bridge to transplant (52%), destination therapy (39%), or bridge to transplant candidacy (8%). A total of 48 strokes occurred in 39 patients (21%): 28 acute ischemic strokes in 24 patients (13%) and 20 intracerebral hemorrhages in 19 patients (10.3%). First events occurred at a median of 238 days from implantation (interquartile range, 93-515) among those who developed post-LVAD stroke. All but 9 patients (4.9%) were on warfarin (goal international normalized ratio, 2-3.5) and all received aspirin (81-325 mg). Patients with chronic obstructive pulmonary disease were more likely to have an ischemic stroke (odds ratio, 2.96; 95% confidence interval, 1.14-7.70). Dialysis-dependent patients showed a trend toward a higher risk of hemorrhagic stroke (odds ratio, 6.31; 95% confidence interval, 0.99-40.47). Hemorrhagic stroke was associated with higher mortality (odds ratio, 3.92; 95% confidence interval, 1.34-11.45) than ischemic stroke (odds ratio, 3.17; 95% confidence interval, 1.13-8.85)., Conclusions: Stroke is a major cause of morbidity and mortality in patients on LVAD support. 
Chronic obstructive pulmonary disease increases the risk of ischemic stroke, whereas dialysis may increase the risk of hemorrhagic stroke. Although any stroke increases mortality, post-LVAD hemorrhagic stroke was associated with higher mortality compared with ischemic stroke., (© 2018 American Heart Association, Inc.)
- Published
- 2018
- Full Text
- View/download PDF
55. Autoimmune Neurologic Disorders.
- Author
- Rubin DB, Batra A, Vaitkevicius H, and Vodopivec I
- Subjects
- Autoimmune Diseases therapy, Humans, Nervous System Diseases therapy, Autoimmune Diseases diagnosis, Autoimmune Diseases etiology, Nervous System Diseases diagnosis, Nervous System Diseases etiology
- Abstract
The practice of autoimmune neurology focuses on the diagnosis and treatment of a wide spectrum of neurological conditions driven by abnormal immune responses directed against neural tissues. These include autoimmune, paraneoplastic, postinfectious, and iatrogenic conditions. Symptoms of autoimmune neurologic disorders can be diverse and often difficult to recognize in their early stages, complicating the diagnosis. This review discusses the classification and management of common autoimmune neurological conditions, placing an emphasis on the rapid identification of autoimmune etiology and mechanism of immune dysfunction to allow for the timely institution of appropriate treatment., (Copyright © 2018 Elsevier Inc. All rights reserved.)
- Published
- 2018
- Full Text
- View/download PDF
56. Autoimmune Encephalitis in Critical Care: Optimizing Immunosuppression.
- Author
- Rubin DB, Batra A, Vodopivec I, and Vaitkevicius H
- Subjects
- Autoimmunity immunology, Encephalitis diagnosis, Encephalitis physiopathology, Hashimoto Disease diagnosis, Hashimoto Disease physiopathology, Humans, Intensive Care Units, Critical Care methods, Encephalitis therapy, Hashimoto Disease therapy, Immunosuppression Therapy methods
- Abstract
Autoimmune diseases affecting the nervous system are a common cause of admission to the intensive care unit (ICU). Although there exist several well-described clinical syndromes, patients more commonly present with progressive neurologic dysfunction and laboratory and radiographic evidence of central nervous system (CNS) inflammation. In the critical care setting, the urgency to intervene to prevent permanent damage to the nervous system and secondary injury from the systemic manifestations of these syndromes often conflicts with diagnostic uncertainty. Furthermore, treatment is limited by current therapeutic agents that remain non-specific for individual diseases, especially for those whose pathophysiology remains unclear. Primary autoimmune, paraneoplastic, parainfectious, and iatrogenic neurologic disorders all share the common underlying pathophysiology of an adaptive immune response directed against an antigen within the nervous system. Several different mechanisms of immune dysfunction are responsible for pathogenesis within each of these categories of disease, and it is at this level of pathophysiology that the most effective and appropriate therapeutic decisions are made. In this review, we outline the basic diagnostic and therapeutic principles in the management of autoimmune diseases of the nervous system in the ICU. We approach these disorders not as lists of distinct clinical syndromes or molecular targets of autoimmunity but rather as clusters of syndromes based on these common underlying mechanisms of immune dysfunction. This approach emphasizes early intervention over precise diagnosis.
As our understanding of the immune system continues to grow, this framework will allow for a more sophisticated approach to the management of patients with these complex, often devastating but frequently reversible, neurologic illnesses., Competing Interests: Disclosure: The authors report no conflicts of interest in this work.
- Published
- 2017
- Full Text
- View/download PDF
57. Mepolizumab for Eosinophilic Chronic Obstructive Pulmonary Disease.
- Author
- Pavord ID, Chanez P, Criner GJ, Kerstjens HAM, Korn S, Lugogo N, Martinot JB, Sagara H, Albers FC, Bradford ES, Harris SS, Mayer B, Rubin DB, Yancey SW, and Sciurba FC
- Subjects
- Adult, Aged, Antibodies, Monoclonal, Humanized adverse effects, Antibodies, Monoclonal, Humanized immunology, Biomarkers blood, Dose-Response Relationship, Drug, Double-Blind Method, Drug Therapy, Combination, Female, Glucocorticoids therapeutic use, Humans, Injections, Subcutaneous, Intention to Treat Analysis, Leukocyte Count, Male, Middle Aged, Pulmonary Disease, Chronic Obstructive immunology, Antibodies, Monoclonal, Humanized therapeutic use, Eosinophils, Pulmonary Disease, Chronic Obstructive drug therapy
- Abstract
Background: Patients with chronic obstructive pulmonary disease (COPD) with an eosinophilic phenotype may benefit from treatment with mepolizumab, a monoclonal antibody directed against interleukin-5., Methods: We performed two phase 3, randomized, placebo-controlled, double-blind, parallel-group trials comparing mepolizumab (100 mg in METREX, 100 or 300 mg in METREO) with placebo, given as a subcutaneous injection every 4 weeks for 52 weeks in patients with COPD who had a history of moderate or severe exacerbations while taking inhaled glucocorticoid-based triple maintenance therapy. In METREX, unselected patients in the modified intention-to-treat population with an eosinophilic phenotype were stratified according to blood eosinophil count (≥150 per cubic millimeter at screening or ≥300 per cubic millimeter during the previous year). In METREO, all patients had a blood eosinophil count of at least 150 per cubic millimeter at screening or at least 300 per cubic millimeter during the previous year. The primary end point was the annual rate of moderate or severe exacerbations. Safety was also assessed., Results: In METREX, the mean annual rate of moderate or severe exacerbations in the modified intention-to-treat population with an eosinophilic phenotype (462 patients) was 1.40 per year in the mepolizumab group versus 1.71 per year in the placebo group (rate ratio, 0.82; 95% confidence interval [CI], 0.68 to 0.98; adjusted P=0.04); no significant between-group differences were found in the overall modified intention-to-treat population (836 patients) (rate ratio, 0.98; 95% CI, 0.85 to 1.12; adjusted P>0.99). In METREO, the mean annual rate of moderate or severe exacerbations was 1.19 per year in the 100-mg mepolizumab group, 1.27 per year in the 300-mg mepolizumab group, and 1.49 per year in the placebo group. 
The rate ratios for exacerbations in the 100-mg and 300-mg mepolizumab groups versus the placebo group were 0.80 (95% CI, 0.65 to 0.98; adjusted P=0.07) and 0.86 (95% CI, 0.70 to 1.05; adjusted P=0.14), respectively. A greater effect of mepolizumab, as compared with placebo, on the annual rate of moderate or severe exacerbations was found among patients with higher blood eosinophil counts at screening. The safety profile of mepolizumab was similar to that of placebo., Conclusions: Mepolizumab at a dose of 100 mg was associated with a lower annual rate of moderate or severe exacerbations than placebo among patients with COPD and an eosinophilic phenotype. This finding suggests that eosinophilic airway inflammation contributes to COPD exacerbations. (Funded by GlaxoSmithKline; METREX and METREO ClinicalTrials.gov numbers, NCT02105948 and NCT02105961.)
- Published
- 2017
- Full Text
- View/download PDF
58. Estimation of causal effects of binary treatments in unconfounded studies with one continuous covariate.
- Author
- Gutman R and Rubin DB
- Subjects
- Humans, Linear Models, Treatment Outcome, Controlled Clinical Trials as Topic methods
- Abstract
The estimation of causal effects in nonrandomized studies should comprise two distinct phases: design, with no outcome data available; and analysis of the outcome data according to a specified protocol. Here, we review and compare point and interval estimates of common statistical procedures for estimating causal effects (i.e. matching, subclassification, weighting, and model-based adjustment) with a scalar continuous covariate and a scalar continuous outcome. We show, using an extensive simulation, that some highly advocated methods have poor operating characteristics. In many conditions, matching for the point estimate combined with within-group matching for sampling variance estimation, with or without covariance adjustment, appears to be the most efficient valid method of those evaluated. These results provide new conclusions and advice regarding the merits of currently used procedures.
- Published
- 2017
- Full Text
- View/download PDF
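For the single-covariate setting studied in the abstract above, the matching point estimate can be sketched concretely. This is a generic illustration of nearest-neighbor matching with replacement on a scalar covariate, not the paper's full procedure (which also evaluates subclassification, weighting, covariance adjustment, and within-group matching for sampling variance estimation); the name `matched_effect` is invented for the example.

```python
def matched_effect(treated, control):
    """Nearest-neighbor matching (with replacement) on one covariate.

    Each treated unit is matched to the control unit with the closest
    covariate value; the causal-effect estimate is the mean of the
    matched outcome differences.

    treated, control: lists of (covariate, outcome) pairs.
    """
    diffs = []
    for x_t, y_t in treated:
        # closest control unit by covariate distance
        x_c, y_c = min(control, key=lambda u: abs(u[0] - x_t))
        diffs.append(y_t - y_c)
    return sum(diffs) / len(diffs)
```

For example, treated units at covariate values 1.0 and 2.0 are matched to the controls at 1.1 and 2.2, and the estimate is the average of the two matched differences.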
59. Anterior Temporal Lobectomy for Refractory Status Epilepticus in Herpes Simplex Encephalitis.
- Author
- Bick SK, Izzy S, Rubin DB, Zafar SF, Rosenthal ES, and Eskandar EN
- Subjects
- Drug Resistant Epilepsy etiology, Humans, Male, Middle Aged, Status Epilepticus etiology, Anterior Temporal Lobectomy methods, Drug Resistant Epilepsy surgery, Encephalitis, Herpes Simplex complications, Status Epilepticus surgery
- Abstract
Background: Herpes simplex virus (HSV) is a common cause of viral encephalitis that can lead to refractory seizures. The primary treatment of HSV encephalitis is with acyclovir; however, surgery sometimes plays a role in obtaining tissue diagnosis or decompression in cases with severe mass effect. We report a unique case in which anterior temporal lobectomy was successfully used to treat refractory status epilepticus in HSV encephalitis., Methods: Case report and review of the literature., Results: We report a case of a 60-year-old man with HSV encephalitis, who presented with seizures originating from the right temporal lobe refractory to maximal medical management. Right anterior temporal lobectomy was performed for the purpose of treatment of refractory status epilepticus and obtaining tissue diagnosis, with ultimate resolution of seizures and excellent functional outcome., Conclusions: We suggest that anterior temporal lobectomy should be considered in cases of HSV encephalitis with refractory status epilepticus with clear unilateral origin.
- Published
- 2016
- Full Text
- View/download PDF
60. Evaluations of the Optimal Discovery Procedure for Multiple Testing.
- Author
- Rubin DB
- Subjects
- Biostatistics methods, Data Interpretation, Statistical, Microarray Analysis methods, Models, Statistical, Research Design standards
- Abstract
The Optimal Discovery Procedure (ODP) is a method for simultaneous hypothesis testing that attempts to gain power relative to more standard techniques by exploiting multivariate structure [1]. Specializing to the example of testing whether components of a Gaussian mean vector are zero, we compare the power of the ODP to a Bonferroni-style method and to the Benjamini-Hochberg method when the testing procedures aim to respectively control certain Type I error rate measures, such as the expected number of false positives or the false discovery rate. We show through theoretical results, numerical comparisons, and two microarray examples that when the rejection regions for the ODP test statistics are chosen such that the procedure is guaranteed to uniformly control a Type I error rate measure, the technique is generally less powerful than competing methods. We contrast and explain these results in light of previously proven optimality theory for the ODP. We also compare the ordering given by the ODP test statistics to the standard rankings based on sorting univariate p-values from smallest to largest. In the cases we considered the standard ordering was superior, and ODP rankings were adversely impacted by correlation.
- Published
- 2016
- Full Text
- View/download PDF
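Of the competing procedures named in the abstract above, the Benjamini-Hochberg step-up method is easy to state concretely; the ODP itself, which requires estimating a multivariate model across tests, is not shown. A minimal sketch:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Step-up Benjamini-Hochberg procedure.

    Returns a boolean rejection decision for each p-value,
    controlling the false discovery rate at level alpha (under
    independence or positive regression dependence).
    """
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    # largest rank i such that p_(i) <= (i / m) * alpha
    k = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / m * alpha:
            k = rank
    reject = [False] * m
    for rank, idx in enumerate(order, start=1):
        if rank <= k:
            reject[idx] = True
    return reject
```

The ranking implicit here (sorting univariate p-values from smallest to largest) is the "standard ordering" the abstract compares against the ordering given by the ODP test statistics.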
61. Emerging Cases of Powassan Virus Encephalitis in New England: Clinical Presentation, Imaging, and Review of the Literature.
- Author
- Piantadosi A, Rubin DB, McQuillen DP, Hsu L, Lederer PA, Ashbaugh CD, Duffalo C, Duncan R, Thon J, Bhattacharyya S, Basgoz N, Feske SK, and Lyons JL
- Subjects
- Acyclovir therapeutic use, Adult, Aged, Aged, 80 and over, Animals, Antibodies, Viral cerebrospinal fluid, Antiviral Agents therapeutic use, Brain diagnostic imaging, Brain pathology, Brain virology, Encephalitis, Tick-Borne diagnosis, Encephalitis, Tick-Borne virology, Female, Humans, Ixodes virology, Magnetic Resonance Imaging, Male, Massachusetts epidemiology, Meningitis, Bacterial drug therapy, Middle Aged, New Hampshire epidemiology, Prevalence, Seasons, United States epidemiology, Young Adult, Encephalitis Viruses, Tick-Borne drug effects, Encephalitis Viruses, Tick-Borne immunology, Encephalitis Viruses, Tick-Borne pathogenicity, Encephalitis, Tick-Borne diagnostic imaging, Encephalitis, Tick-Borne epidemiology, Flavivirus drug effects, Flavivirus immunology, Flavivirus pathogenicity
- Abstract
Background: Powassan virus (POWV) is a rarely diagnosed cause of encephalitis in the United States. In the Northeast, it is transmitted by Ixodes scapularis, the same vector that transmits Lyme disease. The prevalence of POWV among animal hosts and vectors has been increasing. We present 8 cases of POWV encephalitis from Massachusetts and New Hampshire in 2013-2015., Methods: We abstracted clinical and epidemiological information for patients with POWV encephalitis diagnosed at 2 hospitals in Massachusetts from 2013 to 2015. We compared their brain imaging findings with published findings from Powassan and other viral encephalitides., Results: The patients ranged in age from 21 to 82 years, were, for the most part, previously healthy, and presented with syndromes of fever, headache, and altered consciousness. Infections occurred from May to September and were often associated with known tick exposures. In all patients, cerebrospinal fluid analyses showed pleocytosis with elevated protein. In 7 of 8 patients, brain magnetic resonance imaging demonstrated deep foci of increased T2/fluid-attenuation inversion recovery signal intensity., Conclusions: We describe 8 cases of POWV encephalitis in Massachusetts and New Hampshire in 2013-2015. Prior to this, there had been only 2 cases of POWV encephalitis identified in Massachusetts. These cases may represent emergence of this virus in a region where its vector, I. scapularis, is known to be prevalent or may represent the emerging diagnosis of an underappreciated pathogen. We recommend testing for POWV in patients who present with encephalitis in the spring to fall in New England., (© The Author 2015. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.)
- Published
- 2016
- Full Text
- View/download PDF
62. Fisher, Neyman, and Bayes at FDA.
- Author
- Rubin DB
- Subjects
- Humans, United States, Bayes Theorem, Statistics as Topic, United States Food and Drug Administration
- Abstract
The wise use of statistical ideas in practice essentially requires some Bayesian thinking, in contrast to the classical rigid frequentist dogma. This dogma too often has seemed to influence the applications of statistics, even at agencies like the FDA. Greg Campbell was one of the most important advocates there for more nuanced modes of thought, especially Bayesian statistics. Because two brilliant statisticians, Ronald Fisher and Jerzy Neyman, are often credited with instilling the traditional frequentist approach in current practice, I argue that both men were actually seeking very Bayesian answers, and neither would have endorsed the rigid application of their ideas.
- Published
- 2016
- Full Text
- View/download PDF
63. Estimation of causal effects of binary treatments in unconfounded studies.
- Author
- Gutman R and Rubin DB
- Subjects
- Observational Studies as Topic, Randomized Controlled Trials as Topic, Treatment Outcome, Models, Statistical, Research Design, Therapeutics
- Abstract
Estimation of causal effects in non-randomized studies comprises two distinct phases: design, without outcome data, and analysis of the outcome data according to a specified protocol. Recently, Gutman and Rubin (2013) proposed a new analysis-phase method for estimating treatment effects when the outcome is binary and there is only one covariate, which viewed causal effect estimation explicitly as a missing data problem. Here, we extend this method to situations with continuous outcomes and multiple covariates and compare it with other commonly used methods (such as matching, subclassification, weighting, and covariance adjustment). We show, using an extensive simulation, that of all methods considered, and in many of the experimental conditions examined, our new 'multiple-imputation using two subclassification splines' method appears to be the most efficient and has coverage levels that are closest to nominal. In addition, it can estimate finite population average causal effects as well as non-linear causal estimands. This type of analysis also allows the identification of subgroups of units for which the effect appears to be especially beneficial or harmful., (Copyright © 2015 John Wiley & Sons, Ltd.)
- Published
- 2015
- Full Text
- View/download PDF
64. Valid randomization-based p-values for partially post hoc subgroup analyses.
- Author
- Lee JJ and Rubin DB
- Subjects
- Biometry, Computer Simulation, Equipment and Supplies, Gels therapeutic use, Humans, Osteoarthritis, Knee drug therapy, United States, United States Food and Drug Administration, Patient Selection, Randomized Controlled Trials as Topic methods, Research Design
- Abstract
By 'partially post hoc' subgroup analyses, we mean analyses that compare existing data from a randomized experiment (from which a subgroup specification is derived) to new, subgroup-only experimental data. We describe a motivating example in which partially post hoc subgroup analyses instigated statistical debate about a medical device's efficacy. We clarify the source of such analyses' invalidity and then propose a randomization-based approach for generating valid posterior predictive p-values for such partially post hoc subgroups. Lastly, we investigate the approach's operating characteristics in a simple illustrative setting through a series of simulations, showing that it can have desirable properties under both null and alternative hypotheses., (Copyright © 2015 John Wiley & Sons, Ltd.)
- Published
- 2015
- Full Text
- View/download PDF
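The machinery underlying randomization-based p-values is the Fisher randomization test. The sketch below shows only that generic building block (a Monte Carlo randomization p-value for a difference in means), not the paper's posterior predictive extension to partially post hoc subgroups; `randomization_pvalue` is a name chosen for the example.

```python
import random
from statistics import mean

def randomization_pvalue(treated, control, n_perm=10000, rng=None):
    """Two-sided Fisher randomization p-value for a difference in
    means, approximated by Monte Carlo over re-randomizations of
    the pooled outcomes."""
    rng = rng or random.Random()
    pooled = list(treated) + list(control)
    n_t = len(treated)
    observed = abs(mean(treated) - mean(control))
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # one hypothetical re-randomization
        stat = abs(mean(pooled[:n_t]) - mean(pooled[n_t:]))
        if stat >= observed:
            extreme += 1
    # add-one correction so the estimated p-value is never exactly 0
    return (extreme + 1) / (n_perm + 1)
```

Under the sharp null of no treatment effect, every re-randomization of the pooled outcomes is equally likely, so the proportion of re-randomizations with a statistic at least as extreme as the observed one is a valid p-value.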
65. Ischemic Optic Neuropathies.
- Author
- Rubin DB
- Subjects
- Humans, Optic Nerve blood supply, Optic Neuropathy, Ischemic
- Published
- 2015
- Full Text
- View/download PDF
66. Individual privacy versus public good: protecting confidentiality in health research.
- Author
- O'Keefe CM and Rubin DB
- Subjects
- Australia, Biomedical Research methods, Biomedical Research statistics & numerical data, Computer Security legislation & jurisprudence, Computer Security standards, Computer Security statistics & numerical data, Confidentiality standards, European Union, Evidence-Based Medicine methods, Evidence-Based Medicine statistics & numerical data, Health Insurance Portability and Accountability Act, Humans, United States, Biomedical Research legislation & jurisprudence, Confidentiality legislation & jurisprudence, Data Interpretation, Statistical, Evidence-Based Medicine legislation & jurisprudence, Health Policy legislation & jurisprudence
- Abstract
Health and medical data are increasingly being generated, collected, and stored in electronic form in healthcare facilities and administrative agencies. Such data hold a wealth of information vital to effective health policy development and evaluation, as well as to enhanced clinical care through evidence-based practice and safety and quality monitoring. These initiatives are aimed at improving individuals' health and well-being. Nevertheless, analyses of health data archives must be conducted in such a way that individuals' privacy is not compromised. One important aspect of protecting individuals' privacy is protecting the confidentiality of their data. It is the purpose of this paper to provide a review of a number of approaches to reducing disclosure risk when making data available for research, and to present a taxonomy for such approaches. Some of these methods are widely used, whereas others are still in development. It is important to have a range of methods available because there is also a range of data-use scenarios, and it is important to be able to choose between methods suited to differing scenarios. In practice, it is necessary to find a balance between allowing the use of health and medical data for research and protecting confidentiality. This balance is often presented as a trade-off between disclosure risk and data utility, because methods that reduce disclosure risk, in general, also reduce data utility., (Copyright © 2015 John Wiley & Sons, Ltd.)
- Published
- 2015
- Full Text
- View/download PDF
67. Inhaled β-agonist does not modify sympathetic activity in patients with COPD.
- Author
- Haarmann H, Mohrlang C, Tschiesner U, Rubin DB, Bornemann T, Rüter K, Bonev S, Raupach T, Hasenfuß G, and Andreas S
- Subjects
- Administration, Inhalation, Adrenergic beta-Agonists therapeutic use, Aged, Baroreflex drug effects, Blood Pressure drug effects, Brain-Derived Neurotrophic Factor blood, Brain-Derived Neurotrophic Factor drug effects, Catecholamines blood, Epinephrine blood, Female, Heart Rate drug effects, Humans, Lung drug effects, Male, Middle Aged, Norepinephrine blood, Peroneal Nerve drug effects, Pulse Wave Analysis, Respiratory Function Tests, Respiratory Rate drug effects, Salmeterol Xinafoate therapeutic use, Single-Blind Method, Adrenergic beta-Agonists pharmacology, Pulmonary Disease, Chronic Obstructive drug therapy, Salmeterol Xinafoate pharmacology, Sympathetic Nervous System drug effects
- Abstract
Background: Neurohumoral activation is present in COPD and might provide a link between pulmonary and systemic effects, especially cardiovascular disease. Because long acting inhaled β-agonists reduce hyperinflation, they could reduce sympathoexcitation by improving the inflation reflex. We aimed to evaluate if inhaled therapy with salmeterol reduces muscle sympathetic nerve activity (MSNA) evaluated by microneurography., Methods: MSNA, heart rate, blood pressure, and respiration were continually measured. After baseline recording of 20 minutes, placebo was administered; after further 45 minutes salmeterol (50 μg) was administered which was followed by a further 45 minutes of data recording. Additionally, lung function, plasma catecholamine levels, arterial pulse wave velocity, heart rate variability, and baroreflex sensitivity were evaluated. Following 4 weeks of treatment with salmeterol 50 μg twice daily, measurements were repeated without placebo administration., Results: A total of 32 COPD patients were included. Valid MSNA signals were obtained from 18 patients. Change in MSNA (bursts/100 heart beats) following acute administration of salmeterol did not differ significantly from the change following placebo (-1.96 ± 9.81 vs. -0.65 ± 9.07; p = 0.51) although hyperinflation was significantly reduced. Likewise, no changes in MSNA or catecholamines were observed after 4 weeks. Heart rate increased significantly by 3.8 ± 4.2 (p < 0.01) acutely and 3.9 ± 4.3 bpm (p < 0.01) after 4 weeks. Salmeterol treatment was safe and well tolerated., Conclusions: By using microneurography as a gold standard to evaluate sympathetic activity we found no change in MSNA following salmeterol inhalation. Thus, despite an attenuation of hyperinflation, the long acting β-agonist salmeterol does not appear to reduce nor incite sympathoexcitation., Trial Registration: This study was registered with the European Clinical Trials Database (EudraCT No. 
2011-001581-18) and ClinicalTrials.gov (NCT01536587).
- Published
- 2015
- Full Text
- View/download PDF
68. Tax-exempt hospitals and community benefit: new directions in policy and practice.
- Author
- Rubin DB, Singh SR, and Young GJ
- Subjects
- Cost-Benefit Analysis, Humans, United States, Community-Institutional Relations economics, Health Policy, Hospitals, Voluntary economics, Hospitals, Voluntary organization & administration, Tax Exemption economics, Tax Exemption legislation & jurisprudence
- Abstract
The current community benefit standard for nonprofit hospital tax exemption has been the subject of mounting criticism. Many different constituencies have advanced the view that in its present form it fails to ensure that nonprofit hospitals provide adequate benefits to their communities in exchange for their tax exemption. In contrast, hospitals have often expressed the concern that the community benefit standard in its current form is vague and therefore difficult to comply with. Various suggestions have been made regarding how the existing community benefit standard could be improved or even replaced. In this article, we first discuss the historical and legal development of the community benefit standard. We then present the key controversies that have emerged in recent years and the policy responses attempted thus far. Finally, we evaluate possible future policy directions, which reform efforts could follow.
- Published
- 2015
- Full Text
- View/download PDF
69. The stabilized supralinear network: a unifying circuit motif underlying multi-input integration in sensory cortex.
- Author
-
Rubin DB, Van Hooser SD, and Miller KD
- Subjects
- Animals, Ferrets, Linear Models, Models, Neurological, Neural Inhibition physiology, Nonlinear Dynamics, Photic Stimulation, Auditory Cortex physiology, Feedback, Physiological physiology, Neurons physiology, Olfactory Cortex physiology, Somatosensory Cortex physiology, Visual Cortex physiology
- Abstract
Neurons in sensory cortex integrate multiple influences to parse objects and support perception. Across multiple cortical areas, integration is characterized by two neuronal response properties: (1) surround suppression--modulatory contextual stimuli suppress responses to driving stimuli; and (2) "normalization"--responses to multiple driving stimuli add sublinearly. These depend on input strength: for weak driving stimuli, contextual influences facilitate or more weakly suppress and summation becomes linear or supralinear. Understanding the circuit operations underlying integration is critical to understanding cortical function and disease. We present a simple, general theory. A wealth of integrative properties, including the above, emerge robustly from four cortical circuit properties: (1) supralinear neuronal input/output functions; (2) sufficiently strong recurrent excitation; (3) feedback inhibition; and (4) simple spatial properties of intracortical connections. Integrative properties emerge dynamically as circuit properties, with excitatory and inhibitory neurons showing similar behaviors. In new recordings in visual cortex, we confirm key model predictions., (Copyright © 2015 Elsevier Inc. All rights reserved.)
- Published
- 2015
- Full Text
- View/download PDF
70. Rerandomization to Balance Tiers of Covariates.
- Author
-
Morgan KL and Rubin DB
- Abstract
When conducting a randomized experiment, if an allocation yields treatment groups that differ meaningfully with respect to relevant covariates, groups should be rerandomized. The process involves specifying an explicit criterion for whether an allocation is acceptable, based on a measure of covariate balance, and rerandomizing units until an acceptable allocation is obtained. Here we illustrate how rerandomization could have improved the design of an already conducted randomized experiment on vocabulary and mathematics training programs, then provide a rerandomization procedure for covariates that vary in importance, and finally offer other extensions for rerandomization, including methods addressing computational efficiency. When covariates vary in a priori importance, better balance should be required for more important covariates. Rerandomization based on Mahalanobis distance preserves the joint distribution of covariates, but balances all covariates equally. Here we propose rerandomizing based on Mahalanobis distance within tiers of covariate importance. Because balancing covariates in one tier will in general also partially balance covariates in other tiers, for each subsequent tier we explicitly balance only the components orthogonal to covariates in more important tiers.
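The acceptance-rule mechanics described in this abstract can be sketched in a few lines. This is a deliberately simplified illustration, not the authors' procedure: it uses one covariate per tier and an absolute standardized mean difference in place of Mahalanobis distance with orthogonalized components, and the thresholds and sample below are invented.

```python
import random
import statistics

def smd(cov, assign):
    """Absolute standardized mean difference of one covariate between arms."""
    t = [x for x, a in zip(cov, assign) if a == 1]
    c = [x for x, a in zip(cov, assign) if a == 0]
    return abs(statistics.mean(t) - statistics.mean(c)) / statistics.pstdev(cov)

def rerandomize(tiers, thresholds, n, seed=0):
    """Redraw a complete randomization (n//2 treated) until the covariate in
    every tier meets that tier's balance threshold; more important tiers
    get stricter thresholds."""
    rng = random.Random(seed)
    base = [1] * (n // 2) + [0] * (n - n // 2)
    while True:
        assign = list(base)
        rng.shuffle(assign)
        if all(smd(cov, assign) <= thr for cov, thr in zip(tiers, thresholds)):
            return assign

rng = random.Random(42)
n = 40
important = [rng.gauss(0, 1) for _ in range(n)]  # tier 1: demands tight balance
secondary = [rng.gauss(0, 1) for _ in range(n)]  # tier 2: looser balance is fine
assign = rerandomize([important, secondary], [0.1, 0.3], n)
```

Tightening the tier-1 threshold relative to tier 2 enforces the paper's principle that better balance should be required for more important covariates.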
- Published
- 2015
- Full Text
- View/download PDF
71. Long-acting bronchodilators and arterial stiffness in patients with COPD: a comparison of fluticasone furoate/vilanterol with tiotropium.
- Author
-
Pepin JL, Cockcroft JR, Midwinter D, Sharma S, Rubin DB, and Andreas S
- Subjects
- Administration, Inhalation, Aged, Androstadienes adverse effects, Benzyl Alcohols adverse effects, Bronchodilator Agents administration & dosage, Bronchodilator Agents adverse effects, Chlorobenzenes adverse effects, Delayed-Action Preparations administration & dosage, Delayed-Action Preparations adverse effects, Dose-Response Relationship, Drug, Double-Blind Method, Drug Administration Schedule, Female, Follow-Up Studies, Humans, Male, Middle Aged, Pulse Wave Analysis methods, Respiratory Function Tests, Scopolamine Derivatives adverse effects, Severity of Illness Index, Tiotropium Bromide, Treatment Outcome, Androstadienes administration & dosage, Benzyl Alcohols administration & dosage, Chlorobenzenes administration & dosage, Pulmonary Disease, Chronic Obstructive diagnosis, Pulmonary Disease, Chronic Obstructive drug therapy, Scopolamine Derivatives administration & dosage, Vascular Stiffness drug effects
- Abstract
Background: Increased arterial stiffness as measured by aortic pulse wave velocity (aPWV) predicts cardiovascular events and mortality and is elevated in patients with COPD. Prior investigation suggests that a long-acting β-agonist (LABA)/inhaled corticosteroid (ICS) lowers aPWV in patients with baseline aPWV ≥ 11 m/s. This study compared the effect of the ICS/LABA fluticasone furoate/vilanterol (FF/VI), 100/25 μg, delivered via the ELLIPTA dry powder inhaler, with tiotropium bromide (TIO), 18 μg, on aPWV., Methods: This multicenter, randomized, blinded, double-dummy, parallel-group, 12-week study compared FF/VI and TIO, both administered once daily. The primary end point was aPWV change from baseline at 12 weeks. Safety end points included adverse events (AEs), vital signs, and clinical laboratory tests., Results: Two hundred fifty-seven patients with COPD and aPWV ≥ 11 m/s were randomized; 87% had prior cardiovascular events and/or risk. The mean difference in aPWV between FF/VI and TIO at week 12 was not significant (P = .484). Because the study did not contain a placebo arm, a post hoc analysis was performed; it suggested that both treatments lowered aPWV by approximately 1 m/s from baseline. The proportion of patients reporting AEs was similar with FF/VI (24%) and TIO (18%). There were no changes of clinical concern in vital signs or clinical laboratory tests., Conclusions: No differences in aPWV were observed between FF/VI and TIO. However, further studies with a placebo arm are required to establish definitively whether long-acting bronchodilators lower aPWV. Both treatments demonstrated an acceptable tolerability profile., Trial Registry: ClinicalTrials.gov; No.: NCT01395888; URL: www.clinicaltrials.gov.
- Published
- 2014
- Full Text
- View/download PDF
72. Sensitivity analysis for a partially missing binary outcome in a two-arm randomized clinical trial.
- Author
-
Liublinska V and Rubin DB
- Subjects
- Computer Simulation, Fractures, Compression surgery, Humans, Kyphoplasty adverse effects, Kyphoplasty standards, Pain prevention & control, Spinal Fractures surgery, United States, Data Interpretation, Statistical, Models, Statistical, Randomized Controlled Trials as Topic methods
- Abstract
Although recent guidelines for dealing with missing data emphasize the need for sensitivity analyses, and such analyses have a long history in statistics, universal recommendations for conducting and displaying these analyses are scarce. We propose graphical displays that help formalize and visualize the results of sensitivity analyses, building upon the idea of 'tipping-point' analysis for randomized experiments with a binary outcome and a dichotomous treatment. The resulting 'enhanced tipping-point displays' are convenient summaries of conclusions obtained from making different modeling assumptions about missingness mechanisms. The primary goal of the displays is to make formal sensitivity analyses more comprehensible to practitioners, thereby helping them assess the robustness of the experiment's conclusions to plausible missingness mechanisms. We also present a recent example of these enhanced displays in a medical device clinical trial that helped lead to FDA approval., (Copyright © 2014 John Wiley & Sons, Ltd.)
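A tipping-point analysis of the kind underlying these displays can be sketched as follows. The counts are hypothetical, and a pooled two-proportion z-test stands in for whatever primary analysis an actual trial would use.

```python
from math import erf, sqrt

def two_prop_p(s1, n1, s0, n0):
    """Two-sided p-value for H0: p1 == p0, using a pooled z-test."""
    p1, p0 = s1 / n1, s0 / n0
    p = (s1 + s0) / (n1 + n0)
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n0))
    if se == 0:
        return 1.0
    z = (p1 - p0) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical observed data: successes / observed, plus missing outcomes.
s_t, n_t, miss_t = 30, 40, 5   # treatment arm
s_c, n_c, miss_c = 18, 40, 5   # control arm

# Impute j successes among the treatment arm's missing outcomes and k among
# the control arm's, and record whether each cell stays significant.
grid = {}
for j in range(miss_t + 1):
    for k in range(miss_c + 1):
        p = two_prop_p(s_t + j, n_t + miss_t, s_c + k, n_c + miss_c)
        grid[(j, k)] = p < 0.05
```

Plotting `grid` as a shaded j-by-k lattice gives the basic tipping-point display: the boundary between significant and non-significant cells marks the imputed-outcome assumptions under which the trial's conclusion would 'tip'.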
- Published
- 2014
- Full Text
- View/download PDF
73. A hierarchical finite mixture model that accommodates zero-inflated counts, non-independence, and heterogeneity.
- Author
-
Morgan CJ, Lenzenweger MF, Rubin DB, and Levy DL
- Subjects
- Adult, Endophenotypes, Humans, Middle Aged, Odds Ratio, Schizophrenia genetics, Datasets as Topic statistics & numerical data, Models, Statistical, Poisson Distribution
- Abstract
A number of mixture modeling approaches assume both normality and independent observations. However, these two assumptions are at odds with the reality of many data sets, which are often characterized by an abundance of zero-valued or highly skewed observations as well as observations from biologically related (i.e., non-independent) subjects. We present here a finite mixture model with a zero-inflated Poisson regression component that may be applied to both types of data. This flexible approach allows the use of covariates to model both the Poisson mean and rate of zero inflation and can incorporate random effects to accommodate non-independent observations. We demonstrate the utility of this approach by applying these models to a candidate endophenotype for schizophrenia, but the same methods are applicable to other types of data characterized by zero inflation and non-independence., (Copyright © 2014 John Wiley & Sons, Ltd.)
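The zero-inflated Poisson distribution at the heart of this model mixes a point mass at zero with a Poisson count. A minimal sketch of its probability mass function follows; the full model would place covariates on log λ and logit π and add random effects, which this sketch omits.

```python
from math import exp, factorial

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson: a structural zero occurs with probability pi;
    otherwise the count is Poisson(lam), which can itself be zero."""
    poisson = exp(-lam) * lam ** k / factorial(k)
    return pi + (1 - pi) * poisson if k == 0 else (1 - pi) * poisson

lam, pi = 2.0, 0.3
p0 = zip_pmf(0, lam, pi)                              # inflated zero mass
total = sum(zip_pmf(k, lam, pi) for k in range(50))   # ~1 (proper pmf)
```

The excess-zero mass (here 0.3 on top of the Poisson zero probability) is what the regression component models on the logit scale, with the Poisson mean modeled on the log scale.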
- Published
- 2014
- Full Text
- View/download PDF
74. Analysis of the stabilized supralinear network.
- Author
-
Ahmadian Y, Rubin DB, and Miller KD
- Subjects
- Feedback, Physiological, Humans, Neural Inhibition physiology, Nonlinear Dynamics, Visual Cortex physiology, Models, Neurological, Nerve Net physiology, Neural Networks, Computer, Neurons physiology, Visual Cortex cytology
- Abstract
We study a rate-model neural network composed of excitatory and inhibitory neurons in which neuronal input-output functions are power laws with a power greater than 1, as observed in primary visual cortex. This supralinear input-output function leads to supralinear summation of network responses to multiple inputs for weak inputs. We show that for stronger inputs, which would drive the excitatory subnetwork to instability, the network will dynamically stabilize provided feedback inhibition is sufficiently strong. For a wide range of network and stimulus parameters, this dynamic stabilization yields a transition from supralinear to sublinear summation of network responses to multiple inputs. We compare this to the dynamic stabilization in the balanced network, which yields only linear behavior. We more exhaustively analyze the two-dimensional case of one excitatory and one inhibitory population. We show that in this case, dynamic stabilization will occur whenever the determinant of the weight matrix is positive and the inhibitory time constant is sufficiently small, and analyze the conditions for supersaturation, or decrease of firing rates with increasing stimulus contrast (which represents increasing input firing rates). In work to be presented elsewhere, we have found that this transition from supralinear to sublinear summation can explain a wide variety of nonlinearities in cerebral cortical processing.
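A minimal simulation of the two-population case illustrates the transition the abstract describes. The parameter values here are illustrative choices satisfying the stated conditions (power greater than 1, positive determinant of the weight matrix, faster inhibition), not values taken from the paper.

```python
def ssn_rates(h, k=0.04, n=2.0, w_ee=1.25, w_ei=-0.65, w_ie=1.2, w_ii=-0.5,
              tau_e=20.0, tau_i=10.0, dt=0.1, steps=20000):
    """Euler-integrate a two-population (E, I) rate model whose neuronal
    input/output function is the supralinear power law r = k * [input]_+^n."""
    r_e = r_i = 0.0
    for _ in range(steps):
        in_e = w_ee * r_e + w_ei * r_i + h
        in_i = w_ie * r_e + w_ii * r_i + h
        r_e += dt / tau_e * (-r_e + k * max(in_e, 0.0) ** n)
        r_i += dt / tau_i * (-r_i + k * max(in_i, 0.0) ** n)
    return r_e, r_i

# Weak inputs sum supralinearly; strong inputs are dynamically stabilized
# by feedback inhibition and sum sublinearly.
weak = ssn_rates(1.0)[0]
weak2 = ssn_rates(2.0)[0]
strong = ssn_rates(20.0)[0]
strong2 = ssn_rates(40.0)[0]
```

With these settings, doubling a weak input more than doubles the excitatory response (supralinear summation), while doubling a strong input less than doubles it (sublinear summation), even though the excitatory subnetwork would be unstable on its own at strong inputs.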
- Published
- 2013
- Full Text
- View/download PDF
75. Robust estimation of causal effects of binary treatments in unconfounded studies with dichotomous outcomes.
- Author
-
Gutman R and Rubin DB
- Subjects
- Computer Simulation, Humans, Data Interpretation, Statistical, Models, Statistical, Treatment Outcome
- Abstract
The estimation of causal effects has been the subject of extensive research. In unconfounded studies with a dichotomous outcome, Y, Cangul, Chretien, Gutman and Rubin (2009) demonstrated that logistic regression for a scalar continuous covariate X is generally statistically invalid for testing null treatment effects when the distributions of X in the treated and control populations differ and the logistic model for Y given X is misspecified. In addition, they showed that an approximately valid statistical test can be generally obtained by discretizing X followed by regression adjustment within each interval defined by the discretized X. This paper extends the work of Cangul et al. 2009 in three major directions. First, we consider additional estimation procedures, including a new one that is based on two independent splines and multiple imputation; second, we consider additional distributional factors; and third, we examine the performance of the procedures when the treatment effect is non-null. Of all the methods considered and in most of the experimental conditions that were examined, our proposed new methodology appears to work best in terms of point and interval estimation., (Copyright © 2012 John Wiley & Sons, Ltd.)
- Published
- 2013
- Full Text
- View/download PDF
76. A model of electrophysiological heterogeneity in periglomerular cells.
- Author
-
Sethupathy P, Rubin DB, Li G, and Cleland TA
- Abstract
Olfactory bulb (OB) periglomerular (PG) cells are heterogeneous with respect to several features, including morphology, connectivity, patterns of protein expression, and electrophysiological properties. However, these features rarely correlate with one another, suggesting that the differentiating properties of PG cells may arise from multiple independent adaptive variables rather than representing discrete cell classes. We use computational modeling to assess this hypothesis with respect to electrophysiological properties. Specifically, we show that the heterogeneous electrophysiological properties demonstrated in PG cell recordings can be explained solely by differences in the relative expression levels of ion channel species in the cell, without recourse to modifying channel kinetic properties themselves. This PG cell model can therefore be used as the basis for diverse cellular and network-level analyses of OB computations. Moreover, this simple basis for heterogeneity contributes to an emerging hypothesis that glomerular-layer interneurons may be better described as a single population expressing distributions of partially independent, potentially plastic properties, rather than as a set of discrete cell classes.
- Published
- 2013
- Full Text
- View/download PDF
77. Which patients with chronic obstructive pulmonary disease benefit from the addition of an inhaled corticosteroid to their bronchodilator? A cluster analysis.
- Author
-
Disantostefano RL, Li H, Rubin DB, and Stempel DA
- Abstract
Objective: To identify subsets of chronic obstructive pulmonary disease (COPD) patients who are more protected from exacerbations with the use of an inhaled corticosteroid/long-acting β2 agonist (ICS/LABA) combination, compared with the use of LABA monotherapy., Design: Post hoc cluster analysis of patients from two randomised clinical trials of salmeterol/fluticasone propionate (SFC) and salmeterol (SAL) that had primary endpoints of moderate/severe exacerbation rates., Setting: Centres in North America., Participants: 1543 COPD patients were studied., Interventions: SFC 50/250 µg or SAL 50 µg, twice daily., Primary and Secondary Outcome Measures: The analysis identified clusters of COPD patients more responsive to SFC versus SAL with respect to the annual rate of moderate/severe exacerbations and compared their baseline clinical characteristics., Results: Overall, SFC significantly reduced the annual rate of moderate/severe exacerbations as compared with SAL alone (rate ratio (RR)=0.701, p<0.001). Three patient clusters were identified: patients receiving diuretics (RR=0.56, p<0.001) and patients not receiving diuretics but with forced expiratory volume in 1 s (FEV1) reversibility ≥12% (RR=0.67, p<0.001) exhibited substantial reductions in the annual rate of moderate/severe exacerbations relative to SAL; a third cluster, consisting of patients not receiving diuretics and without FEV1 reversibility, demonstrated no difference for SFC versus SAL. Patients receiving diuretics had a significantly higher prevalence of comorbid cardiovascular disease., Conclusions: COPD patients receiving diuretics and those not receiving diuretics but with FEV1 reversibility ≥12% at baseline were significantly more likely to experience a reduction in COPD-associated exacerbations with SFC versus SAL alone., Trial Registration: NCT00115492, NCT00144911.
- Published
- 2013
- Full Text
- View/download PDF
78. Evaluating hospitals' provision of community benefit: an argument for an outcome-based approach to nonprofit hospital tax exemption.
- Author
-
Rubin DB, Singh SR, and Jacobson PD
- Subjects
- Community-Institutional Relations, Health Policy, Humans, Outcome Assessment, Health Care, Uncompensated Care, United States, Hospitals, Voluntary economics, Organizations, Nonprofit economics, Tax Exemption economics, Taxes economics
- Abstract
Nonprofit hospitals are exempt from federal income taxation if they pass organizational and operational tests, including satisfying the community-benefit standard. Policymakers, however, have questioned the adequacy of the community benefits that nonprofit hospitals provide in exchange for these exemptions. The Internal Revenue Service recently responded to these concerns by redesigning its tax forms for nonprofit hospitals. The new Form 990 Schedule H requires nonprofit hospitals to provide additional information about their community-benefit activities. This new reporting requirement, however, places an undue focus on input-based community-benefit indicators, in particular expenditures. We argue that expanding the current input-based reporting requirement to include not only monetary inputs but also population health outcomes would achieve greater benefit for society.
- Published
- 2013
- Full Text
- View/download PDF
79. When trade law meets public health evidence: the World Trade Organization and clove cigarettes.
- Author
-
Jarman H, Schmidt J, and Rubin DB
- Subjects
- Humans, Indonesia, International Agencies, International Cooperation legislation & jurisprudence, Public Health economics, United States, Commerce legislation & jurisprudence, Public Health legislation & jurisprudence, Syzygium chemistry, Tobacco Products economics
- Abstract
A recent trade dispute between the USA and Indonesia, overseen by the World Trade Organization, challenges piecemeal approaches to tobacco regulation.
- Published
- 2012
- Full Text
- View/download PDF
80. Potential updates to Cornfield's 1959 'principles of research'.
- Author
-
Rubin DB
- Subjects
- Biomedical Research
- Published
- 2012
- Full Text
- View/download PDF
81. Analyses that inform policy decisions.
- Author
-
Gutman R and Rubin DB
- Subjects
- Humans, Bayes Theorem, Uncertainty
- Published
- 2012
- Full Text
- View/download PDF
82. Re: "dealing with missing outcome data in randomized trials and observational studies".
- Author
-
Liublinska V and Rubin DB
- Subjects
- Humans, Models, Statistical, Outcome Assessment, Health Care, Randomized Controlled Trials as Topic
- Published
- 2012
- Full Text
- View/download PDF
83. Statistical issues and limitations in personalized medicine research with clinical trials.
- Author
-
Rubin DB and van der Laan MJ
- Subjects
- Anti-Bacterial Agents therapeutic use, Bacterial Infections drug therapy, Confidence Intervals, Humans, Skin Diseases drug therapy, United States, United States Food and Drug Administration, Biomedical Research, Data Interpretation, Statistical, Precision Medicine, Randomized Controlled Trials as Topic
- Abstract
We discuss using clinical trial data to construct and evaluate rules that use baseline covariates to assign different treatments to different patients. Given such a candidate personalization rule, we first note that its performance can often be evaluated without actually applying the rule to subjects, and a class of estimators is characterized from a statistical efficiency standpoint. We also point out a recently noted reduction of the rule construction problem to a classification task and extend results in this direction. Together these facts suggest a natural form of cross-validation in which a personalized medicine rule can be constructed from clinical trial data using standard classification tools and then evaluated in a replicated trial. Because replication is often required by the FDA to provide evidence of safety and efficacy before pharmaceutical drugs can be marketed, there are abundant data with which to explore the potential benefits of more tailored therapy. We constructed and evaluated personalized medicine rules using simulations based on two active-controlled randomized clinical trials of antibacterial drugs for the treatment of skin and skin structure infections. Unfortunately we present negative results that did not suggest benefit from personalization. We discuss the implications of this finding and why statistical approaches to personalized medicine problems will often face difficult challenges.
- Published
- 2012
- Full Text
- View/download PDF
84. A calibrated multiclass extension of AdaBoost.
- Author
-
Rubin DB
- Subjects
- Computer Simulation, Databases, Factual, Humans, Predictive Value of Tests, Sensitivity and Specificity, Algorithms, Computational Biology methods, Data Mining methods
- Abstract
AdaBoost is a popular and successful data mining technique for binary classification. However, there is no universally agreed upon extension of the method for problems with more than two classes. Most multiclass generalizations simply reduce the problem to a series of binary classification problems. The statistical interpretation of AdaBoost is that it operates through loss-based estimation: by using an exponential loss function as a surrogate for misclassification loss, it sequentially minimizes empirical risk through fitting a base classifier to iteratively reweighted training data. While there are several extensions using loss-based estimation with multiclass base classifiers, these use multiclass versions of the exponential loss that are not classification calibrated: unless restrictions are placed on conditional class probabilities, it becomes possible to have optimal surrogate risk but poor misclassification risk. In this work, we introduce a new AdaBoost extension called AdaBoost.SL that does not reduce the problem into binary subproblems and that uses a classification-calibrated multiclass exponential loss function. Numerical experiments show the algorithm performs well on benchmark datasets.
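For contrast with the multiclass extension described above, the standard binary AdaBoost that it generalizes can be sketched with decision stumps; the loss-based view is visible in the exponential reweighting step. The toy dataset and round count are invented for illustration, and this is not the paper's AdaBoost.SL algorithm.

```python
from math import exp, log

def stump(X, y, w):
    """Best threshold stump (error, feature, threshold, polarity) under weights w."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({x[j] for x in X}):
            for pol in (1, -1):
                pred = [pol if x[j] <= t else -pol for x in X]
                err = sum(wi for wi, p, yi in zip(w, pred, y) if p != yi)
                if best is None or err < best[0]:
                    best = (err, j, t, pol)
    return best

def adaboost(X, y, rounds=10):
    """Binary AdaBoost with labels in {-1, +1}."""
    n = len(y)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, j, t, pol = stump(X, y, w)
        err = max(err, 1e-10)
        if err >= 0.5:
            break
        alpha = 0.5 * log((1 - err) / err)
        ensemble.append((alpha, j, t, pol))
        # Exponential-loss reweighting: upweight misclassified points.
        w = [wi * exp(-alpha * yi * (pol if x[j] <= t else -pol))
             for wi, x, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (pol if x[j] <= t else -pol) for a, j, t, pol in ensemble)
    return 1 if score >= 0 else -1

X = [[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]]
y = [1, 1, 1, -1, -1, -1]
model = adaboost(X, y)
```

The one-versus-all reductions the abstract criticizes would run copies of exactly this loop per class; AdaBoost.SL instead works with multiclass base classifiers under a calibrated multiclass exponential loss.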
- Published
- 2011
- Full Text
- View/download PDF
85. [Propensity score methods for creating covariate balance in observational studies].
- Author
-
Pattanayak CW, Rubin DB, and Zell ER
- Subjects
- Analysis of Variance, Antifibrinolytic Agents therapeutic use, Aprotinin therapeutic use, Causality, Confounding Factors, Epidemiologic, Data Interpretation, Statistical, Hemostatics therapeutic use, Humans, Random Allocation, Regression Analysis, Tranexamic Acid therapeutic use, Treatment Outcome, Propensity Score, Randomized Controlled Trials as Topic statistics & numerical data
- Abstract
Randomization of treatment assignment in experiments generates treatment groups with approximately balanced baseline covariates. However, in observational studies, where treatment assignment is not random, patients in the active treatment and control groups often differ on crucial covariates that are related to outcomes. These covariate imbalances can lead to biased treatment effect estimates. The propensity score is the probability that a patient with particular baseline characteristics is assigned to active treatment rather than control. Though propensity scores are unknown in observational studies, by matching or subclassifying patients on estimated propensity scores, we can design observational studies that parallel randomized experiments, with approximate balance on observed covariates. Observational study designs based on estimated propensity scores can generate approximately unbiased treatment effect estimates. Critically, propensity score designs should be created without access to outcomes, mirroring the separation of study design and outcome analysis in randomized experiments. This paper describes the potential outcomes framework for causal inference and best practices for designing observational studies with propensity scores. We discuss the use of propensity scores in two studies assessing the effectiveness and risks of antifibrinolytic drugs during cardiac surgery., (Published by Elsevier Espana.)
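The design logic of subclassification on the propensity score can be sketched on simulated data, where the true treatment effect is known. Everything here (the selection model, the effect size, the quintile count) is a hypothetical illustration of the general approach, not the antifibrinolytic studies discussed above.

```python
import math
import random

random.seed(7)

# Hypothetical observational study: sicker patients (higher x) are both
# more likely to receive treatment and have worse outcomes.
data = []
for _ in range(2000):
    x = random.gauss(0, 1)
    e = 1 / (1 + math.exp(-x))                    # true propensity score
    z = 1 if random.random() < e else 0           # nonrandom assignment
    y = 2.0 * z + 1.5 * x + random.gauss(0, 1)    # true effect = 2.0
    data.append((e, z, y))

# Naive contrast: confounded by x.
treated = [y for e, z, y in data if z == 1]
control = [y for e, z, y in data if z == 0]
naive = sum(treated) / len(treated) - sum(control) / len(control)

# Design stage: subclassify on the propensity score into quintiles
# (this step uses no outcomes), then average within-stratum contrasts.
data.sort(key=lambda rec: rec[0])
k = len(data) // 5
effects = []
for i in range(5):
    stratum = data[i * k:(i + 1) * k]
    t = [y for e, z, y in stratum if z == 1]
    c = [y for e, z, y in stratum if z == 0]
    effects.append(sum(t) / len(t) - sum(c) / len(c))
stratified = sum(effects) / 5
```

On data like these, the naive contrast absorbs the confounding through x, while averaging the five within-quintile contrasts recovers an estimate near the true effect of 2.0, mirroring the classical result that subclassification on a covariate into quintiles removes most of the associated bias.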
- Published
- 2011
- Full Text
- View/download PDF
86. Effect of fluticasone propionate/salmeterol on arterial stiffness in patients with COPD.
- Author
-
Dransfield MT, Cockcroft JR, Townsend RR, Coxson HO, Sharma SS, Rubin DB, Emmett AH, Cicale MJ, Crater GD, and Martinez FJ
- Subjects
- Albuterol adverse effects, Albuterol pharmacology, Androstadienes pharmacology, Arteries drug effects, Blood Flow Velocity drug effects, Cardiovascular Diseases chemically induced, Double-Blind Method, Drug Combinations, Female, Fluticasone-Salmeterol Drug Combination, Humans, Male, Middle Aged, Pulmonary Disease, Chronic Obstructive complications, Pulmonary Disease, Chronic Obstructive drug therapy, Sympathomimetics pharmacology, Treatment Outcome, Albuterol analogs & derivatives, Androstadienes adverse effects, Arteries physiopathology, Cardiovascular Diseases physiopathology, Forced Expiratory Volume drug effects, Pulmonary Disease, Chronic Obstructive physiopathology, Sympathomimetics adverse effects, Vascular Resistance drug effects
- Abstract
Background: COPD is associated with increased arterial stiffness, which may in part explain the cardiovascular morbidity observed in the disease. A causal relationship between arterial stiffness and cardiovascular events has not been established, though their strong association raises the possibility that therapies that reduce arterial stiffness may improve cardiovascular outcomes. Prior studies suggest that fluticasone propionate/salmeterol (FSC) may improve cardiovascular outcomes in COPD, and we hypothesized that FSC would reduce arterial stiffness in these patients., Methods: This multicenter, randomized, double-blind, placebo-controlled study compared the effects of FSC 250/50 μg twice daily and placebo on aortic pulse wave velocity (aPWV) as determined by ECG-gated carotid and femoral artery waveforms. The primary endpoint was aPWV change from baseline at 12 weeks (last measure for each patient)., Results: 249 patients were randomized; the mean FEV(1) in each group was similar (55% predicted), and 60% of patients reported a cardiovascular disorder. At 12 weeks, the difference in aPWV between FSC and placebo was -0.42 m/s (95%CI -0.88, 0.03; p = 0.065). A statistically significant reduction in aPWV between FSC and placebo was observed in those who remained on study drug throughout the treatment period [-0.49 m/s (95%CI -0.98, -0.01; p = 0.045)]. A post hoc analysis suggested the effect of FSC was greater in patients with higher baseline aPWV., Conclusion: FSC does not reduce aPWV in all patients with moderate to severe COPD, but may have effects in those with elevated arterial stiffness. Additional studies are required to determine if aPWV could serve as a surrogate for cardiovascular events in COPD., (Copyright © 2011 Elsevier Ltd. All rights reserved.)
- Published
- 2011
- Full Text
- View/download PDF
87. A role for moral vision in public health.
- Author
-
Rubin DB
- Subjects
- Bioethics, Humans, Morals, Public Health
- Published
- 2010
- Full Text
- View/download PDF
88. On the limitations of comparative effectiveness research.
- Author
-
Rubin DB
- Subjects
- Bias, Comparative Effectiveness Research methods, Data Interpretation, Statistical, Decision Making, Evidence-Based Medicine methods, Evidence-Based Medicine standards, Humans, Research Design, United States, Comparative Effectiveness Research standards, Health Policy
- Published
- 2010
- Full Text
- View/download PDF
89. Reflections stimulated by the comments of Shadish (2010) and West and Thoemmes (2010).
- Author
-
Rubin DB
- Subjects
- Humans, Causality, Psychological Theory, Psychology methods
- Abstract
This article offers reflections on the development of the Rubin causal model (RCM), which were stimulated by the impressive discussions of the RCM and Campbell's superb contributions to the practical problems of drawing causal inferences written by Will Shadish (2010) and Steve West and Felix Thoemmes (2010). It is not a rejoinder in any real sense but more of a sequence of clarifications of parts of the RCM combined with some possibly interesting personal historical comments, which I do not think can be found elsewhere. Of particular interest in the technical content, I think, are the extended discussions of the stable unit treatment value assumption, the explication of the variety of definitions of causal estimands, and the discussion of the assignment mechanism.
- Published
- 2010
- Full Text
- View/download PDF
90. Propensity score methods.
- Author
-
Rubin DB
- Subjects
- Humans, Models, Statistical, Ophthalmology statistics & numerical data, Probability
- Published
- 2010
- Full Text
- View/download PDF
91. A small sample correction for estimating attributable risk in case-control studies.
- Author
-
Rubin DB
- Subjects
- Female, Humans, Male, Models, Statistical, Odds Ratio, Sensitivity and Specificity, Case-Control Studies, Epidemiologic Methods, Risk Assessment, Sample Size
- Abstract
The attributable risk, often called the population attributable risk, is in many epidemiological contexts a more relevant measure of exposure-disease association than the excess risk, relative risk, or odds ratio. When estimating attributable risk with case-control data and a rare disease, we present a simple bias correction to the standard approach, which also makes it more stable and less variable. As with analogous corrections given by Jewell (1986) for other measures of association, the adjustment often won't make a substantial difference unless the sample size is very small or point estimates are desired within fine strata, but we discuss the possible utility for applications.
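For orientation, the standard (uncorrected) case-control estimator that such a correction adjusts can be sketched as follows. The paper's specific small-sample correction is not reproduced here, and the 2x2 counts are hypothetical.

```python
def attributable_risk(a, b, c, d):
    """Population attributable risk from a case-control 2x2 table
    (a = exposed cases, b = unexposed cases, c = exposed controls,
    d = unexposed controls), using the standard rare-disease formula
    PAR = p_c * (OR - 1) / OR, where p_c is the exposure prevalence
    among cases and OR = (a*d)/(b*c) is the sample odds ratio."""
    p_c = a / (a + b)
    oddsratio = (a * d) / (b * c)
    return p_c * (oddsratio - 1) / oddsratio

# 40% of cases exposed, OR = 8/3, so PAR = 0.4 * (5/8) = 0.25.
par = attributable_risk(40, 60, 20, 80)
```

Because the odds ratio enters nonlinearly, small cell counts make this plug-in estimate biased and unstable, which is the situation the paper's bias correction targets.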
- Published
- 2010
- Full Text
- View/download PDF
92. Teaching and learning moments. A lesson in ethics.
- Author
-
Rubin DB
- Subjects
- Child, Empathy, Humans, Male, Asphyxia complications, Cerebral Palsy complications, Ethics, Medical, Hypoxia, Brain etiology, Intensive Care Units, Pediatric ethics
- Published
- 2009
- Full Text
- View/download PDF
93. Testing treatment effects in unconfounded studies under model misspecification: logistic regression, discretization, and their combination.
- Author
-
Cangul MZ, Chretien YR, Gutman R, and Rubin DB
- Subjects
- Algorithms, Analysis of Variance, Computer Simulation, Epidemiologic Research Design, Humans, Reproducibility of Results, Statistical Distributions, Clinical Trials as Topic methods, Logistic Models, Models, Statistical, Treatment Outcome
- Abstract
Logistic regression is commonly used to test for treatment effects in observational studies. If the distribution of a continuous covariate differs between treated and control populations, logistic regression yields an invalid hypothesis test even in an unconfounded study if the link is not logistic. This flaw is not corrected by the commonly used technique of discretizing the covariate into intervals. A valid test can be obtained by discretization followed by regression adjustment within each interval.
- Published
- 2009
- Full Text
- View/download PDF
94. Making all the children above average: ethical and regulatory concerns for pediatricians in pediatric enhancement research.
- Author
-
Berg JW, Mehlman MJ, Rubin DB, and Kodish E
- Subjects
- Child, Humans, United States, Biomedical Enhancement ethics, Biomedical Research ethics, Biomedical Research legislation & jurisprudence, Nontherapeutic Human Experimentation ethics, Nontherapeutic Human Experimentation legislation & jurisprudence, Pediatrics
- Abstract
Building on the knowledge generated by the long history of disease-oriented research, the next few decades will witness an explosion of biomedical enhancements to make people faster, stronger, smarter, less forgetful, happier, prettier, and live longer. Growing interest in pediatric enhancements is likely to stimulate the conduct of enhancement research involving children. However, guidelines for the protection of human subjects were developed for investigations of therapeutic modalities. To date, virtually no attention has been paid to whether these rules would be appropriate for investigations to establish the safety and efficacy of technologies intended for enhancement rather than therapeutic uses and, if not, whether ethically acceptable rules could be designed. This article discusses whether the current guidelines for pediatric research provide appropriate protections for pediatric subjects in enhancement research and considers what additional protections might be necessary.
- Published
- 2009
- Full Text
- View/download PDF
95. Variation in the large-scale organization of gene expression levels in the hippocampus relates to stable epigenetic variability in behavior.
- Author
-
Alter MD, Rubin DB, Ramsey K, Halpern R, Stephan DA, Abbott LF, and Hen R
- Subjects
- Animals, Female, Gene Expression Profiling methods, Mice, Mice, Inbred BALB C, Oligonucleotide Array Sequence Analysis, Behavior, Animal, Epigenesis, Genetic, Gene Expression, Genetic Variation, Hippocampus metabolism
- Abstract
Background: Despite sharing the same genes, identical twins demonstrate substantial variability in behavioral traits and in their risk for disease. Epigenetic factors-DNA and chromatin modifications that affect levels of gene expression without affecting the DNA sequence-are thought to be important in establishing this variability. Epigenetically mediated differences in the levels of gene expression that are associated with individual variability traditionally are thought to occur only in a gene-specific manner. We challenge this idea by exploring the large-scale organizational patterns of gene expression in an epigenetic model of behavioral variability. Methodology/findings: To study the effects of epigenetic influences on behavioral variability, we examine gene expression in genetically identical mice. Using a novel approach to microarray analysis, we show that variability in the large-scale organization of gene expression levels, rather than differences in the expression levels of specific genes, is associated with individual differences in behavior. Specifically, increased activity in the open field is associated with increased variance of log-transformed measures of gene expression in the hippocampus, a brain region involved in open field activity. Early life experience that increases adult activity in the open field also similarly modifies the variance of gene expression levels. The same association of the variance of gene expression levels with behavioral variability is found with levels of gene expression in the hippocampus of genetically heterogeneous outbred populations of mice, suggesting that variation in the large-scale organization of gene expression levels may also be relevant to phenotypic differences in outbred populations such as humans. We find that the increased variance in gene expression levels is attributable to an increasing separation of several large, log-normally distributed families of gene expression levels.
We also show that the presence of these multiple log-normal distributions of gene expression levels is a universal characteristic of gene expression in eukaryotes. We use data from the MicroArray Quality Control Project (MAQC) to demonstrate that our method is robust and that it reliably detects biological differences in the large-scale organization of gene expression levels. Conclusions: Our results contrast with the traditional belief that epigenetic effects on gene expression occur only at the level of specific genes and suggest instead that the large-scale organization of gene expression levels provides important insights into the relationship of gene expression with behavioral variability. Understanding the epigenetic, genetic, and environmental factors that regulate the large-scale organization of gene expression levels, and how changes in this large-scale organization influence brain development and behavior, will be a major future challenge in the field of behavioral genomics.
- Published
- 2008
- Full Text
- View/download PDF
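The summary statistic central to the abstract above, the variance of log-transformed expression levels within each animal, is straightforward to compute. The simulation below is purely illustrative (invented gene counts, group sizes, and log-scale spreads) and is not the authors' microarray pipeline.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative expression matrices (genes x animals) for two behavioral groups;
# levels are log-normal, with the "high-activity" group given a wider
# log-scale spread, mimicking the direction of the reported association.
low = rng.lognormal(mean=5.0, sigma=1.0, size=(5000, 10))
high = rng.lognormal(mean=5.0, sigma=1.3, size=(5000, 10))

# Variance of log-transformed expression levels within each animal
# (one value per column), compared between the two groups.
var_low = np.log(low).var(axis=0)
var_high = np.log(high).var(axis=0)
print(var_low.mean().round(2), var_high.mean().round(2))
```

In this toy setting the per-animal variance of log expression recovers the squared log-scale spread of each group, which is the kind of group-level difference the abstract associates with behavioral variability.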
96. Empirical efficiency maximization: improved locally efficient covariate adjustment in randomized experiments and survival analysis.
- Author
-
Rubin DB and van der Laan MJ
- Subjects
- Analysis of Variance, Biostatistics methods, Humans, Likelihood Functions, Models, Statistical, Randomized Controlled Trials as Topic statistics & numerical data, Survival Analysis
- Abstract
It has long been recognized that covariate adjustment can increase precision in randomized experiments, even when it is not strictly necessary. Adjustment is often straightforward when a discrete covariate partitions the sample into a handful of strata, but becomes more involved with even a single continuous covariate such as age. As randomized experiments remain a gold standard for scientific inquiry, and the information age facilitates a massive collection of baseline information, the longstanding problem of whether and how to adjust for covariates is likely to engage investigators for the foreseeable future. In the locally efficient estimation approach introduced for general coarsened data structures by James Robins and collaborators, one first fits a relatively small working model, often with maximum likelihood, giving a nuisance parameter fit in an estimating equation for the parameter of interest. The usual advertisement is that the estimator will be asymptotically efficient if the working model is correct, but otherwise will still be consistent and asymptotically Gaussian. However, by applying standard likelihood-based fits to misspecified working models in covariate adjustment problems, one can poorly estimate the parameter of interest. We propose a new method, empirical efficiency maximization, to optimize the working model fit for the resulting parameter estimate. In addition to the randomized experiment setting, we show how our covariate adjustment procedure can be used in survival analysis applications. Numerical asymptotic efficiency calculations demonstrate gains relative to standard locally efficient estimators.
- Published
- 2008
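For a single baseline covariate and known randomization probability, the core idea above, fitting the working model to minimize the variability of the resulting effect estimate rather than by maximum likelihood, can be sketched as a least-squares problem. The data-generating model, the linear working model, and all parameter values here are illustrative assumptions, not the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 1000, 0.5                                   # sample size, randomization probability
t = rng.binomial(1, p, n)                          # randomized treatment assignment
x = rng.normal(0, 1, n)                            # baseline covariate
y = 1 + 2 * t + 1.5 * x + rng.normal(0, 1, n)      # outcome, true effect = 2

# Per-subject influence terms for the unadjusted difference in means.
a = t / p - (1 - t) / (1 - p)
c = (t - p) / (p * (1 - p))

# Empirical-efficiency-style fit of a linear working model m(x) = b0 + b1*x:
# choose (b0, b1) to minimize the mean squared influence term a*y - c*m(x),
# which is an ordinary least-squares regression of a*y on c*[1, x].
Z = c[:, None] * np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(Z, a * y, rcond=None)

effect_unadj = np.mean(a * y)
effect_eem = np.mean(a * y - Z @ b)
print(round(effect_unadj, 2), round(effect_eem, 2))  # both near the true effect of 2
```

Because E[c] = 0 under randomization, subtracting the fitted term leaves the estimator consistent for any working-model fit; targeting the fit at the influence terms, rather than at the outcome likelihood, is what shrinks its variance even when the working model is misspecified.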
97. Rejoinder to Tan.
- Author
-
Rubin DB and van der Laan MJ
- Abstract
We respond to several interesting points raised by Tan regarding our article.
- Published
- 2008
- Full Text
- View/download PDF
98. Principal stratification designs to estimate input data missing due to death.
- Author
-
Frangakis CE, Rubin DB, An MW, and MacKenzie E
- Subjects
- Bias, Humans, Proportional Hazards Models, Risk Assessment methods, Risk Factors, Sample Size, Survival Analysis, Survival Rate, Biometry methods, Critical Illness mortality, Data Interpretation, Statistical, Mortality, Outcome Assessment, Health Care methods, Patient Dropouts statistics & numerical data, Surveys and Questionnaires
- Abstract
We consider studies of cohorts of individuals after a critical event, such as an injury, with the following characteristics. First, the studies are designed to measure "input" variables, which describe the period before the critical event, and to characterize the distribution of the input variables in the cohort. Second, the studies are designed to measure "output" variables, primarily mortality after the critical event, and to characterize the predictive (conditional) distribution of mortality given the input variables in the cohort. Such studies often possess the complication that the input data are missing for those who die shortly after the critical event because the data collection takes place after the event. Standard methods of dealing with the missing inputs, such as imputation or weighting methods based on an assumption of ignorable missingness, are known to be generally invalid when the missingness of inputs is nonignorable, that is, when the distribution of the inputs is different between those who die and those who live. To address this issue, we propose a novel design that obtains and uses information on an additional key variable-a treatment or externally controlled variable, which if set at its "effective" level, could have prevented the death of those who died. We show that the new design can be used to draw valid inferences for the marginal distribution of inputs in the entire cohort, and for the conditional distribution of mortality given the inputs, also in the entire cohort, even under nonignorable missingness. The crucial framework that we use is principal stratification based on the potential outcomes, here mortality under both levels of treatment. We also show using illustrative preliminary injury data that our approach can reveal results that are more reasonable than the results of standard methods, in relatively dramatic ways. 
Thus, our approach suggests that the routine collection of data on variables that could be used as possible treatments in such studies of inputs and mortality should become common.
- Published
- 2007
- Full Text
- View/download PDF
99. Resolving the latent structure of schizophrenia endophenotypes using expectation-maximization-based finite mixture modeling.
- Author
-
Lenzenweger MF, McLachlan G, and Rubin DB
- Subjects
- Attention, Cognition Disorders epidemiology, Cognition Disorders genetics, Eye Movements physiology, Female, Humans, Male, Middle Aged, Schizophrenia epidemiology, Phenotype, Schizophrenia genetics
- Abstract
Prior research has focused on the latent structure of endophenotypic markers of schizophrenia liability, or schizotypy. The work supports the existence of 2 relatively distinct latent classes and derives largely from the taxometric analysis of psychometric values. The present study used finite mixture modeling as a technique for discerning latent structure and the laboratory-measured endophenotypes of sustained attention deficits and eye-tracking dysfunction as endophenotype indexes. In a large adult community sample (N=311), finite mixture analysis of the sustained attention index d' and 2 eye-tracking indexes (gain and catch-up saccade rate) revealed evidence for 2 latent components. A putative schizotypy class accounted for 27% of the sample. A supplementary maximum covariance taxometric analysis yielded highly consistent results. Subjects in the schizotypy component displayed higher rates of schizotypal personality features and an increased rate of treated schizophrenia in their 1st-degree biological relatives compared with subjects in the other component. Implications of these results are examined in light of major theories of schizophrenia liability, and methodological advantages of finite mixture modeling for psychopathology research, with particular emphasis on genomic issues, are discussed. ((c) 2007 APA, all rights reserved.)
- Published
- 2007
- Full Text
- View/download PDF
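A minimal expectation-maximization fit of a two-component univariate Gaussian mixture, the kind of latent-structure analysis described above, might look like the sketch below. The simulated scores, component means, and starting values are invented for illustration; only the 27% mixing proportion echoes the abstract.

```python
import numpy as np

def npdf(x, mu, sd):
    """Univariate normal density."""
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

rng = np.random.default_rng(1)
# Simulated endophenotype scores: a 73% majority component and a 27%
# "schizotypy" component (component locations are invented).
scores = np.concatenate([rng.normal(0.0, 1.0, 730), rng.normal(3.0, 1.0, 270)])

# Starting values for mixing weight, means, and SDs of the two components.
w, mu, sd = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(300):
    # E-step: posterior probability that each score came from component 2.
    p1 = (1 - w) * npdf(scores, mu[0], sd[0])
    p2 = w * npdf(scores, mu[1], sd[1])
    r = p2 / (p1 + p2)
    # M-step: re-estimate parameters from the responsibilities.
    w = r.mean()
    mu = np.array([np.average(scores, weights=1 - r),
                   np.average(scores, weights=r)])
    sd = np.sqrt(np.array([np.average((scores - mu[0]) ** 2, weights=1 - r),
                           np.average((scores - mu[1]) ** 2, weights=r)]))

print(round(w, 2), mu.round(2))  # w recovers roughly the 0.27 mixing proportion
```

In practice one would compare the 1- and 2-component fits (e.g. by likelihood ratio or information criteria) before claiming two latent classes; the loop above only shows the mechanics of the EM updates.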
100. Diagnostics for confounding in PK/PD models for oxcarbazepine.
- Author
-
Nedelman JR, Rubin DB, and Sheiner LB
- Subjects
- Adult, Anticonvulsants blood, Carbamazepine blood, Carbamazepine pharmacokinetics, Carbamazepine pharmacology, Child, Dose-Response Relationship, Drug, Humans, Oxcarbazepine, Randomized Controlled Trials as Topic, Seizures drug therapy, Seizures metabolism, Anticonvulsants pharmacokinetics, Anticonvulsants pharmacology, Carbamazepine analogs & derivatives, Models, Biological
- Abstract
One type of pharmacokinetic/pharmacodynamic (PK/PD) relationship that is used to characterize the therapeutic action of a drug is the relationship between some univariate summary of the plasma-concentration-versus-time profile and the drug effect on a response outcome. Operationally, such a relationship may be observed in a large clinical trial where randomly sampled patients are randomized to different values of the concentration summary. If, under such conditions, the relationship between concentration and effect does not depend on the dose needed to attain the target concentration, such a relationship will be called a true PK/PD relationship. When the true PK/PD relationship is assessed as an object of estimation in a dose-controlled clinical trial (i.e. when dose is randomized), observed drug concentration is an outcome variable. The estimated PK/PD relationship between observed outcome and observed concentration, which we then refer to as the conventional PK/PD relationship, may be biased for the true PK/PD relationship. Because of this bias, the conventional relationship is called confounded for the true one. We show that diagnostics for confounding can be devised under reasonable assumptions. We then apply these diagnostics to PK/PD assessments of adults and children on oxcarbazepine adjunctive therapy. It was necessary to demonstrate the similarity of the true PK/PD relationships of adults and children on adjunctive therapy in order to support the approval of oxcarbazepine monotherapy in children by a bridging argument. (Copyright (c) 2006 John Wiley & Sons, Ltd.)
- Published
- 2007
- Full Text
- View/download PDF