22 results for "Matthew Blackwell"
Search Results
2. Increased dosage and treatment time of Epigallocatechin-3-gallate (EGCG) negatively affects skeletal parameters in normal mice and Down syndrome mouse models
- Author
-
Raza Jamal, Jonathan LaCombe, Roshni Patel, Matthew Blackwell, Jared R. Thomas, Kourtney Sloan, Joseph M. Wallace, and Randall J. Roper
- Subjects
Medicine ,Science - Abstract
Bone abnormalities affect all individuals with Down syndrome (DS) and are linked to abnormal expression of DYRK1A, a gene found in three copies in people with DS and Ts65Dn DS model mice. Previous work in Ts65Dn male mice demonstrated that both genetic normalization of Dyrk1a and treatment with ~9 mg/kg/day Epigallocatechin-3-gallate (EGCG), the main polyphenol found in green tea and a putative DYRK1A inhibitor, improved some skeletal deficits. Because EGCG treatment improved mostly trabecular skeletal deficits, we hypothesized that increasing EGCG treatment dosage and length of administration would positively affect both trabecular and cortical bone in Ts65Dn mice. Treatment of individuals with DS with green tea extract (GTE) containing EGCG has also been associated with some weight loss, and we hypothesized that weights would be affected in Ts65Dn mice after EGCG treatment. Treatment with ~20 mg/kg/day EGCG for seven weeks showed no improvements in male Ts65Dn trabecular bone and only limited improvements in cortical measures. Comparing skeletal analyses after ~20 mg/kg/day EGCG treatment with previously published treatments with ~9, 50, and 200 mg/kg/day EGCG showed that increased dosage and treatment time increased cortical structural deficits, leading to weaker appendicular bones in male mice. Weight was not affected by treatment in mice, except for those given a high dose of EGCG by oral gavage. These data indicate that high doses of EGCG, similar to those reported in some treatment studies of DS and other disorders, may impair long bone structure and strength. Skeletal phenotypes should be monitored when high doses of EGCG are administered therapeutically.
- Published
- 2022
3. Amelia II: A Program for Missing Data
- Author
-
James Honaker, Gary King, and Matthew Blackwell
- Subjects
missing data ,multiple imputation ,bootstrap ,Statistics ,HA1-4737 - Abstract
Amelia II is a complete R package for multiple imputation of missing data. The package implements a new expectation-maximization with bootstrapping algorithm that works faster, with larger numbers of variables, and is far easier to use than various Markov chain Monte Carlo approaches, but gives essentially the same answers. The program also improves imputation models by allowing researchers to put Bayesian priors on individual cell values, thereby including a great deal of potentially valuable and extensive information. It also includes features to accurately impute cross-sectional datasets, individual time series, or sets of time series for different cross-sections. A full set of graphical diagnostics are also available. The program is easy to use, and the simplicity of the algorithm makes it far more robust; both a simple command line and extensive graphical user interface are included.
- Published
- 2011
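The expectation-maximization-with-bootstrapping idea behind Amelia II can be sketched in a few lines. This is a hypothetical illustration, not the package's implementation: it assumes roughly multivariate-normal data and, for brevity, replaces the per-bootstrap EM step with a complete-case estimate of the mean and covariance before drawing imputations from the conditional normal.

```python
import numpy as np

def emb_impute(X, m=5, rng=None):
    """Sketch of EM-with-bootstrapping multiple imputation (hypothetical helper).

    Assumes X is roughly multivariate normal. For brevity the EM step is
    replaced by a complete-case estimate of (mu, Sigma); Amelia itself runs
    full EM on each bootstrap sample.
    """
    rng = np.random.default_rng(rng)
    n, _ = X.shape
    imputations = []
    for _ in range(m):
        # 1. Bootstrap rows, then estimate (mu, Sigma) from complete cases.
        boot = X[rng.integers(0, n, n)]
        cc = boot[~np.isnan(boot).any(axis=1)]
        mu, Sigma = cc.mean(axis=0), np.cov(cc, rowvar=False)
        # 2. Fill each row's missing cells with a draw from the conditional
        #    normal of the missing entries given the observed ones.
        Xi = X.copy()
        for i in range(n):
            miss = np.isnan(X[i])
            if not miss.any():
                continue
            obs = ~miss
            S_oo = Sigma[np.ix_(obs, obs)]
            S_mo = Sigma[np.ix_(miss, obs)]
            cond_mean = mu[miss] + S_mo @ np.linalg.solve(S_oo, X[i, obs] - mu[obs])
            cond_cov = Sigma[np.ix_(miss, miss)] - S_mo @ np.linalg.solve(S_oo, S_mo.T)
            Xi[i, miss] = rng.multivariate_normal(cond_mean, cond_cov)
        imputations.append(Xi)
    return imputations
```

Each of the m completed datasets is then analyzed separately and the resulting estimates combined with Rubin's rules.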
4. Telescope Matching for Reducing Model Dependence in the Estimation of the Effects of Time-Varying Treatments: An Application to Negative Advertising
- Author
-
Matthew Blackwell and Anton Strezhnev
- Subjects
Statistics and Probability ,Estimation ,Telescope ,Economics and Econometrics ,Matching (statistics) ,law ,Negative advertising ,Statistics, Probability and Uncertainty ,Algorithm ,Social Sciences (miscellaneous) ,law.invention ,Mathematics - Abstract
Time-varying treatments are prevalent in the social sciences. For example, a political campaign might decide to air attack ads against an opponent, but this decision to go negative will impact polling and, thus, future campaign strategy. If an analyst naively applies methods for point exposures to estimate the effect of earlier treatments, this would lead to post-treatment bias. Several existing methods can adjust for this type of time-varying confounding, but they typically rely on strong modelling assumptions. In this paper, we propose a novel two-step matching procedure for estimating the effect of two-period treatments. This method, telescope matching, reduces model dependence without inducing post-treatment bias by using matching with replacement to impute missing counterfactual outcomes. It then employs flexible regression models to correct for bias induced by imperfect matches. We derive the asymptotic properties of the telescope matching estimator and provide a consistent estimator for its variance. We illustrate telescope matching by investigating the effect of negative campaigning in US Senate and gubernatorial elections. Using the method, we uncover a positive effect on turnout of negative ads early in a campaign and a negative effect of early negativity on vote shares.
- Published
- 2021
- Full Text
- View/download PDF
5. Increased dosage and treatment time of Epigallocatechin-3-gallate (EGCG) negatively affects skeletal parameters in normal mice and Down syndrome mouse models
- Author
-
Raza Jamal, Jonathan LaCombe, Roshni Patel, Matthew Blackwell, Jared R. Thomas, Kourtney Sloan, Joseph M. Wallace, and Randall J. Roper
- Subjects
Male ,Multidisciplinary ,food and beverages ,Protein Serine-Threonine Kinases ,Protein-Tyrosine Kinases ,complex mixtures ,Catechin ,Drug Administration Schedule ,Mice ,Animals ,heterocyclic compounds ,Female ,sense organs ,Down Syndrome ,Muscle, Skeletal - Abstract
Bone abnormalities affect all individuals with Down syndrome (DS) and are linked to abnormal expression of DYRK1A, a gene found in three copies in people with DS and Ts65Dn DS model mice. Previous work in Ts65Dn male mice demonstrated that both genetic normalization of Dyrk1a and treatment with ~9 mg/kg/day Epigallocatechin-3-gallate (EGCG), the main polyphenol found in green tea and a putative DYRK1A inhibitor, improved some skeletal deficits. Because EGCG treatment improved mostly trabecular skeletal deficits, we hypothesized that increasing EGCG treatment dosage and length of administration would positively affect both trabecular and cortical bone in Ts65Dn mice. Treatment of individuals with DS with green tea extract (GTE) containing EGCG has also been associated with some weight loss, and we hypothesized that weights would be affected in Ts65Dn mice after EGCG treatment. Treatment with ~20 mg/kg/day EGCG for seven weeks showed no improvements in male Ts65Dn trabecular bone and only limited improvements in cortical measures. Comparing skeletal analyses after ~20 mg/kg/day EGCG treatment with previously published treatments with ~9, 50, and 200 mg/kg/day EGCG showed that increased dosage and treatment time increased cortical structural deficits, leading to weaker appendicular bones in male mice. Weight was not affected by treatment in mice, except for those given a high dose of EGCG by oral gavage. These data indicate that high doses of EGCG, similar to those reported in some treatment studies of DS and other disorders, may impair long bone structure and strength. Skeletal phenotypes should be monitored when high doses of EGCG are administered therapeutically.
- Published
- 2021
6. Estimation of Connected Vehicle Penetration Rate on Indiana Roadways
- Author
-
Matthew Blackwell, Jijo K. Mathew, Margaret Hunter, Darcy M. Bullock, and Ed Cox
- Subjects
Connected vehicle ,connected vehicles ,penetration ,Environmental science ,Penetration (firestop) ,Penetration rate ,vehicle counts ,Marine engineering - Abstract
Over 400 billion passenger vehicle trajectory waypoints are collected each month in the United States. This data creates many new opportunities for agencies to assess operational characteristics of roadways for more agile management of resources. This study compared traffic counts obtained from 24 Indiana Department of Transportation traffic count stations with counts derived from vehicle trajectories during the same periods. These stations were geographically distributed throughout Indiana, with 13 locations on interstates and 11 locations on state or US roads. A Wednesday and a Saturday in January, August, and September 2020 were analyzed. The results show that the analyzed interstates had an average penetration of 4.3% with a standard deviation of 1.0. The non-interstate roads had an average penetration of 5.0% with a standard deviation of 1.36. These penetration levels suggest that connected vehicle data can provide a valuable data source for developing scalable roadway performance measures. Since all agencies currently have a highway monitoring system using fixed infrastructure, this paper concludes by recommending that agencies integrate a connected vehicle penetration monitoring program into their traditional highway count station program to monitor the growing penetration of connected cars and trucks.
- Published
- 2021
- Full Text
- View/download PDF
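The penetration-rate calculation in the abstract above is a ratio of trajectory-derived counts to fixed-station counts, summarized across stations. A toy illustration with made-up counts (not the study's data):

```python
from statistics import mean, stdev

# Hypothetical per-station daily counts -- illustrative only, not the study's data.
station_counts = [10500, 8200, 12400, 9100]   # fixed-infrastructure counts
trajectory_counts = [430, 380, 560, 450]      # matched connected-vehicle trajectories

# Penetration rate per station, in percent.
rates = [100 * t / s for t, s in zip(trajectory_counts, station_counts)]
print(f"mean penetration: {mean(rates):.1f}% (sd {stdev(rates):.2f})")
```

Tracking this ratio over time at existing count stations is what the paper recommends agencies fold into their highway monitoring programs.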
7. Noncompliance and Instrumental Variables for 2K Factorial Experiments
- Author
-
Matthew Blackwell and Nicole E. Pashley
- Subjects
Statistics and Probability ,Interactive effects ,Causal inference ,Statistics ,Instrumental variable ,Factorial experiment ,Statistics, Probability and Uncertainty ,Joint (geology) ,Mathematics - Abstract
Factorial experiments are widely used to assess the marginal, joint, and interactive effects of multiple concurrent factors. While a robust literature covers the design and analysis of these experiments, there is less work on how to handle treatment noncompliance in this setting. To fill this gap, we introduce a new methodology that uses the potential outcomes framework for analyzing 2K factorial experiments with noncompliance on any number of factors. This framework builds on and extends the literature on both instrumental variables and factorial experiments in several ways. First, we define novel, complier-specific quantities of interest for this setting and show how to generalize key instrumental variables assumptions. Second, we show how partial compliance across factors gives researchers a choice over different types of compliers to target in estimation. Third, we show how to conduct inference for these new estimands from both the finite-population and superpopulation asymptotic perspectives. Finally, we illustrate these techniques by applying them to a field experiment on the effectiveness of different forms of get-out-the-vote canvassing. New easy-to-use, open-source software implements the methodology. Supplementary materials for this article are available online.
- Published
- 2021
- Full Text
- View/download PDF
8. Analyzing Causal Mechanisms in Survey Experiments
- Author
-
Matthew Blackwell, Maya Sen, and Avidit Acharya
- Subjects
Mediation (statistics) ,Sociology and Political Science ,Computer science ,Average treatment effect ,Randomized experiment ,media_common.quotation_subject ,05 social sciences ,01 natural sciences ,0506 political science ,Supreme court ,010104 statistics & probability ,Identification (information) ,Perception ,Causal inference ,Political Science and International Relations ,050602 political science & public administration ,0101 mathematics ,media_common ,Cognitive psychology - Abstract
Researchers investigating causal mechanisms in survey experiments often rely on nonrandomized quantities to isolate the indirect effect of treatment through these variables. Such an approach, however, requires a “selection-on-observables” assumption, which undermines the advantages of a randomized experiment. In this paper, we show what can be learned about causal mechanisms through experimental design alone. We propose a factorial design that provides or withholds information on mediating variables and allows for the identification of the overall average treatment effect and the controlled direct effect of treatment fixing a potential mediator. While this design cannot identify indirect effects on its own, it avoids making the selection-on-observables assumption of the standard mediation approach while providing evidence for a broader understanding of causal mechanisms that encompasses both indirect effects and interactions. We illustrate these approaches via two examples: one on evaluations of US Supreme Court nominees and the other on perceptions of the democratic peace.
- Published
- 2018
- Full Text
- View/download PDF
9. Game Changers: Detecting Shifts in Overdispersed Count Data
- Author
-
Matthew Blackwell
- Subjects
Hierarchical Dirichlet process ,Sociology and Political Science ,Series (mathematics) ,Ex-ante ,Computer science ,05 social sciences ,Inference ,Bayesian inference ,01 natural sciences ,0506 political science ,010104 statistics & probability ,Salient ,Global terrorism ,Political Science and International Relations ,050602 political science & public administration ,Econometrics ,0101 mathematics ,Count data - Abstract
In this paper, I introduce a Bayesian model for detecting changepoints in a time series of overdispersed counts, such as contributions to candidates over the course of a campaign or counts of terrorist violence. To avoid having to specify the number of changepoints ex ante, this model incorporates a hierarchical Dirichlet process prior to estimate the number of changepoints as well as their locations. This allows researchers to discover salient structural breaks and perform inference on the number of such breaks in a given time series. I demonstrate the usefulness of the model with applications to campaign contributions in the 2012 U.S. Republican presidential primary and incidences of global terrorism from 1970 to 2015.
- Published
- 2018
- Full Text
- View/download PDF
10. 25 A Retrospective Audit of Heart Failure Care in Older Patients at Campbelltown Hospital, NSW, with Reference to the 2017 ACCF/AHA Guideline
- Author
-
Dev Verick, Mark Hohenberg, Matthew Blackwell, and Allan Hsu
- Subjects
Aging ,medicine.medical_specialty ,Iron replacement ,business.industry ,General Medicine ,Guideline ,Audit ,medicine.disease ,Older patients ,Heart failure ,Emergency medicine ,medicine ,Frail elderly ,Geriatrics and Gerontology ,business - Abstract
Aim To determine whether the management of a cohort of inpatients in our facility complied with the American College of Cardiology 2017 best practice guideline for heart failure, and whether there were differences between younger and older patients. Method A retrospective ‘snapshot’ audit assessing all patients admitted to Campbelltown Hospital (CTN) with any ICD-10 heart failure code against the ACCF/AHA Guideline. Demographic data were collected; statistical analysis was performed using SPSS®. Results Sixty-three patients met inclusion criteria with a median age of 77 years and 83% from a private residence. A majority were frail (71%) and we found 87% presented with New York Heart Association class II or III heart failure symptoms. No patients received all the therapeutic options recommended by the guideline in our study. Most, if not all, patients received basic investigations such as blood pathology and imaging. Compliance was lower for echocardiography, iron replacement where required, and use of the angiotensin blocker class of drugs. Comparing older and younger patients, there was a trend for older patients to be less likely to receive more advanced investigations and appropriate medications. Older patients in general had poorer outcomes compared with younger patients. Conclusion Our study demonstrates an opportunity to improve compliance with the best practice guidelines for the management of heart failure. A robust, coordinated, institution-centred approach is required to improve compliance, which has been shown to improve patient and health system outcomes.
- Published
- 2019
- Full Text
- View/download PDF
11. Explaining Causal Findings Without Bias: Detecting and Assessing Direct Effects
- Author
-
Matthew Blackwell, Maya Sen, and Avidit Acharya
- Subjects
Estimation ,Sociology and Political Science ,Fractionalization ,05 social sciences ,Direct effects ,Estimator ,Variance (accounting) ,0506 political science ,0502 economics and business ,Political Science and International Relations ,050602 political science & public administration ,Econometrics ,Treatment effect ,050207 economics ,Biostatistics ,Control (linguistics) ,Psychology - Abstract
Researchers seeking to establish causal relationships frequently control for variables on the purported causal pathway, checking whether the original treatment effect then disappears. Unfortunately, this common approach may lead to biased estimates. In this article, we show that the bias can be avoided by focusing on a quantity of interest called the controlled direct effect. Under certain conditions, the controlled direct effect enables researchers to rule out competing explanations—an important objective for political scientists. To estimate the controlled direct effect without bias, we describe an easy-to-implement estimation strategy from the biostatistics literature. We extend this approach by deriving a consistent variance estimator and demonstrating how to conduct a sensitivity analysis. Two examples—one on ethnic fractionalization’s effect on civil war and one on the impact of historical plough use on contemporary female political participation—illustrate the framework and methodology.
- Published
- 2016
- Full Text
- View/download PDF
12. Protesting Too Much: Revealing Repetitions in Barry Hannah’s Interviews
- Author
-
Matthew Blackwell
- Published
- 2017
- Full Text
- View/download PDF
13. A New Republic of Letters: Memory and Scholarship in the Age of Digital Reproduction by Jerome McGann
- Author
-
Matthew Blackwell
- Subjects
Scholarship ,History ,Rehabilitation ,Republic of Letters ,Physical Therapy, Sports Therapy and Rehabilitation ,Gender studies ,General Medicine ,Digital reproduction ,Classics
- Published
- 2016
- Full Text
- View/download PDF
14. A Unified Approach to Measurement Error and Missing Data: Overview and Applications
- Author
-
Matthew Blackwell, Gary King, and James Honaker
- Subjects
Observational error ,Data collection ,Sociology and Political Science ,Computer science ,Computation ,05 social sciences ,Inference ,computer.software_genre ,Missing data ,01 natural sciences ,0506 political science ,010104 statistics & probability ,050602 political science & public administration ,Statistical inference ,Data mining ,0101 mathematics ,Special case ,computer ,Social Sciences (miscellaneous) ,Simple (philosophy) - Abstract
Although social scientists devote considerable effort to mitigating measurement error during data collection, they often ignore the issue during data analysis. And although many statistical methods have been proposed for reducing measurement error-induced biases, few have been widely used because of implausible assumptions, high levels of model dependence, difficult computation, or inapplicability with multiple mismeasured variables. We develop an easy-to-use alternative without these problems; it generalizes the popular multiple imputation (MI) framework by treating missing data problems as a limiting special case of extreme measurement error, and corrects for both. Like MI, the proposed framework is a simple two-step procedure, so that in the second step researchers can use whatever statistical method they would have if there had been no problem in the first place. We also offer empirical illustrations, open source software that implements all the methods described herein, and a companion paper with technical details and extensions (Blackwell, Honaker and King, 2015b).
- Published
- 2015
- Full Text
- View/download PDF
15. How to Make Causal Inferences with Time-Series Cross-Sectional Data under Selection on Observables
- Author
-
Matthew Blackwell and Adam N. Glynn
- Subjects
Estimation ,Distributed lag ,Cross-sectional data ,Sociology and Political Science ,05 social sciences ,01 natural sciences ,0506 political science ,Weighting ,010104 statistics & probability ,Autoregressive model ,Causal inference ,Political Science and International Relations ,050602 political science & public administration ,Selection (linguistics) ,Econometrics ,0101 mathematics ,Set (psychology) - Abstract
Repeated measurements of the same countries, people, or groups over time are vital to many fields of political science. These measurements, sometimes called time-series cross-sectional (TSCS) data, allow researchers to estimate a broad set of causal quantities, including contemporaneous effects and direct effects of lagged treatments. Unfortunately, popular methods for TSCS data can only produce valid inferences for lagged effects under some strong assumptions. In this paper, we use potential outcomes to define causal quantities of interest in these settings and clarify how standard models like the autoregressive distributed lag model can produce biased estimates of these quantities due to post-treatment conditioning. We then describe two estimation strategies that avoid these post-treatment biases—inverse probability weighting and structural nested mean models—and show via simulations that they can outperform standard approaches in small sample settings. We illustrate these methods in a study of how welfare spending affects terrorism.
- Published
- 2018
- Full Text
- View/download PDF
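The inverse-probability-weighting strategy described in the abstract above reweights units by how likely their observed treatment history was. A minimal sketch of the stabilized-weight formula, assuming the period-by-period propensities have already been estimated (e.g., by logistic regressions of each period's treatment on past covariates and treatments); the function name and array shapes are illustrative, not from the paper:

```python
import numpy as np

def stabilized_ipw(A, ps_full, ps_marginal):
    """Stabilized inverse-probability-of-treatment weights over T periods.

    A           : (n, T) binary treatment history
    ps_full     : (n, T) estimated P(A_t = 1 | past treatments and covariates)
    ps_marginal : (n, T) estimated P(A_t = 1 | past treatments only)

    w_i = prod_t P(A_it | treatment history) / P(A_it | full past),
    so units whose treatment paths were made likely by time-varying
    confounders are down-weighted.
    """
    num = np.where(A == 1, ps_marginal, 1 - ps_marginal)
    den = np.where(A == 1, ps_full, 1 - ps_full)
    return (num / den).prod(axis=1)
```

A weighted regression of the outcome on the treatment history then estimates lagged and contemporaneous effects without conditioning on post-treatment variables.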
16. A Selection Bias Approach to Sensitivity Analysis for Causal Effects
- Author
-
Matthew Blackwell
- Subjects
Estimation ,Selection bias ,021110 strategic, defence & security studies ,Matching (statistics) ,Sociology and Political Science ,media_common.quotation_subject ,05 social sciences ,0211 other engineering and technologies ,02 engineering and technology ,Regression ,0506 political science ,Weighting ,Causal inference ,Political Science and International Relations ,050602 political science & public administration ,Econometrics ,Sensitivity (control systems) ,Information bias ,media_common - Abstract
The estimation of causal effects has a revered place in all fields of empirical political science, but a large volume of methodological and applied work ignores a fundamental fact: most people are skeptical of estimated causal effects. In particular, researchers are often worried about the assumption of no omitted variables or no unmeasured confounders. This article combines two approaches to sensitivity analysis to provide researchers with a tool to investigate how specific violations of no omitted variables alter their estimates. This approach can help researchers determine which narratives imply weaker results and which actually strengthen their claims. This gives researchers and critics a reasoned and quantitative approach to assessing the plausibility of causal effects. To demonstrate the approach, I present applications to three causal inference estimation strategies: regression, matching, and weighting.
- Published
- 2014
- Full Text
- View/download PDF
17. Instrumental Variable Methods for Conditional Effects and Causal Interaction in Voter Mobilization Experiments
- Author
-
Matthew Blackwell
- Subjects
Statistics and Probability ,Mobilization ,media_common.quotation_subject ,05 social sciences ,Instrumental variable ,01 natural sciences ,Democracy ,0506 political science ,010104 statistics & probability ,Extant taxon ,Voting ,Causal inference ,050602 political science & public administration ,Econometrics ,Economics ,Voter turnout ,Diminishing returns ,0101 mathematics ,Statistics, Probability and Uncertainty ,media_common - Abstract
In democratic countries, voting is one of the most important ways for citizens to influence policy and hold their representatives accountable. And yet, in the United States and many other countries, rates of voter turnout are alarmingly low. Every election cycle, mobilization efforts encourage citizens to vote and ensure that elections reflect the true will of the people. To establish the most effective way of encouraging voter turnout, this article seeks to differentiate between (1) the synergy hypothesis that multiple instances of voter contact increase the effectiveness of a single form of contact, and (2) the diminishing returns hypothesis that multiple instances of contact are less effective or even counterproductive. Remarkably, previous studies have been unable to compare these hypotheses because extant approaches to analyzing experiments with noncompliance cannot speak to questions of causal interaction. I resolve this impasse by extending the traditional instrumental variables framework to accommodate multiple treatment–instrument pairs, which allows for the estimation of conditional and interaction effects to adjudicate between synergy and diminishing returns. The analysis of two voter mobilization field experiments provides the first evidence of diminishing returns to follow-up contact and a cautionary tale about experimental design for these quantities. Supplementary materials for this article are available online.
- Published
- 2017
- Full Text
- View/download PDF
18. A Framework for Dynamic Causal Inference in Political Science
- Author
-
Matthew Blackwell
- Subjects
Dilemma ,Sociology and Political Science ,Process (engineering) ,Causal inference ,Political Science and International Relations ,Econometrics ,Contrast (statistics) ,Omitted-variable bias ,Context (language use) ,Adversary ,Set (psychology) ,Psychology ,Data science - Abstract
Dynamic strategies are an essential part of politics. In the context of campaigns, for example, candidates continuously recalibrate their campaign strategy in response to polls and opponent actions. Traditional causal inference methods, however, assume that these dynamic decisions are made all at once, an assumption that forces a choice between omitted variable bias and posttreatment bias. Thus, these kinds of “single-shot” causal inference methods are inappropriate for dynamic processes like campaigns. I resolve this dilemma by adapting methods from biostatistics, thereby presenting a holistic framework for dynamic causal inference. I then use this method to estimate the effectiveness of an inherently dynamic process: a candidate’s decision to “go negative.” Drawing on U.S. statewide elections (2000–2006), I find, in contrast to the previous literature and alternative methods, that negative advertising is an effective strategy for nonincumbents. I also describe a set of diagnostic tools and an approach to sensitivity analysis.
- Published
- 2012
- Full Text
- View/download PDF
19. 'Awakenings': Electrocardiographic Findings in Central Sleep Apnea
- Author
-
Laszlo Littmann, J. Matthew Blackwell, and Ross M. Nesbit
- Subjects
Bradycardia ,Tachycardia ,medicine.medical_specialty ,Central sleep apnea ,business.industry ,Sleep apnea ,General Medicine ,medicine.disease ,Heart rate turbulence ,Physiology (medical) ,Anesthesia ,Internal medicine ,Heart rate ,Cardiology ,Breathing ,Medicine ,Sleep study ,medicine.symptom ,Cardiology and Cardiovascular Medicine ,business - Abstract
Central sleep apnea is an important but frequently missed clinical diagnosis. The purpose of this clinical case series is to demonstrate that in a subset of patients with central sleep apnea, inpatient telemetry ECG recordings may reveal a consistent relationship between changes in sinus rate, AV conduction, and the presence and rate of respiratory artifact that should raise the clinical suspicion of central sleep apnea. In the three presented cases, marked sinus bradycardia or AV block was followed by the simultaneous occurrence of abrupt acceleration of heart rate and the appearance of rapid micro-oscillations consistent with respiratory artifact. These changes suggested central sleep apnea characterized by bradycardia during the apneic spells followed by awakening of the breathing center and postvagal tachycardia. In each case, central sleep apnea was confirmed by visual observation of the patient, documentation of arterial desaturations during episodes of bradycardia, and in two, by a subsequent sleep study. Physicians should be aware of the potential and significance of these electrocardiographic disturbances in patients with central sleep apnea.
- Published
- 2010
- Full Text
- View/download PDF
20. The Political Legacy of American Slavery
- Author
-
Maya Sen, Avidit Acharya, and Matthew Blackwell
- Subjects
African american population ,Racial threat ,Affirmative action ,Sociology and Political Science ,media_common.quotation_subject ,Interpretation (philosophy) ,05 social sciences ,Gender studies ,Criminology ,0506 political science ,Politics ,Incentive ,Spanish Civil War ,Feeling ,Political economy ,Political science ,0502 economics and business ,050602 political science & public administration ,Racial resentment ,050207 economics ,media_common - Abstract
We show that contemporary differences in political attitudes across counties in the American South in part trace their origins to slavery’s prevalence more than 150 years ago. Whites who currently live in Southern counties that had high shares of slaves in 1860 are more likely to identify as a Republican, oppose affirmative action, and express racial resentment and colder feelings toward blacks. We show that these results cannot be explained by existing theories, including the theory of contemporary racial threat. To explain the results, we offer evidence for a new theory involving the historical persistence of political attitudes. Following the Civil War, Southern whites faced political and economic incentives to reinforce existing racist norms and institutions to maintain control over the newly freed African American population. This amplified local differences in racially conservative political attitudes, which in turn have been passed down locally across generations.
- Published
- 2014
- Full Text
- View/download PDF
21. Amelia II: A Program for Missing Data
- Author
-
Matthew Blackwell, Gary King, and James Honaker
- Subjects
Statistics and Probability ,multiple imputation ,Computer science ,business.industry ,Markov chain Monte Carlo ,Machine learning ,computer.software_genre ,Missing data ,R package ,symbols.namesake ,missing data ,Bayesian priors ,symbols ,Imputation (statistics) ,Artificial intelligence ,Data mining ,Statistics, Probability and Uncertainty ,business ,bootstrap ,computer ,lcsh:Statistics ,lcsh:HA1-4737 ,Software ,Graphical user interface - Abstract
Amelia II is a complete R package for multiple imputation of missing data. The package implements a new expectation-maximization with bootstrapping algorithm that works faster, with larger numbers of variables, and is far easier to use, than various Markov chain Monte Carlo approaches, but gives essentially the same answers. The program also improves imputation models by allowing researchers to put Bayesian priors on individual cell values, thereby including a great deal of potentially valuable and extensive information. It also includes features to accurately impute cross-sectional datasets, individual time series, or sets of time series for different cross-sections. A full set of graphical diagnostics are also available. The program is easy to use, and the simplicity of the algorithm makes it far more robust; both a simple command line and extensive graphical user interface are included.
- Published
- 2011
22. cem: Coarsened Exact Matching in Stata
- Author
-
Stefano Maria Iacus, Gary King, Giuseppe Porro, and Matthew Blackwell
- Subjects
Observational error ,imbalance ,multiple imputation ,business.industry ,matching ,Causal effect ,Exact matching ,cem ,coarsened exact matching ,causal inference ,balance ,Monotonic function ,Machine learning ,computer.software_genre ,Mathematics (miscellaneous) ,Bounding overwatch ,Causal inference ,Covariate ,Artificial intelligence ,Invariant (mathematics) ,business ,computer ,Algorithm ,Mathematics - Abstract
In this article, we introduce a Stata implementation of coarsened exact matching, a new method for improving the estimation of causal effects by reducing imbalance in covariates between treated and control groups. Coarsened exact matching is faster, is easier to use and understand, requires fewer assumptions, is more easily automated, and possesses more attractive statistical properties for many applications than do existing matching methods. In coarsened exact matching, users temporarily coarsen their data, exact match on these coarsened data, and then run their analysis on the uncoarsened, matched data. Coarsened exact matching bounds the degree of model dependence and causal effect estimation error by ex ante user choice, is monotonic imbalance bounding (so that reducing the maximum imbalance on one variable has no effect on others), does not require a separate procedure to restrict data to common support, meets the congruence principle, is approximately invariant to measurement error, balances all nonlinearities and interactions in sample (i.e., not merely in expectation), and works with multiply imputed datasets. Other matching methods inherit many of the coarsened exact matching method's properties when applied to further match data preprocessed by coarsened exact matching. The cem command implements the coarsened exact matching algorithm in Stata.
- Published
- 2009
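The coarsen-then-exact-match procedure described in the abstract above can be sketched directly. This is a hypothetical Python illustration of the idea, not the Stata cem command: covariates are coarsened into equal-width bins (cem also accepts user-chosen cutpoints), units are grouped by their joint coarsened signature, and only strata containing both treated and control units are retained.

```python
import numpy as np
from collections import defaultdict

def cem(X, treat, bins=5):
    """Minimal sketch of coarsened exact matching (not the Stata cem command).

    X     : (n, p) covariate matrix
    treat : length-n sequence of 0/1 treatment indicators
    Returns the sorted indices of units kept after matching.
    """
    X = np.asarray(X, dtype=float)
    # Coarsen each covariate into equal-width bins.
    coarse = np.column_stack([
        np.digitize(col, np.linspace(col.min(), col.max(), bins + 1)[1:-1])
        for col in X.T
    ])
    # Group units by their joint coarsened signature (the stratum).
    strata = defaultdict(list)
    for i, sig in enumerate(map(tuple, coarse)):
        strata[sig].append(i)
    # Keep only strata with at least one treated and one control unit.
    keep = []
    for idx in strata.values():
        has_t = any(treat[i] == 1 for i in idx)
        has_c = any(treat[i] == 0 for i in idx)
        if has_t and has_c:
            keep.extend(idx)
    return sorted(keep)
```

The subsequent analysis is then run on the retained units, typically with stratum-size weights for control units (omitted here for brevity).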
Discovery Service for Jio Institute Digital Library