152 results for "DECISION"
Search Results
2. Using Multimedia Approaches to Communicate Probabilities in Patient Decision Aids for Low-Literacy Populations: Randomized Trial
- Author
-
Foundation for Informed Medical Decision Making
- Published
- 2023
3. Double-Blind Randomized Controlled Trial for the Evaluation of a Novel Adaptive Attention Training in Healthy Adolescents
- Author
-
Alliance for Decision Education
- Published
- 2023
4. Regional Educational Laboratory Researcher-Practitioner Partnerships: Documenting the Research Alliance Experience. REL 2018-291
- Author
-
National Center for Education Evaluation and Regional Assistance (ED), Decision Information Resources, Inc., Scher, Lauren, McCowan, Ronald, and Castaldo-Walsh, Cynthia
- Abstract
This report provides a detailed account of the Regional Educational Laboratory (REL) Program's experience establishing and supporting research-practice partnerships (called "research alliances") during its 2012-17 contract cycle. The report adds to the growing literature base on researcher-practitioner partnerships by sharing how the RELs reported creating, engaging, and maintaining multiple partnerships, with the purpose of informing future collaborative efforts for researchers and practitioners and for those who wish to support research-practice partnerships. It addresses questions about: how REL research alliances fit within the broader context of research-practice partnerships; what characteristics existed among REL research alliances and how they evolved over time; and what challenges RELs reported experiencing while establishing and supporting research alliances and the strategies RELs employed to address those challenges. Finally, the paper discusses the implications of the REL research alliance experience for other networks of research-practice partnerships.
- Published
- 2018
5. Multi-Armed RCTs: A Design-Based Framework. NCEE 2017-4027
- Author
-
National Center for Education Evaluation and Regional Assistance (ED), Decision Information Resources, Inc., and Schochet, Peter Z.
- Abstract
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens and Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples with simple variance estimators. The methods apply to randomized controlled trials (RCTs) and quasi-experimental designs (QEDs) with comparison groups for a wide range of designs used in social policy research. The methods have important advantages over traditional model-based impact estimation methods, such as hierarchical linear model (HLM) and robust cluster standard error (RCSE) methods, and perform well in simulations (Schochet, 2016). The free "RCT-YES" software (www.rct-yes.com) estimates and reports impacts using these design-based methods. This report discusses several key topics for estimating average treatment effects (ATEs) for multi-armed designs. The report is geared toward methodologists with a strong background in statistical theory and a good knowledge of design-based concepts for the single treatment-control group (two-group) design. The report builds on Schochet (2016), referencing key results and formulas to avoid repetition, and serves as a supplement to that report. The focus is on RCTs, although key concepts apply also to QEDs with comparison groups. The report is in three sections. Section 1 discusses how design-based ATE estimators for the two-group design need to be modified for the multi-armed design when comparing pairs of research groups to each other. Section 2 discusses multiple comparison adjustments when conducting hypothesis tests across pairwise contrasts to identify the most effective interventions. Finally, Section 3 shows that the assumptions required to identify and estimate the complier average causal effect (CACE) parameter using an instrumental variable (IV) framework become much more complex in the multi-armed context, and may not be possible in some cases. While Sections 2 and 3 are germane to multi-armed designs regardless of the impact estimation methods used for the analysis, these sections emphasize approaches that align with the non-parametric underpinnings of the design-based framework. [For related reports see: "What Is Design-Based Causal Inference for RCTs and Why Should I Use It? NCEE 2017-4025" (ED575014) and "Comparing Impact Findings from Design-Based and Model-Based Methods: An Empirical Investigation. NCEE 2017-4026" (ED575021).]
- Published
- 2017
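A rough illustration of the multiple-comparison issue raised in the abstract above. This is the generic Bonferroni correction, offered only as an example; the report itself emphasizes adjustments consistent with its design-based framework, which may differ:

\[
\text{number of pairwise contrasts} = \binom{k}{2} = \frac{k(k-1)}{2}, \qquad \alpha^{*} = \frac{\alpha}{\binom{k}{2}},
\]

so a four-armed trial tested at an overall level of \(\alpha = 0.05\) has six pairwise contrasts, each tested at \(\alpha^{*} \approx 0.0083\).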
6. Comparing Impact Findings from Design-Based and Model-Based Methods: An Empirical Investigation. NCEE 2017-4026
- Author
-
National Center for Education Evaluation and Regional Assistance (ED), Decision Information Resources, Inc., Kautz, Tim, Schochet, Peter Z., and Tilley, Charles
- Abstract
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs to connect statistical methods to the building blocks of causal inference. They differ from model-based methods that have commonly been used in education research, including hierarchical linear model (HLM) methods and robust cluster standard error (RCSE) methods for clustered designs. In comparison to model-based methods, the design-based methods tend to make fewer assumptions about the nature of the data and also more explicitly account for known information about the experimental and sampling designs. While these theoretical differences suggest the corresponding estimates might differ, it is unclear how much of a practical difference it makes to use design-based methods versus more conventional model-based methods. This study addresses this question by re-analyzing nine past RCTs in the education area using both design- and model-based methods. The study uses real data, rather than simulated data, to better explore the differences that would arise in practice. In order to investigate the full scope of differences between the methods, the study uses data generated from different types of randomization designs commonly used in social policy research: (1) non-clustered designs in which individuals are randomized; (2) clustered designs in which groups are randomized; (3) non-blocked designs in which randomization is conducted for a single population; and (4) blocked (stratified) designs in which randomization is conducted separately within partitions of the sample. The study conducts the design-based analyses using "RCT-YES," a free software package funded by the Institute of Education Sciences (IES) that applies design-based methods to a wide range of RCT designs (www.rct-yes.com). This report focuses on two analyses that compare model- and design-based methods, both of which suggest there is little substantive difference in the results between the two methods. For both analyses, the study uses a reference model-based method that is similar to the one used in the original evaluation. In the first analysis, the study compares the reference model-based method to a design-based method with underlying assumptions that most closely align with those of the reference model-based method. In the second analysis, the report presents a sensitivity check that compares the reference model-based method to an alternative design-based method. In particular, the alternative method is based on the default settings in the "RCT-YES" software, which correspond to an alternative set of plausible assumptions. The findings from both analyses suggest that model- and design-based methods yield very similar results in terms of the magnitude of impact estimates, statistical significance of the impact estimates, and implications for policy. To contextualize the differences in impact estimates between design- and model-based methods, the report also presents a third analysis, which compares estimates from two commonly used model-based methods: (1) HLM methods; and (2) linear models with ordinary least squares (OLS) assumptions and RCSE to account for clustering. Importantly, this analysis suggests that the differences between the design- and model-based methods (with similar assumptions) are no greater than the differences that would arise between commonly used model-based methods. The study suggests that researchers should select estimators with assumptions that best suit the goals of their study regardless of whether they use a design- or model-based approach. Moreover, researchers should consider the trade-offs between different assumptions, and how these assumptions affect the interpretation of findings. Appended are: (1) Hierarchical linear model methods; and (2) Detailed description of studies and results. [For related reports see: "What Is Design-Based Causal Inference for RCTs and Why Should I Use It? NCEE 2017-4025" (ED575014) and "Multi-Armed RCTs: A Design-Based Framework. NCEE 2017-4027" (ED575022).]
- Published
- 2017
7. What Is Design-Based Causal Inference for RCTs and Why Should I Use It? NCEE 2017-4025
- Author
-
Decision Information Resources, Inc., National Center for Education Evaluation and Regional Assistance (ED), and Schochet, Peter Z.
- Abstract
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and quasi-experimental designs (QEDs) with treatment and control (comparison) groups. Importantly, design-based estimators are acceptable for What Works Clearinghouse (WWC) evidence reviews (Scher and Cole, 2017). Although the fundamental concepts that underlie design-based methods are straightforward, the literature on these methods is technical, with detailed mathematical proofs required to formalize the theory. Thus, the daunting task of wading through this literature might discourage some education researchers from using design-based methods in favor of more traditional "model-based" methods, such as hierarchical linear modeling (HLM) (Raudenbush and Bryk, 2002). This brief aims to broaden knowledge of design-based methods by providing intuition on their key concepts and how they compare to model-based methods as typically implemented. Using simple mathematical notation, the brief is geared toward education researchers with a good knowledge of evaluation designs and HLM. The discussion synthesizes Schochet (2016), omitting details for brevity and accessibility. The focus is on RCTs, although key concepts apply also to QEDs. [For related reports see: "Comparing Impact Findings from Design-Based and Model-Based Methods: An Empirical Investigation. NCEE 2017-4026" (ED575021) and "Multi-Armed RCTs: A Design-Based Framework. NCEE 2017-4027" (ED575022).]
- Published
- 2017
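A minimal sketch of the design-based (Neyman) estimator that underlies the approach described in the entry above, for a simple two-group RCT. This is the textbook form, not necessarily the exact estimator implemented in "RCT-YES":

\[
\hat{\tau} = \bar{Y}_T - \bar{Y}_C, \qquad
\widehat{\mathrm{Var}}(\hat{\tau}) = \frac{s_T^2}{n_T} + \frac{s_C^2}{n_C},
\]

where \(\bar{Y}_T\) and \(\bar{Y}_C\) are the mean outcomes of the treatment and control groups, \(s_T^2\) and \(s_C^2\) their sample variances, and \(n_T\), \(n_C\) the group sizes. Under the potential-outcomes framework this variance estimator is conservative for the average treatment effect (it ignores the unobservable covariance of potential outcomes), which is why it remains valid with minimal assumptions.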
8. Studying Educational Effectiveness in Rural Settings: A Guide for Researchers
- Author
-
Decision Information Resources, Inc., National Center for Research on Rural Education (R2Ed), Sheridan, Susan, Dynarski, Mark, and Bovaird, James
- Abstract
This guide provides experienced education researchers with suggestions for conducting high-quality effectiveness studies to overcome research challenges common to rural settings. The guide addresses four factors that researchers must consider when conducting educational effectiveness research in rural settings: (1) study design, (2) recruitment of participants, (3) supporting and monitoring implementation of the intervention, and (4) data collection. The guide presents economical study designs that can help researchers achieve adequate statistical precision, use cost-effective strategies to support and monitor implementation, and develop alternative approaches for reducing the costs of data collection. [This guide was written with the assistance of Leslie Hawley, Amanda Witte, Shannon Holmes, Mike Coutts, and Ann Arthur].
- Published
- 2017
9. Descriptive Analysis in Education: A Guide for Researchers. NCEE 2017-4023
- Author
-
National Center for Education Evaluation and Regional Assistance (ED), Decision Information Resources, Inc., Loeb, Susanna, Dynarski, Susan, McFarland, Daniel, Morris, Pamela, Reardon, Sean, and Reber, Sarah
- Abstract
Whether the goal is to identify and describe trends and variation in populations, create new measures of key phenomena, or describe samples in studies aimed at identifying causal effects, description plays a critical role in the scientific process in general and education research in particular. Descriptive analysis identifies patterns in data to answer questions about who, what, where, when, and to what extent. This guide describes how to more effectively approach, conduct, and communicate quantitative descriptive analysis. The primary audience for this guide includes members of the research community who conduct and publish both descriptive and causal studies, although it could also be useful for policymakers and practitioners who are consumers of research findings. The guide contains chapters that discuss the important role descriptive analysis plays; how to approach descriptive analysis; how to conduct descriptive analysis; and how to communicate descriptive analysis findings. The following are appended: (1) Resources Related Especially to Communications and Visualization; and (2) References. [Tom Szuba was a contributing writer for this report.]
- Published
- 2017
10. What Does It Mean When a Study Finds No Effects? REL 2017-265
- Author
-
Decision Information Resources, Inc., National Center for Education Evaluation and Regional Assistance (ED), and Seftor, Neil
- Abstract
This short brief for education decision makers discusses three main factors that may contribute to a finding of no effects: failure of theory, failure of implementation, and failure of research design. It provides readers with questions to ask themselves to better understand "no effects" findings, and describes other contextual factors to consider when deciding what to do next.
- Published
- 2016
11. Going Public: Writing about Effectiveness Studies and Their Results. REL 2016-173
- Author
-
Decision Information Resources, Inc., National Center for Education Evaluation and Regional Assistance (ED), and Dynarski, Mark
- Abstract
This brief provides tips writers can use to make impact research more digestible and actionable for policymakers and practitioners. The brief emphasizes five tips: make the contrast clear, make causal statements only when they result from causal research designs, present numbers simply and concretely, describe effects in meaningful units, and present findings as information for policy. The brief provides positive examples of plain writing in the five key aspects of an effectiveness study to support the writer's ability to use each of these five tips.
- Published
- 2016
12. Electronic Decision Support for Intervention in Poorly Controlled Type 2 Diabetes (PATH)
- Author
-
PATH Decision Support Software, LLC
- Published
- 2019
13. QuitAdvisorDDS: A Point-of-Care Tobacco Cessation Tool for Dental Settings (QA-DDS)
- Author
-
National Institute of Dental and Craniofacial Research (NIDCR), University of Alabama at Birmingham, The National Dental Practice-Based Research Network, Health Decision Technologies, LLC, and Jamie Studts, Principal Investigator
- Published
- 2019
14. Designing and Conducting Strong Quasi-Experiments in Education. Version 2
- Author
-
Decision Information Resources, Inc., Scher, Lauren, Kisker, Ellen, and Dynarski, Mark
- Abstract
The purpose of this paper is to describe best practices in designing and implementing strong quasi-experimental designs (QEDs) when assessing the effectiveness of policies, programs, or practices. The paper first discusses the issues researchers face when choosing to conduct a QED, as opposed to a more rigorous randomized controlled trial design. Next, the paper documents four sets of best practices in designing and implementing QEDs, including: (1) considering unobserved variables, (2) selecting appropriate matching strategies for creating comparison groups, (3) following general guidelines for sound research, and (4) addressing the What Works Clearinghouse (WWC) standards. The paper then presents 31 frequently asked questions related to QEDs meeting WWC standards with reservations and discusses common pitfalls that cause studies not to meet WWC standards. Topics covered in the frequently asked questions section include: study design/group formation, outcomes, confounding factors, baseline equivalence, sample loss and power, and analytic techniques. The paper provides a detailed checklist of issues researchers should consider when designing a strong QED study, and discusses why each issue is important and how it relates to WWC standards. One appendix is included: Checklist for Quasi-Experimental Designs: Study Design Characteristics to Consider.
- Published
- 2015
15. A Guide to Using State Longitudinal Data for Applied Research. NCEE 2015-4013
- Author
-
National Center for Education Evaluation and Regional Assistance (ED), Decision Information Resources, Inc., Levesque, Karen, Fitzgerald, Robert, and Pfeiffer, Jay
- Abstract
State longitudinal data systems (SLDSs) promise a rich source of data for education research. SLDSs contain statewide student data that can be linked over time and to additional data sources for education management, reporting, improvement, and research, and ultimately for informing education policy and practice. Authored by Karen Levesque, Robert Fitzgerald, and Joy Pfeiffer of RTI International, this guide is intended for researchers who are familiar with research methods, but who are new to using SLDS data, are considering conducting SLDS research in a new state environment, or are expanding into new topic areas that can be explored using SLDS data. The guide also may be useful for state staff as background for interacting with researchers and may help state staff and researchers communicate across their two cultures. It highlights the opportunities and constraints that researchers may encounter in using state longitudinal data systems and offers approaches to addressing some common problems. The following are appended: (1) Sample High School Feedback Reports; (2) Characteristics of statewide student data systems, by state: 2009-2010; (3) The Family Education Rights and Privacy Act--guidance for reasonable methods and written agreements; and (4) Additional questions for confirming specific data availability.
- Published
- 2015
16. Statistical Theory for the 'RCT-YES' Software: Design-Based Causal Inference for RCTs. NCEE 2015-4011
- Author
-
National Center for Education Evaluation and Regional Assistance (ED), Decision Information Resources, Inc., and Schochet, Peter Z.
- Abstract
This report presents the statistical theory underlying the "RCT-YES" software that estimates and reports impacts for RCTs for a wide range of designs used in social policy research. The report discusses a unified, non-parametric design-based approach for impact estimation using the building blocks of the Neyman-Rubin-Holland causal inference model that underlies experimental designs. This approach differs from the more model-based impact estimation methods that are typically used in education research. The report discusses impact and variance estimation, asymptotic distributions of the estimators, hypothesis testing, the inclusion of baseline covariates to improve precision, the use of weights, subgroup analyses, baseline equivalency analyses, and estimation of the complier average causal effect parameter. A section on mathematical proofs is appended.
- Published
- 2015
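The complier average causal effect (CACE) parameter mentioned in the abstract above is conventionally estimated with an instrumental-variables (Bloom) ratio; the following is a generic sketch under the usual exclusion and monotonicity assumptions, not a restatement of the report's specific formulas:

\[
\widehat{\mathrm{CACE}} = \frac{\bar{Y}_T - \bar{Y}_C}{\bar{D}_T - \bar{D}_C},
\]

where \(\bar{Y}_T - \bar{Y}_C\) is the intent-to-treat impact on the outcome and \(\bar{D}_T - \bar{D}_C\) is the difference in treatment receipt (take-up) rates between the assigned treatment and control groups.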
17. The Right Tools for the Job--Technology Options for Adult Online Learning and Collaboration
- Author
-
Institute of Education Sciences (ED) and Decision Information Resources, Inc.
- Abstract
Many options exist for using technology as a tool for adult learning, and sharing information online is easier than it has ever been. Online learning technology has grown from one-sided communications to numerous options for audience engagement and interactivity. This guide introduces a variety of tools, online platforms, and terms related to current online learning technology and gives advice on selecting the best option for the reader's needs. Users of these tools are encouraged to "test drive" a variety of approaches before identifying the best approach to use in different circumstances. Appendix A provides a glossary of online learning terms. For information on how to create online learning sessions, consider the list of resources provided in Appendix B. Contents include: (1) Goal and Definitions; and (2) Choosing the Right Technology: Three Types of Tools (Webinars, Screen Shares and Conference Calls, and Video Chats).
- Published
- 2014
18. Forming a Team to Ensure High-Quality Measurement in Education Studies. REL 2014-052
- Author
-
National Center for Education Evaluation and Regional Assistance (ED), Decision Information Resources, Inc., Kisker, Ellen Eliason, and Boller, Kimberly
- Abstract
This brief provides tips for forming a team of staff and consultants with the needed expertise to make key measurement decisions that will ensure high-quality data for answering the study's research questions. The brief outlines the main responsibilities of measurement team members. It also describes typical measurement tasks and discusses how the measurement team members can work together to complete the measurement tasks successfully. A list of additional resources is included.
- Published
- 2014
19. Reporting What Readers Need to Know about Education Research Measures: A Guide. REL 2014-064
- Author
-
Decision Information Resources, Inc., National Center for Education Evaluation and Regional Assistance (ED), Boller, Kimberly, and Kisker, Ellen Eliason
- Abstract
This guide is designed to help researchers make sure that their research reports include enough information about study measures so that readers can assess the quality of the study's methods and results. The guide also provides examples of write-ups about measures and suggests resources for learning more about these topics. The guide assumes that researchers have: (1) clearly articulated their research questions; (2) completed a rigorous review of the leading measures for assessing each necessary component of the theory of change and the relevant domains and constructs; and (3) selected measures that are aligned with the intervention's theory of change (also referred to as a logic model), and that address the study's research questions. These measures may include contextual factors, inputs to implementation, expected intervention activities, outputs, and both short-term and long-term outcomes (Lugo-Gil et al. 2011; W.K. Kellogg 2004). Also provided are five checklists to help researchers provide complete information describing: (1) their study's measures; (2) data collection training and quality; (3) the study's reference population, study sample, and measurement timing; (4) evidence of the reliability and construct validity of the measures; and (5) missing data and descriptive statistics. The brief includes an example of parts of a report's methods and results section illustrating how the checklists can be used to check the completeness of reporting. A bibliography is also included. The appendix contains "Measures Reporting Checklist for Researchers".
- Published
- 2014
20. Graphic Design for Researchers
- Author
-
Institute of Education Sciences (ED), Decision Information Resources, Inc., and Mathematica Policy Research, Inc.
- Abstract
Technology continues to radically change how we create and consume information. Today, news, reports, and other material are often delivered quickly through pictures, colors, or other eye-catching visual elements. Words still matter, but they may be tweeted, viewed on a smartphone, or placed in a call-out box in a report. The design of these items can greatly affect whether your reader notices, reads, or understands the words that you write. This guide offers a basic overview on how researchers can effectively use design to create engaging and visually appealing Regional Educational Laboratory (REL) products. It will cover some key concepts behind good design and discuss how to use basic elements like photographs, images, color, tables, figures, and type to create useful publications and digital products. The guide also touches on how researchers can use data visualization to make complex concepts accessible.
- Published
- 2014
21. Comparison of Ways to Prepare Patients for Decisions About Joint Replacement Surgery
- Author
-
Foundation for Informed Medical Decision Making, The Ottawa Hospital, Queensway Carleton Hospital, University of Ottawa, University of Toronto, University of Chicago, Northwestern University Feinberg School of Medicine, Northwestern University, and Dartmouth-Hitchcock Medical Center
- Published
- 2018
22. Federal Evaluation of Selected Programs for Expectant and Parenting Youth (PEPY)
- Author
-
The Office of Adolescent Health, HHS and Decision Information Resources, Inc.
- Published
- 2018
23. Study Testing Patient Decision Tools Related to the Risks and Benefits of Weight Loss Surgery (POINT of View)
- Author
-
Foundation for Informed Medical Decision Making
- Published
- 2017
24. Teen Pregnancy Prevention Replication Study
- Author
-
Department of Health and Human Services and Decision Information Resources (DIR)
- Published
- 2017
25. Integrated Postsecondary Education Data System Data Quality Study. Methodology Report. NCES 2005-175
- Author
-
National Center for Education Statistics (ED), Washington, DC., Decision Information Resources, Inc., Houston, TX., Mathematica Policy Research, Inc., Cambridge, MA., Jackson, Kenneth W., Peecksen, Scott, Jang, Donsig, and Sukasih, Amang
- Abstract
The Integrated Postsecondary Education Data System (IPEDS) of the National Center for Education Statistics (NCES) was initiated in 1986 to collect data about all identified institutions whose primary purpose is to provide postsecondary education. Postsecondary education is defined within IPEDS as "the provision of a formal instructional program whose curriculum is designed primarily for students who are beyond the compulsory age for high school. This includes programs whose purpose is academic, vocational, and continuing education, and excludes avocational and adult basic education programs." Since 1992, IPEDS has focused on institutions participating in Title IV Federal Financial Assistance programs. In fact, institutions participating in Title IV Federal Financial Assistance programs are required to provide IPEDS data. IPEDS data are collected as nine separate components--institutional characteristics, completions, employees by assigned position, salaries, fall staff, enrollment, student financial aid, finance, and graduation rates. Tuition and price data are collected as a part of the institutional characteristics component. These data are collected annually in three distinct data collections: fall, winter, and spring. Each collection uses web-based survey procedures. Appended are: (1) Imputation Flag Values; (2) Variables in the Final Analysis File for the Tuition and Price Component; (3) Variables in the Final Analysis File for the Employees by Assigned Position Component; (4) Variables in the Final Analysis File for the Completions Component; (5) Variables in the Final Analysis File for the Enrollment Component; (6) Variables in the Final Analysis File for the Student Financial Aid Component; (7) Variables in the Final Analysis File for the Finance Component; (8) Variables in the Final Analysis File for the Salaries Component; and (9) Variables in the Final Analysis File for the Graduation Rates Component. (Contains 171 tables.)
- Published
- 2005
26. Head Start Impact Study: First Year Findings
- Author
-
Administration for Children and Families (DHHS), Office of Planning, Research & Evaluation, Westat, Inc., Urban Institute, Decision Information Resources, Inc., American Institutes for Research, Chesapeake Research Associates, LLC, Puma, Michael, Bell, Stephen, and Cook, Ronna
- Abstract
The Congressionally-mandated Head Start Impact Study is being conducted across 84 nationally representative grantee/delegate agencies. Approximately 5,000 newly entering 3- and 4-year-old children applying for Head Start were randomly assigned to either a Head Start group that had access to Head Start program services or to a non-Head Start group that could enroll in available community non-Head Start services, selected by their parents. Data collection began in fall 2002 and is scheduled to continue through 2006, following children through the spring of their 1st-grade year. The study quantifies the impact of Head Start separately for 3- and 4-year-old children across child cognitive, social-emotional, and health domains as well as on parenting practices. For children in the 3-year-old group, the preliminary results from the first year of data collection demonstrate small to moderate positive effects favoring the children enrolled in Head Start for some outcomes in each domain. Fewer positive impacts were found for children in the 4-year-old group. Appended are: (1) Section 649(G) of the Head Start Act, 1998 (PL 105-285); (2) Calculating Analytical Sampling Weights for Fall 2002 and Spring 2003; (3) Language Decision Form; (4) Citations for Child Assessments, Scales, and Observation Instruments; (5) Comparison of Head Start Grantees/Delegate Agencies and Centers in Saturated and Non-Saturated Communities; (6) Determination of Head Start Participation; (7) The Racial/Ethnic Composition of the Study Sample; (8) Differences between Main Arrangement and Focal Arrangement; (9) Imputations for Item Nonresponse in the Fall 2002 Data; (10) Comparison of Weighted and Unweighted Mean Differences by Age Cohort; (11) Impact Regression Procedures; (12) Measures Of Fall 2002 "Starting Points" Used in the Regression Models, by Child and Parent Outcomes; (13) Tests for Lack of Impact of Head Start on Demographic and Developmental Factors Measured in Fall 2002; (14) Basis for Assuming That Non-Participants Experienced No Intervention Effects; (15) Cognitive Domain, Estimated Impact on Program Participants; (16) Factors That Moderate the Impact of Head Start: Detailed Tables for Cognitive Outcomes; (17) Social-Emotional Domain Estimated Impact on Program Participants; (18) Factors That Moderate the Impact of Head Start: Detailed Tables for Social-Emotional Outcomes; (19) Health Domain, Estimated Impact of Program Participation; (20) Factors That Moderate the Impact of Head Start: Detailed Tables for Health Outcomes; (21) Parenting Practices Domain, Estimated Impact of Program Participation; and (22) Factors That Moderate the Impact of Head Start: Detailed Tables for Parenting Outcomes. Individual sections contain footnotes. (Contains 47 exhibits.) [Contributors include: Nicholas Zill, Gary Shapiro, Pam Broene, Debra Mekos, Monica Rohacek, Liz Quinn, Gina Adams, Janet Friedman, and Haidee Bernstein. For the "Head Start Impact Study: First Year Findings. Executive Summary," see ED543009.]
- Published
- 2005
27. Activating Seniors to Improve Chronic Disease Care
- Author
-
Foundation for Informed Medical Decision Making
- Published
- 2016
28. Wiser Choices in Osteoporosis Choice II: A Decision Aid for Patients and Clinicians
- Author
-
Foundation for Informed Medical Decision Making, Olmsted Medical Center, and Victor Montori, MD
- Published
- 2016
29. National Job Corps Study: The Impacts of Job Corps on Participants' Employment and Related Outcomes [and] Methodological Appendixes on the Impact Analysis.
- Author
-
Mathematica Policy Research, Washington, DC., Battelle Memorial Inst., Seattle, WA., Decision Information Resources, Inc., Houston, TX., Schochet, Peter Z., Burghardt, John, and Glazerman, Steven
- Abstract
A study involving random assignment of all youth eligible for Job Corps to either a Job Corps program or to a control group was conducted to assess the impact of Job Corps on key participant outcomes. Participants in the study were nationwide youth eligible for Job Corps who applied for enrollment for the first time between November 16, 1994, and December 17, 1995. The study sought to determine the following: (1) how effectively Job Corps improves the employability of disadvantaged participants, (2) whether Job Corps impacts differ for youths with different baseline characteristics, and (3) how effective the residential and nonresidential components of Job Corps are. Findings over the first 4 years after random assignment include the following: (1) Job Corps provided extensive education, training, and other services to the program group and improved their educational attainment; (2) Job Corps generated positive employment and earnings impacts by the beginning of the third year after random assignment and the impacts persisted through the fourth year; (3) employment and earnings gains were found broadly across most subgroups of students; (4) the residential and nonresidential programs were each effective for the youths they served; (5) Job Corps significantly reduced youths' involvement with the criminal justice system; (6) Job Corps had small beneficial impacts on the receipt of public assistance and self-assessed health status, but no impacts on illegal drug use; and (7) Job Corps had no impacts on fertility or custodial responsibility, but it slightly promoted independent living and mobility. (The report includes numerous tables and charts, 31 references, and five appendixes concerning the study methodology.) (KC)
- Published
- 2001
30. National Job Corps Study: The Benefits and Costs of Job Corps.
- Author
-
Mathematica Policy Research, Washington, DC., Battelle Memorial Inst., Seattle, WA., Decision Information Resources, Inc., Houston, TX., McConnell, Sheena, and Glazerman, Steven
- Abstract
A benefit-cost analysis of the Job Corps program compared groups randomly assigned to either enroll in the program or to constitute a control group that did not enroll. Youth who participated in the study were those found eligible for Job Corps nationwide between November 1994 and February 1996. Interviews with participants and the assignment of dollar values to costs and benefits were among the research methods used. Benefits and costs measured included the following: (1) benefits of increased output resulting from the additional productivity of Job Corps participants; (2) benefits from reduced use of other programs and services; (3) benefits from reduced crime committed by or against participants; and (4) program costs and costs of resources used by Job Corps. Benefits and costs were measured from the perspectives of society as a whole, participants, and the rest of society (non-participants in Job Corps). The study's findings indicate that Job Corps is a good investment. The benefits to society exceed the costs of the program by nearly $17,000 per participant, assuming that the observed earnings impacts do not decline rapidly as participants get older. The researchers say evidence from other studies suggests the impacts will persist without rapid decay. The study concluded that Job Corps is a valuable program whose benefits exceed costs over a wide spectrum of student groups and for several areas of society. (The report includes 28 tables and 11 figures.) (KC)
- Published
- 2001
31. National Job Corps Study: Impacts by Center Characteristics.
- Author
-
Mathematica Policy Research, Princeton, NJ., Battelle Human Affairs Research Centers, Seattle, WA., Decision Information Resources, Inc., Houston, TX., Burghardt, John, and Schochet, Peter Z.
- Abstract
The question of whether the Job Corps's impacts on students' employment and related outcomes differ according to the characteristics of the Job Corps center attended was examined. The study sample consisted of approximately 9,400 program group members and 6,000 control group members who were randomly selected from among the nearly 81,000 applicants nationwide who applied for Job Corps services for the first time between November 17, 1994, and December 16, 1995, and were found eligible for services by February 1996. Study participants were interviewed shortly after their random assignment and 12, 30, and 48 months thereafter. Job Corps impacts were similar for contract centers and Civilian Conservation Centers. Impacts were similar in large, medium, and small centers. The beneficial impacts of the Job Corps program overall were broadly distributed throughout the country and not confined to a few regions. Impacts were similar for centers rated as high-performing, medium-performing, and low-performing centers based on the Job Corps performance measurement system. As expected, outcomes of the program group were better among the high-performing centers. However, so too were the outcomes of the control group who would have attended the high-performing centers. (Twenty-eight tables/figures are included. Eight supplementary tables are appended.) (MN)
- Published
- 2001
32. Does Job Corps Work? Summary of the National Job Corps Study. Summary Report.
- Author
-
Decision Information Resources, Inc., Houston, TX., Mathematica Policy Research, Princeton, NJ., Battelle Memorial Inst., Seattle, WA., Burghardt, John, Schochet, Peter Z., McConnell, Sheena, Johnson, Terry, Gritz, R. Mark, Glazerman, Steven, Homrighausen, John, and Jackson, Russell
- Abstract
The National Job Corps Study is based on a national random sample of all eligible applicants to Job Corps in late 1994 and 1995. The sampled youth were assigned randomly to either a program group whose members could enroll in Job Corps or a control group whose members could enroll in all other programs available to them in their communities. Findings related to delivering services indicated that Job Corps centers effectively deliver the planned services called for by the program model and that Job Corps provides extensive education, training, and other services. Findings related to making a difference show that Job Corps substantially increases the education and training services that youths receive and improves their skills and educational attainment; Job Corps generates employment and earnings gains; employment and earnings gains are found across most groups of students; the residential and nonresidential programs are each effective for the youths they serve; Job Corps significantly reduces involvement with crime; and Job Corps has modest or no impacts on a range of other outcomes. Regarding whether Job Corps is a good investment whose benefits exceed its costs, the study finds that Job Corps is cost effective despite its high costs; that benefits during the study period are modest; that benefits should continue; and that Job Corps is a good investment. (YLB)
- Published
- 2001
33. National Job Corps Study: Job Corps Applicants' Programmatic Experiences. Final Report.
- Author
-
Mathematica Policy Research, Princeton, NJ., Battelle Memorial Inst., Seattle, WA., Decision Information Resources, Inc., Houston, TX., Johnson, Terry, Gritz, Mark, and Dugan, Mary Kay
- Abstract
The National Job Corps Study was undertaken to obtain the information needed to assess the Job Corps' success in providing employment assistance to disadvantaged youths aged 16 to 24 years. The study's findings related to outreach and admissions (OA) and center characteristics and practices that appear to promote positive programmatic experiences for Job Corps applicants and participants were examined by using agency records for a randomly selected sample of youth who were eligible to enroll in Job Corps. The following items had significant impacts on applicants' programmatic experiences: (1) OA counselors' outreach and screening practices; (2) OA counselors' knowledge and experience level; (3) center operator type, size, and location; and (4) the strength of centers' vocational and academic programs. The particular vocational areas offered by centers and the range of vocations offered did not significantly affect students' programmatic experiences. Limited residential living facilities were associated with shorter lengths of stay, but exceptionally good facilities did not appear to promote longer lengths of stay. (Twenty-two tables/figures are included. The following items are appended: the study methodology; results for vocational choices of Job Corps students; results for applicant typologies; and a description of the methods used to estimate impacts for different Job Corps program experiences.) (MN)
- Published
- 2000
34. National Job Corps Study: The Impacts of Job Corps on Participants' Literacy Skills. Final Report. Research and Evaluation Report Series.
- Author
-
Mathematica Policy Research, Princeton, NJ., Battelle Human Affairs Research Centers, Seattle, WA., Decision Information Resources, Inc., Houston, TX., Glazerman, Steven, Schochet, Peter Z., and Burghardt, John
- Abstract
This report examines the extent to which Job Corps improves literacy and numeracy skills. Chapter I provides an overview of Job Corps and discusses key policy issues related to basic skills and study objectives. Chapter II addresses study design issues, interview response rates, and analytic methods used to estimate program impacts on literacy scores. Chapter III presents Job Corps impacts on three domains of literacy skills (prose, document, and quantitative) and provides a descriptive analysis, impact findings, and estimates of literacy for key subgroups. It reports that the typical youth Job Corps serves has lower functional literacy scores than the typical young adult, especially in the quantitative literacy domain. Job Corps' impacts on participants' functional literacy skills were positive in all three domains. Chapter IV discusses the extent to which estimated impacts on literacy skills are consistent with impact findings on key outcomes associated with literacy skills. A proposed theoretical framework is used to examine the link among educational attainment, labor market outcomes, and literacy skill scores. Empirical results on these relationships are presented: literacy levels are influenced by schooling and employment measures; and some association exists between literacy assessment scores and labor market outcomes, but a large residual variation in earnings remains. Appendixes contain 13 references, analysis of nonresponse, administration and scoring of the assessment, and supplementary tables. (YLB)
- Published
- 2000
35. Integrating Year-Round and Summer Employment and Training Services for Youth under the Workforce Investment Act: Technical Assistance Guide.
- Author
-
Westat, Inc., Rockville, MD. and Decision Information Resources, Inc., Houston, TX.
- Abstract
This document, which is intended to provide technical assistance to individuals responsible for integrating year-round and summer employment and training services for youth under the Workforce Investment Act (WIA), contains case studies of eight employment and training programs that have already integrated or are in the process of integrating their summer and year-round youth services. Each case study contains information on some or all of the following topics: program objectives; status of development of a comprehensive youth strategy; recruitment and selection of participants; participant assessment and the process of developing individualized services; services provided; preparation for postsecondary education and/or unsubsidized employment; linkages with other programs; job development, job placement, and follow-up; outcomes achieved by participants; the program's overall effectiveness; and a program contact. The programs profiled are as follows: (1) Youth Start (which serves a 12-county service delivery area in Maine); (2) Work for Worcester's Youth (Worcester, Massachusetts); (3) Eagle Enterprises (Egg Harbor Township, New Jersey); (4) the Step-Up Program (Milwaukee, Wisconsin); (5) the Southeast Minnesota Year-Round Youth Employment Program (Rochester, Minnesota); (6) Pima County Community Services (Tucson, Arizona); (7) the San Diego Workforce Partnership (San Diego, California); and (8) the Northwest Washington Workforce Development Council (Bellingham, Washington). (MN)
- Published
- 2000
36. National Job Corps Study: The Short-Term Impacts of Job Corps on Participants' Employment and Related Outcomes. Final Report. Report and Evaluation Report Series 00-A.
- Author
-
Mathematica Policy Research, Princeton, NJ., Battelle Human Affairs Research Centers, Seattle, WA., Decision Information Resources, Inc., Houston, TX., Schochet, Peter Z., Burghardt, John, and Glazerman, Steven
- Abstract
A national study estimated the short-term impacts of Job Corps (JC) on participants' employment and related outcomes during the 30 months after random assignment. Results for the short-term impact analysis were based on a comparison of eligible program participants randomly assigned to a program group (n=9,409) or a control group (n=5,977) that did not participate in JC. The analysis relied primarily on interview data. Findings indicated most program group participants stayed in JC for a substantial period of time; program group enrollees participated extensively in the core JC activities; differences in subgroups' JC experiences were small; JC substantially increased the education and training that program participants received; similar percentages of program and control group members were enrolled in education and training programs toward the end of the 30-month period; JC participation led to substantial increases in the receipt of General Educational Development and vocational certificates; JC generated positive earnings impact by 2 years after random assignment; and program group members secured higher-paying jobs with slightly more benefits. JC participation reduced receipt of public assistance benefits; significantly reduced arrest and conviction rates; had no impacts on the self-reported use of tobacco, alcohol, and illegal drugs; had no impact on family formation; and had no impact on mobility. (Appendixes include 20 references and supplementary tables.) (YLB)
- Published
- 2000
37. National Job Corps Study: Report on Study Implementation.
- Author
-
Mathematica Policy Research, Princeton, NJ., Battelle Memorial Inst., Seattle, WA., Decision Information Resources, Inc., Houston, TX., Burghardt, John, McConnell, Sheena, Meckstroth, Alicia, Schochet, Peter, Johnson, Terry, and Homrighausen, John
- Abstract
The National Job Corps Study was conducted in 1994-1996 to provide a thorough and rigorous assessment of the impacts of the Job Corps on key participant outcomes. To ensure that the study was well implemented, a study team from Mathematica Policy Research, Inc. (MPR) investigated outreach and admissions (OA) procedures in each Job Corps region and developed proposed procedures for conducting random assignment tailored to each region. A four-step core random assignment process was implemented. During the sample intake period, MPR staff monitored sample buildup to ensure that the research sample was near target levels and to determine whether the initial sample design parameters required adjustment. Job Corps staff implemented the random assignment procedures successfully over the 16-month sample intake period. Overall, the study had noticeable effects on key aspects of program operations but modest effects on OA counselors' activities and the composition of students coming to the program. (Fifteen tables/figures are included. The following items are appended: lists of special programs excluded from the Job Corps evaluation and data items needed for random assignment processing and monitoring; Job Corps study materials and forms; a chronology of random assignment implementation; and a list of processing steps performed by MPR before random assignment.) (MN)
- Published
- 1999
38. The Value of the Output and Services Produced by Students While Enrolled in Job Corps.
- Author
-
Mathematica Policy Research, Princeton, NJ., Battelle Human Affairs Research Centers, Seattle, WA., Decision Information Resources, Inc., Houston, TX., and McConnell, Sheena
- Abstract
The value of the output and services produced by students while enrolled in the Job Corps was estimated by analyzing data from a sample of 2 projects from each of 23 Job Corps centers. The projects were subjected to in-depth analysis based on independent-estimate and relative-productivity approaches. The following were among the key findings: (1) in 1 year, more than 1 million student-days are spent on all work projects in the Job Corps, with nearly 80% of student-days spent on vocational skills training (VST) projects and 20% spent on work experience (WE) projects; (2) students produce output worth $5.48 per hour spent on VST projects and $7.01 per hour spent on WE projects; (3) over 1 year, Job Corps students produce output worth more than $27 million while conducting non-center-serving projects, which is equivalent to $789 per student year; and (4) students working on center-serving projects reduce centers' operating costs by an estimated $280-$360 per student year, which is small when compared with the program operating costs of approximately $26,000 per student-year. (Twelve tables/figures are included. The following items are appended: a discussion of the weights used in the study; summaries of the work project studies; and standard errors of the estimates.) (MN)
- Published
- 1999
39. National Job Corps Study: Report on the Process Analysis. Research and Evaluation Report Series.
- Author
-
Decision Information Resources, Inc., Houston, TX., Mathematica Policy Research, Princeton, NJ., Battelle Human Affairs Research Centers, Seattle, WA., Johnson, Terry, Gritz, Mark, Jackson, Russell, Burghardt, John, Boussy, Carol, Leonard, Jan, and Orians, Carlyn
- Abstract
This report presents results of a process analysis that describes and documents Job Corps services and operations. Chapter one provides overviews of Job Corps, the national Job Corps study, and the process analysis. Chapter two describes the administrative structure of Job Corps and presents data on the geographic distribution and characteristics of key operating components. Chapters three to nine focus on these components of Job Corps: outreach and admissions, including counseling; vocational education, including occupational exploration program, vocational skills training projects, work experience program, and advanced training; academic education; residential living and health services, including student government and leadership and behavior management; center administration related to staffing and safety/security; organization and provision of Job Corps' placement services; and Job Corps performance management system and how it affects center operations. Chapter ten offers conclusions. Appendixes include data collection design and implementation and supplementary tables. (YLB)
- Published
- 1999
40. Evaluation of DVD and Internet Decision Aids for Hip and Knee Osteoarthritis: Focus on Health Literacy
- Author
-
Foundation for Informed Medical Decision Making
- Published
- 2015
41. Evaluation of a Decision Support Tool (PRIMA)
- Author
-
Patient-Centered Outcomes Research Institute, Foundation for Informed Medical Decision Making, and Jennifer Polinski, Assistant Professor and Epidemiologist
- Published
- 2015
42. The National Evaluation of Upward Bound. The Short-Term Impact of Upward Bound: An Interim Report.
- Author
-
Mathematica Policy Research, Washington, DC., Educational Testing Service, Atlanta, GA., Westat, Inc., Rockville, MD., Decision Information Resources, Inc., Houston, TX., Myers, David E., and Schirm, Allen L.
- Abstract
This report on the short-term effects of Upward Bound, a federal pre-college program designed to help economically disadvantaged students complete high school and gain access to post-secondary education, presents interim findings from the Longitudinal Effectiveness Study of Upward Bound based on data on approximately 2,800 students during the first year or two of high school. At present, there are more than 600 Upward Bound projects; they offer intensive instructional programs and are usually hosted by 2-year and 4-year colleges. The study found that: (1) Upward Bound has early positive impacts on students' educational expectations and academic course-taking; (2) students with lower educational expectations initially benefit more from Upward Bound; (3) Hispanic students initially benefit most from Upward Bound; and (4) many students (about 37 percent) who enter Upward Bound leave the program during the first year. After an executive summary and introductory chapter, Chapter 2 presents data on persistence in Upward Bound, and on the Upward Bound services offered. Chapter 3 details short-term impacts of Upward Bound, including the average impact of the program and groups benefitting most. The concluding chapter summarizes findings, compares them to previous findings, and draws implications for program improvement. Eight appendices provide additional detail on research methodology, data interpretation, and statistics. (Contains 18 references.) (DB)
- Published
- 1997
43. The National Evaluation of Upward Bound. Summary of First-Year Impacts and Program Operations. Executive Summary.
- Author
-
Mathematica Policy Research, Washington, DC., Educational Testing Service, Atlanta, GA., Westat, Inc., Rockville, MD., Decision Information Resources, Inc., Houston, TX., Myers, David E., and Moore, Mary T.
- Abstract
This monograph presents the executive summary of a study evaluating the first-year impacts and program operations of Upward Bound, a federal pre-college program designed to help economically disadvantaged students complete high school and gain access to post-secondary education. In 1996, 45,000 students participated in the program through projects offered by 601 grantees; the average cost per student was $3,800. Most students enter Upward Bound in ninth or tenth grade and participate in a multi-year program of weekly activities during the school year and an intensive summer program that simulates college. The study found two major impacts of Upward Bound--first, participating students expect to complete more schooling than similar students not in the program and, second, the program has a positive impact on the number of academic courses participants take. Other findings included: students who benefited most initially were those with lower academic expectations; Hispanic students appeared to benefit most from the program among racial/ethnic groups examined; the program showed no impact in the first year on participants' high school grades; many students left the program in the first year; and most Upward Bound projects focused on providing a rich and challenging program. (DB)
- Published
- 1997
44. Evaluation of Tech Prep System Development and Implementation in Texas Public Schools and Institutions of Higher Education. Final Report, 1994-95.
- Author
-
Decision Information Resources, Inc., Houston, TX.
- Abstract
In August 1993, a third-party evaluator examined tech prep system development and implementation in Texas public schools and institutions of higher education. The second year of the evaluation focused on the following aspects of Texas' tech prep system: current status of statewide implementation, secondary school counseling, professional development, and work-based learning. Significant progress was achieved in implementation of tech prep in Texas in the 1994-95 school year. Among the main conclusions of the evaluation were the following: (1) Texas' tech prep program appears to rate favorably on a national scale; (2) significant progress had been made since 1993-94 in many areas, including the numbers of schools and colleges participating in tech prep and offering approved programs/courses, leadership, clarity of roles, statewide transfer of articulated credit, strategic planning/marketing, and consensus regarding the definition of tech prep; (3) although most secondary school counselors were aware of tech prep, many were still not informed about the program's specifics; (4) professional development for educators should remain a continued focus because of educators' lack of familiarity with the program's specifics; and (5) although business/industry involvement in tech prep has increased, lack of business/industry participation statewide continues to be problematic. (Contains 16 tables and 35 references.) (MN)
- Published
- 1995
45. Follow-Up Survey of Participants in Preparing for Profit (PREP).
- Author
-
Informed Decision Services, Englewood, NJ.
- Abstract
Established as a pilot project in 1991, Preparing for Profit (PREP) was a four-session entrepreneurial training seminar offered by LaGuardia Community College/City University of New York, the New York Metropolitan Transportation Authority, and Coopers & Lybrand. PREP was designed to enhance opportunities for minority- and women-owned businesses to obtain certification by public agencies and, ultimately, to win contracts. In order to evaluate program outcomes, a survey was conducted of all 146 participants completing PREP in 1991. Highlighted findings, based on a 39% response rate, included the following: (1) over one-third of the respondents had been certified as a minority- or woman-owned business since taking part in PREP; (2) over one and a half times more respondents had received a government contract after PREP participation; (3) 65% of the respondents currently used brochures as a marketing strategy after the training versus 39% prior to participating in PREP; (4) almost 50% more reported using business slogans after the training; (5) 40% more prepared income statements, 34% more prepared financial statements, and 25% more prepared balance sheets after PREP participation; (6) 58% reported receiving additional benefits from PREP, such as networking (53%), receiving business from other participants (21%), purchasing products or services from participants (18%), participating in joint ventures with other participants (9%), or applying for contracts with other participants (7%); and (7) of those who had purchased equipment, 50% indicated that information provided in PREP helped them choose the right system. Appendixes include the survey questionnaire and comments from participants. (JSP)
- Published
- 1992
46. Debt Burden Facing College Graduates.
- Author
-
Westat, Inc., Rockville, MD., Decision Resources Corp., Washington, DC., and Wabnick, Richard
- Abstract
This report presents an analysis of the debt levels and debt burdens of recent college graduates, based on student-reported data from four national surveys taken over the period 1977 to 1986. The major findings of this study were the following: (1) one out of two 1986 college graduates had some undergraduate debt, representing an increase from one out of three 1977 graduates; (2) the median debt level of baccalaureates with debt rose from $2,000 for 1977 graduates to $4,800 for 1986 graduates; (3) the median debt burden--the ratio of repayments to gross income--of 1986 graduates was 4.0% in the year following their graduation from college, down slightly from 5.2% for 1977 graduates; (4) only 6.5% of all 1986 graduates had a debt burden in their first year after graduation that exceeded 10% of gross income, down 0.2% from 1977 graduates; and (5) only 4.8% of 1986 graduates had debt but no first-year income at the time of the survey, up 0.1% from 1977 graduates. Appendices include an explanation of the debt burden assumptions used in the computations, various statistical breakdowns of education debt and debt burden for 1986 bachelor's degree recipients, and descriptions of federal education loan programs and debt burden. (An illustrative sketch of the debt-burden computation follows this entry.) (GLR)
- Published
- 1991
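The preceding abstract defines debt burden as the ratio of loan repayments to gross income. As a purely illustrative aid (not part of the report, and using invented figures), the following Python sketch shows how that ratio is computed and expressed as a percentage.

```python
# Illustrative sketch only: computes the debt-burden ratio defined in the
# abstract above (annual loan repayments divided by gross income).
# All dollar amounts below are hypothetical, not taken from the report.

def debt_burden(annual_repayment: float, gross_income: float) -> float:
    """Return the debt burden as a fraction of gross income."""
    if gross_income <= 0:
        raise ValueError("gross income must be positive")
    return annual_repayment / gross_income

# Hypothetical first-year graduate: roughly $60/month in loan repayments
# against $18,000 in gross income.
annual_repayment = 60 * 12      # $720 per year (invented figure)
gross_income = 18_000           # invented first-year salary
print(f"debt burden: {debt_burden(annual_repayment, gross_income):.1%}")
# prints "debt burden: 4.0%", the same order of magnitude as the 1986 median
```

A ratio like this is computed per graduate; summary figures such as the 4.0% median for 1986 would be taken over the distribution of these individual ratios.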
47. Developing and Evaluating the Yorkshire Dialysis Decision Aid (YoDDA)
- Author
-
Kidney Cancer UK, National Health Service, United Kingdom, Baxter Healthcare Corporation, Foundation for Informed Medical Decision Making, and Hilary L Bekker, Senior Lecturer in Behavioural Sciences and Chartered Psychologist
- Published
- 2014
48. A Summary of State Chapter 1 Migrant Education Program Participation and Achievement Information 1987-88. Volume 1: Participation.
- Author
-
Decision Resources Corp., Washington, DC., and Henderson, Allison
- Abstract
This report summarizes the participation and achievement information provided by state educational agencies (SEAs) on the Education Consolidation and Improvement Act (ECIA) Chapter 1 Migrant Education Program for the 1987-88 school year. The report provides information on the number of participants (by gender, year of birth, ethnic group, migrant status, and grade), the types of services provided, and the number of staff. Information is provided for both the regular and summer terms. The document presents the national participation and staffing information in three ways: (1) for 1987-88; (2) changes from 1986-87 to 1987-88; and (3) trend data from 1984-85 to 1987-88. In addition, Migrant participation patterns are compared to Chapter 1 Basic participation and total enrollment. A description of the methodology used to review the State Performance Report information for 1987-88 is presented in Appendix A. State-level participation and staff information for the 1987-88 school year is provided in Appendix B, while Appendix C displays year-to-year changes by state from 1986-87 to 1987-88. Appendix D provides tabular descriptions of the other instructional services, other supporting services, and other staff reported by each state for both the regular and summer terms. Appendix E contains the reporting form and Questions and Answers for the ECIA Chapter 1 Migrant Program State Performance Reports. This document contains numerous data tables and figures. (KS)
- Published
- 1990
49. A Summary of State Chapter 1 Participation and Achievement Information for 1987-88.
- Author
-
Decision Resources Corp., Washington, DC., Sinclair, Beth, and Gutmann, Babette
- Abstract
This document summarizes the annual State Performance Reports for programs funded under Chapter 1 of the Education Consolidation and Improvement Act (ECIA), which have been submitted by State Education Agencies (SEAs) for the school years 1979-80 through 1987-88. These reports provide information on Local Education Agency (LEA) and State Agency Neglected or Delinquent (N or D) compensatory education programs. The reports detail the following information: (1) the number of educationally disadvantaged students served; (2) the demographic composition of these students; (3) staffing patterns; (4) the types of services received; and (5) the achievement rates in basic skills areas. The effect of the Aguilar v. Felton decision on nonpublic participation in LEA programs is noted. Statistical data are presented in 61 tables and 19 graphs. The following materials are appended: (1) a discussion of the research methodology; (2) participants by grade; (3) selected participation information by state, 1979-80 through 1987-88; (4) additional achievement information for 1987-88; (5) additional national achievement information, 1979-80 through 1987-88; and (6) national achievement information for 1979-80 through 1987-88 expressed in normal curve equivalents. (FMW)
- Published
- 1990
50. Public School Choice: Implications for Children with Handicaps. Revised.
- Author
-
National Association of State Directors of Special Education, Washington, DC., Decision Resources Corp., Washington, DC., and O'Reilly, Fran E.
- Abstract
This report describes recent gains in support for the movement to allow parents to choose the public school that their children attend. Reasons for the movement's growth and methods of implementing school choice plans are presented. The paper then focuses specifically on interdistrict school choice plans, identifying major aspects that are likely to affect special education programs and the ability of students with handicaps to participate in these types of choice programs. Five major issues are discussed: (1) responsibilities of the resident district; (2) criteria for student admission to nonresident school districts; (3) due process; (4) finance; and (5) transportation. The report discusses each of the major issue areas as they relate to special education and describes how each area has been addressed in five states with interdistrict choice plans (Arkansas, Iowa, Minnesota, Nebraska, and Ohio). A final section summarizes, by state, the provisions included in each state's legislation. An appendix reprints the school choice legislation of each of the five states. Includes two references. (JDD)
- Published
- 1990