Background: The teacher-driven, lecture-based approach to instruction may not be the best way for students to develop necessary skills such as critical thinking, creativity, teamwork, and productive debate (Bransford, 1999). However, lecture remains the dominant instructional style in U.S. schools (Mehta & Fine, 2019). Teachers may be particularly inclined to rely heavily on lecture in Advanced Placement (AP) courses out of concern that with any other instructional approach they will not be able to cover the content specified in the College Board's course-specific curriculum frameworks, and consequently their students will not be sufficiently prepared for end-of-year examinations. In project-based learning (PBL) classrooms, teachers primarily play a facilitator role while students work alone and in groups on complex tasks organized around central questions leading to a final product (Hmelo-Silver, 2004). Recent research demonstrates that PBL pedagogy can encourage students' deeper learning of knowledge and skills in standard classrooms more effectively than a predominantly lecture-based approach (Duke et al., 2020; Finkelstein et al., 2010; Harris et al., 2015).

Research Question: We asked, "Is there a causal impact of teachers' opportunity to participate in a PBL for AP intervention for one year, relative to business-as-usual AP curriculum and instruction, on high school students' academic knowledge and skills as measured by their AP U.S. Government (APGOV) or AP Environmental Science (APES) exam-taking and scores?" This question directly attends to the conference theme by focusing on an intervention demonstrated to increase the chances of AP exam success among students from both lower- and higher-income households and in two subjects, government and science. Insight into the extent to which the intervention offer affected students' AP exam performance can inform adoption choices and justify investments of time and resources.
Setting: Teachers and their students were from five large school districts across the country. A higher proportion of the student sample, compared to typical AP exam-takers, was from low-income households. Four of the five participating districts serve majority Black and Hispanic students, and three of five serve a majority of students from lower-income households.

Participants: During the 2016-17 school year, 74 volunteer teachers across 68 schools, along with their 3,645 students, participated in the study by teaching APGOV or APES in their randomized school (intent-to-treat sample). Among these teachers' students, 43% were from lower-income households and 47% were Black or Hispanic.

Intervention: University of Washington-based developers designed the curriculum. Year-long curriculum and instructional materials are now openly available for APGOV, APES, and AP Physics; this study addressed the first two because the third was in development at the time of our study. Ongoing, job-embedded professional learning, provided by PBLWorks, included a four-day summer institute, four full days during the year, and on-demand virtual coaching support. Curriculum supports are course-specific, designed to align with the College Board's curriculum framework for each course. However, the same design principles apply to both courses, so both versions of the intervention include similar resources.

Research Design: We evaluated the efficacy of the intervention using an RCT with school-level randomization.

Data Collection and Analysis: We collected covariate and outcome administrative data from participating districts and the College Board. We fit two-level hierarchical linear models (HLM), grouping students within schools, with district fixed effects to account for blocked randomization within districts.
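The two-level model described above can be sketched as follows. This is a minimal illustration using simulated data, not the study's data or code; all variable names (treat, baseline, score) and parameter values are hypothetical, and the analysis is approximated here with statsmodels' mixed-effects formula interface.

```python
# Hypothetical sketch: students nested in schools (random intercept),
# district fixed effects for blocked randomization, school-level treatment.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for d in range(5):                      # five districts (blocks)
    for s in range(10):                 # schools within district
        school_id = d * 10 + s
        treat = school_id % 2           # school-level randomization
        u = rng.normal(0, 0.3)          # school random intercept
        for _ in range(20):             # students within school
            baseline = rng.normal()
            score = 0.2 * treat + 0.5 * baseline + 0.1 * d + u + rng.normal()
            rows.append(dict(district=d, school=school_id,
                             treat=treat, baseline=baseline, score=score))
df = pd.DataFrame(rows)

# Two-level HLM: random intercept per school, fixed effects for district,
# plus a baseline covariate, as in a covariate-adjusted impact model.
model = smf.mixedlm("score ~ treat + baseline + C(district)",
                    data=df, groups=df["school"])
result = model.fit()
print(result.params["treat"])  # covariate-adjusted treatment estimate
```

In this framing, the coefficient on treat is the intent-to-treat impact estimate, with the school random intercept absorbing within-school clustering and the district dummies absorbing block differences.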
Impact models included all covariates with absolute baseline standardized mean differences greater than 0.05, as well as those determined through automated covariate selection to improve model fit. We used structural equation modeling to investigate mechanisms.

Findings/Results: In covariate-adjusted models, all estimates of the intervention's effect on AP performance were positive and significant. For all students in the sample (n=3,645), we found no overall effect on exam-taking (ES=-0.009, p=0.95). For the full sample, we estimated 31.1% of students would have earned a qualifying score of three or higher on the 1-5 ordinal score scale without PBL, compared to 35.1% with PBL, a difference of 4 percentage points (ES=0.264, p=0.016). Among exam-taking students (n=2,963), we predicted 37.2% would have earned a qualifying score without PBL, compared to 44.8% with PBL, a difference of 7.6 percentage points (ES=0.457, p=0.002). Earning qualifying AP scores can translate into college credit and relates to enrolling and persisting in college (Smith, Hurwitz, & Avery, 2017). In the four districts with continuous AP score outcomes, estimated effect sizes are significant for total scores (ES=0.192, p=0.009), as well as for the multiple-choice (ES=0.188, p=0.009) and free-response (ES=0.181, p=0.012) subsection scores. We show results in Table 1 and Figure 1. The pattern was also positive within courses; within respective groups of students from lower- and higher-income households; in districts serving a majority of students from lower-income households; in districts serving a majority of students from higher-income households; and within each of the five participating districts.
Compared to business-as-usual control teachers, treatment teachers with access to the AP PBL intervention placed greater emphasis on deeper learning objectives, more frequently used student-centered pedagogy, and did so in ways their students felt were authentic and relevant, while lecturing and relying on explicit exam preparation less frequently. For the most part, teachers across courses sustained their use of the AP PBL approach throughout the year. However, foundational AP PBL teaching practices did not emerge as mechanisms of the effect.

Conclusions: Results apply to teachers within the participating districts who chose to enroll in the RCT. The five participating districts were not representative of all districts offering AP courses. In addition, because the curriculum effect is inextricable from the effect of the professional learning supports, it is impossible to disentangle their separate influences on students' AP performance outcomes. Strengthening causal claims are the RCT design and the statistical significance, magnitude, and robustness of estimated effect sizes across multiple covariate-adjusted sensitivity analyses. Weakening causal claims are high attrition at the school level and differences in results between unadjusted and adjusted models. Within-district estimates demonstrating similar effect sizes regardless of the number of schools that attrited may attenuate attrition concerns. The results support teacher-driven adoption of the approach in APGOV and APES courses, among districts with open-enrollment AP policies supporting PBL, and for students from lower- and higher-income households.