The fruits of the Centers for Public Health Preparedness (CPHP) program are evident throughout this special supplement to Public Health Reports. How they came about is a fascinating tale in its own right. As Baker and colleagues note in their accompanying commentary, the seeds of the program were planted by the Centers for Disease Control and Prevention's (CDC's) Public Health Practice Program Office more than a decade ago.1 The program itself sprouted a year before the events of fall 2001 ushered in an era in which preparedness emerged as a national security concern. The funding level and number of academic institutions participating in the program blossomed from $1.7 million and four centers in 2000 to $23.8 million and 21 centers in 2003.2,3 In full bloom by mid-decade, 27 CPHPs shared more than $27 million. A remarkable network of academic institutions and practice partner agencies assembled to deploy a spectrum of activities and services focusing on workforce preparedness, graduate public health education, innovative collaborations between academia and public health practice, and the advancement of new technologies.

By the end of the decade, a faltering economy had displaced national security at the top of the national agenda, although public health preparedness funding levels had already been declining during the second half of the decade. Provisions in the 2006 Pandemic and All-Hazards Preparedness Act4 broadened the functions of CPHPs to include research, resulting in the redirection of half of the program's resources toward preparedness research through new Preparedness and Emergency Response Research Centers (PERRCs) in 2008. Workforce training and practice partnership activities were to continue through 14 new Preparedness and Emergency Response Learning Centers (PERLCs), scheduled to debut in late 2010. Within the span of a single decade, the birth, maturation, demise, and rebirth of the program all took place, setting the stage for this supplement's attempt to capture the program's important impacts and lessons. This history reads like an exception to the adage “opportunity knocks but once,” as academic institutions position themselves to build on the foundation laid in the previous decade.

Everything was in place for the first decade of the 21st century to be the best decade ever for the public health workforce. The Public Health Functions Project had issued its report on the 21st Century Public Health Workforce,5 and the Health Resources and Services Administration (HRSA) had commissioned the first public health workforce enumeration study in two decades.6 And then things got even better! CDC initiated a program to fund academic institutions to train public health workers and quickly upped the ante, increasing the number of academic institutions funded from the initial four to more than two dozen. Even more significantly, the original bioterrorism grants program directed that, of the funding flowing to states and big cities, more than $100 million annually be spent on workforce education and training. Program aspirations immediately soared.

The table was set for CPHPs, acting alone or with their practice partners, to have a major impact on public health workforce preparedness. Never before had HRSA and CDC acted in such a coordinated manner, nor had funding been specifically designated to support collaborations between schools of public health and their practice partners. Progress came swiftly.
Core competency frameworks for public health workers were supplemented with competency formulations for public health emergency preparedness and response, epidemiology, informatics, and other public health specializations. In the eye of this mounting storm of workforce activity were the academic centers for public health preparedness, now with the mission, resources, and partners to deliver on the promise of a better prepared public health workforce.

Successful activities were quickly generated. Competency frameworks provided structure for worker and workforce assessments, as well as for interventions intended to address training needs.7–10 Training interventions took full advantage of new information and communication technologies to extend their reach.11,12 Schools of public health offered new degrees, academic certificate programs, student epidemiology response teams, and courses related to public health emergency preparedness and response.13,14 Pathways were blazed for collaboration with law enforcement and public safety agencies, school systems, courts, and organizations serving vulnerable populations.15 Many of the centers expanded partnerships with local and state public health departments, tribes, and other community organizations.16–19 Successful examples of many of these activities appear in this special supplement.

With so much energy and innovation unleashed, why did the overall program fall short of offering the compelling evidence necessary to assure its sustainability? Some might argue that the national program lacked clear goals and measurable objectives. The approach taken was more like a thousand points of light emerging across the country than a powerful laser beam embodying specific common results aggregated at the national level. A major strength of the CPHP network was its ability to address the unique needs of state and local public health agencies and other practice partners. Because local circumstances and needs varied, the interventions to address those needs also differed from place to place. As a result, performance metrics across centers do not aggregate into an overall composite that can serve as compelling evidence of the network's impact.

Others might argue that academic institutions, by nature, are prone to individualistic and entrepreneurial strategies and relationships, and don't mesh well with highly centralized and standardized operations, such as state and local public health agencies. All too often, academic institutions view their role as central, rather than supportive, in areas such as workforce education and training. For example, there may be an expectation that learners come to the program rather than that the program reach learners in their places of work. There is little command-and-control thinking within academic institutions; instead, freedom of thought and action and decisions by consensus are the norm. And there is, at the end of the day, some question about the extent to which this program uniformly enhanced the role of public health practice within academia, although in certain institutions the CPHPs formed a crucial focal point around which other practice activities emerged.

The natural competition for diminishing resources may have been another influence on the outcome of the CPHP story. Preparedness funding increased rapidly through the middle of the decade, but then began to decline as general preparedness funding for state and local public health agencies was reduced to fund new federal initiatives.
With education and training a relatively low comparative priority, states and localities reduced their efforts in this area and looked to CPHPs to help fill the gap. When CPHP funding was halved, both sides of these partnerships were forced to lower their expectations. These effects are vividly documented in the article by Richmond and her colleagues at CDC.3

Ultimately, CDC and the Association of Schools of Public Health (ASPH) viewed a strengthened science base as sorely needed and as unlikely to emerge from the practice-oriented collaborations CPHPs maintained with public health agencies. The CDC-supported PERRCs, established in 2008, reflect a response to this need to establish the evidence base for preparedness. As Baker and colleagues note, the new PERLCs may also continue this focus with their emphasis on developing training based on nationally established competencies.

Teachers, scholars, and mentors constantly admonish their students to be critical about what they read. Not everything in print is true, and not everything that is true is in print. Even the true stuff often fails to go beyond who and what to explain how and why. The published literature is biased toward presenting positive results and successful programs. What doesn't get done well doesn't get published!

From such a perspective, even this special issue merits cautious and critical review by readers as to whether it presents a balanced view of the accomplishments of the CPHP program. This is not to say the program didn't make significant contributions, but those accomplishments need to be weighed against what the program didn't do or could have accomplished. Otherwise, any successor program may be doomed to repeat the failures of the past. Intervention evaluations must look beyond what was done to understand why an intervention worked and how it might succeed in a different setting. Perhaps more importantly, evaluations of why something didn't work shed even more light on factors related to whether an intervention is replicable and sustainable. In that light, the contributions to this issue tell only part of the story.

Yet opportunity has once again come knocking at the door, and it is time to put yesterday's lessons to work, as there may not be another opportunity soon.