Meta-analysis is a quantitative synthesis methodology that systematically searches for and screens relevant studies and integrates the findings of the eligible individual studies to generate more robust and efficient estimates. A meta-analysis shows the magnitude and consistency of the overall effect size under investigation across studies; when the effect is consistent, it can be generalized to a larger population. Reflecting this utility, there has been a meteoric rise in the number of meta-analysis publications (Sutton & Higgins, 2008). Furthermore, meta-analysis is increasingly seen as a method to combat the current reproducibility crisis in preclinical and clinical trial research (Begley & Ellis, 2012; Ioannidis, 2005) and, more generally, as an important tool for open research practices (Cooper & VandenBos, 2013; Cumming, 2014; Eich, 2014; Open Science Collaboration, 2012). For these reasons, findings from meta-analyses weigh heavily in evidence-based decision making. Meta-analysis studies, however, are not without limitations, some of which may be inherent to a method that relies on aggregated data (AD) from published studies (Cooper & Patall, 2009). In addition, like all methodologies, meta-analysis must be applied properly to yield valid conclusions.

The present review was motivated by a recent meta-analysis by Foxcroft, Coombes, Wood, Allen, and Almeida Santimano (2014) for the Cochrane Database of Systematic Reviews that evaluated the effectiveness of motivational interviewing (MI) interventions for heavy drinking among adolescents and young adults up to age 25. Their study included 66 randomized controlled trials of individuals between the ages of 15 and 25, of which 55 studies were quantitatively analyzed for alcohol outcomes. Foxcroft et al. concluded that:

There are no substantive, meaningful benefits of MI interventions for the prevention of alcohol misuse. Although some significant effects were found, we interpret the effect sizes as being too small, given the measurement scales used in the studies included in the review, to be of relevance to policy or practice. (p. 2)

Foxcroft et al. further stated that the quality of evidence was low to moderate and that, consequently, the effect sizes reported in their meta-analysis may be overestimates. The resulting press coverage declared "Counseling Has Limited Benefit on Young People Drinking Alcohol" (ScienceDaily news, 2014, August) and "Counseling Does Little to Deter Youth Drinking, Review Finds" (HealthDay news, 2014, August).

Given the potential impact that Foxcroft et al. (2014) may have on intervention and policy development, as well as on clients' understanding of their treatment options, it is important to critically examine the study's methodology and conclusions. The quality of evidence is just as important for meta-analytic reviews as for single studies when making recommendations to clinicians in the field (see the Grading of Recommendations Assessment, Development and Evaluation [GRADE] approach; Guyatt et al., 2008). In discussing Foxcroft et al. in the present article, we draw on available reporting guidelines for meta-analysis studies, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA; Moher, Liberati, Tetzlaff, Altman, & the PRISMA Group, 2009) for randomized controlled trials. PRISMA has been adopted by many journals and organizations, including the Cochrane Collaboration, all BMC journals, and PLoS ONE.
The Meta-Analysis Reporting Standards (MARS) have been adopted by the American Psychological Association (APA) as part of the Journal Article Reporting Standards (JARS) developed by the APA Publications and Communications Board Working Group (2008). Archives of Scientific Psychology, a new open-access APA journal (Cooper & VandenBos, 2013), further elaborates on the MARS to help authors improve the reporting of systematic reviews and meta-analyses. We also draw on other tutorial articles and textbooks (e.g., Borenstein, Hedges, Higgins, & Rothstein, 2009).
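As a point of reference for the methodological discussion that follows, the pooling step at the heart of a meta-analysis can be written compactly. The fixed-effect (inverse-variance) form below is a minimal textbook sketch (e.g., Borenstein et al., 2009) and is not intended to describe the specific model fitted by Foxcroft et al. (2014):

\[
\hat{\theta} = \frac{\sum_{i=1}^{k} w_i \hat{\theta}_i}{\sum_{i=1}^{k} w_i}, \qquad w_i = \frac{1}{\hat{\sigma}_i^{2}}, \qquad \mathrm{Var}(\hat{\theta}) = \frac{1}{\sum_{i=1}^{k} w_i},
\]

where \(\hat{\theta}_i\) is the effect size estimated in study \(i\) (of \(k\) studies) and \(\hat{\sigma}_i^{2}\) is its sampling variance. Consistency of the effect across studies is commonly assessed with the heterogeneity statistic \(Q = \sum_{i=1}^{k} w_i (\hat{\theta}_i - \hat{\theta})^{2}\) and the derived index \(I^{2}\); substantial heterogeneity typically motivates a random-effects model, which adds a between-study variance component to each study's weight.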