Distinguishing Effective Evidence-Based Programs from Everything Else

The most recent issue (September 2014) is linked here.

All previous issues are linked here.

Who it’s for:  Policy officials, practitioners, researchers, and others interested in evidence-based policy.

What it is:  A periodic newsletter summarizing key findings from rigorous evaluations – especially well-conducted randomized controlled trials (RCTs) – across all areas of social policy.

Well-conducted RCTs, where feasible, are regarded as the strongest method of evaluating program effectiveness, per evidence standards articulated by the National Academies,1 Congressional Budget Office,2 Institute of Education Sciences,3 U.S. Preventive Services Task Force,4 Food and Drug Administration,5 and other respected scientific bodies. Such studies have shown that many programs thought to be effective, based on expert opinion or preliminary evidence, actually are not. But they have also identified a few programs that are highly effective, producing meaningful gains against educational failure, poverty, crime, and other social problems.

To find instances of effectiveness, the Coalition systematically monitors the literature to identify all new RCTs – and other rigorous nonrandomized evaluations – published or posted online across all areas of social policy. Our RIGOROUS EVIDENCE newsletter includes (i) summaries of the few findings identified by our expert panel as meeting the highest (Congressional “Top Tier” or Near Top Tier) evidence standards, as well as (ii) our staff-level summaries6 of promising findings that are not yet ready for Top Tier consideration (e.g., due to only short-term follow-up). We believe the latter findings – promising but not conclusive – are valuable for identifying programs that merit testing in more definitive RCTs, so as to expand the number of programs proven to address important social problems.

What makes RIGOROUS EVIDENCE unique:

  • Its primary focus on well-conducted RCTs as the most reliable measure of program effectiveness. It recognizes that many RCTs fall short of “well-conducted,” containing flaws in design or implementation that weaken confidence in their findings, and notes any such limitations. It also reports noteworthy findings from high-quality nonrandomized studies (e.g., well-matched comparison-group studies showing particularly large effects).
  • Its wide scope (all areas of social policy) combined with sharp focus on findings that are of policy importance – because, for instance, they show or suggest that a program has a sizable effect on an important outcome (such as criminal arrests, employment/earnings, child abuse, high school graduation, teen pregnancy).
  • Its source – the nonprofit, nonpartisan Coalition for Evidence-Based Policy. The Coalition is a recognized leader in the evidence-based reform movement, as found in a recent independent assessment. The Coalition has no affiliation with any programs, enabling it to serve as an impartial, expert resource on rigorous evidence.

We welcome suggestions of studies to include in the newsletter:

Please contact Coalition Vice President David Anderson by email or by phone at 202-239-1248. We will fulfill any request for a review of an RCT and provide informal comments to the requester. Whether we summarize it in RIGOROUS EVIDENCE is a judgment based on factors such as the study’s quality and policy importance.

If you wish to receive the RIGOROUS EVIDENCE newsletter by email, please email us with the word “subscribe” in the subject line.

1 “Preventing Mental, Emotional, and Behavioral Disorders Among Young People:  Progress and Possibilities,” National Academies Press, 2009, recommendation 12-4, p. 371, linked here.

2 “CBO’s Use of Evidence in Analysis of Budget and Economic Policies,” Jeffrey R. Kling, Associate Director for Economic Analysis, November 3, 2011, page 31, linked here.

3 U.S. Department of Education, “Scientifically-Based Evaluation Methods: Notice of Final Priority,” Federal Register, vol. 70, no. 15, January 25, 2005, pp. 3586-3589, linked here.

4 U.S. Preventive Services Task Force, “Current Methods of the U.S. Preventive Services Task Force: A Review of the Process,” American Journal of Preventive Medicine, vol. 20, no. 3 (supplement), April 2001, pp. 21-35.

5 The Food and Drug Administration’s standard for assessing the effectiveness of pharmaceutical drugs and medical devices, at 21 C.F.R. §314.126, linked here.

6 While we strive to ensure that our staff-level summaries are accurate and balanced, readers should note that they have received less scrutiny than those reviewed by the expert panel for Top Tier or Near Top Tier consideration.