ESSA Evidence

Evidence-Based Strategies

Every Student Succeeds Act Evidence-Based Solutions

Evidence-Based Programs

After your district has identified local needs, it is time to determine which evidence-based strategies will best serve your student population. The chart below aligns to ESSA's four evidence categories and outlines the types of research studies completed on Savvas programs. 

For a closer look at the four evidence categories in ESSA, see the Evidence-Based Requirements Explained section below.


Reading

Strong

Moderate

Promising

Demonstrates a Rationale

Research Study

 


Mathematics

Strong

Moderate

Promising

Demonstrates a Rationale

Research Study

 

 

 

Professional Services

Strong

Moderate

Promising

Demonstrates a Rationale

Research Study

 


 

Evidence-Based Requirements Explained

A Closer Look at Evidence-Based Criteria

ESSA emphasizes "evidence-based" approaches that have demonstrated a statistically significant positive effect on student outcomes. ESSA identifies four levels of evidence: strong evidence, moderate evidence, promising evidence, and evidence that demonstrates a rationale. The levels do not correspond to the strength of student outcomes; rather, they define the criteria the supporting studies must meet.

 
How should a school or district determine which intervention will best serve student needs?

The first step to select an evidence-based intervention is to conduct a local needs assessment to: 1) identify local needs and/or root causes; and 2) assess the local capacity to implement the intervention.

The following questions, drawn from the Department of Education’s guidance, can help when reviewing a program’s evidence.

  • What do the majority of studies on this intervention find? Does the intervention have positive and statistically significant effects on important student or other relevant outcomes, or are there null, negative, or not statistically significant findings?
  • Were studies conducted in settings and with populations relevant to the local context (e.g., students with disabilities, English Learners)?
  • How can the success of the intervention be measured?

Are all schools required to use evidence-based programs?

No. Only Title I Schools identified for either comprehensive or targeted support or improvement must implement at least one intervention that is evidence-based. (E-26 of the January 2017 Title I Accountability FAQ)

However, reviewing a program’s evidence can help schools determine if an intervention is likely to be successful in improving student outcomes with their student population.

Do all interventions purchased with federal funds need to be evidence-based?

No. The requirement to purchase interventions based on strong, moderate, or promising evidence only applies to Section 1003(a) school improvement funding. (E-24 of the January 2017 Title I Accountability FAQ)

Why do various research groups report strength of evidence differently?

Research study results and evidence vary because researchers ask multiple questions. For example, in addition to "To what extent did student achievement increase?," we also want to know "To what extent was the program implemented well?" and "How can we isolate the effect of the program itself?" Another example question could be "To what extent did student achievement increase for different types of learners?"

In our research reports, we analyze adjusted means of student assessment scores. This means we introduce covariates (other factors that might influence a student’s performance) into our analyses. For example, a teacher who implements a program with high fidelity of implementation (implementing the program the way it was intended) will see larger significant positive gains than a teacher who does not implement the program as designed.

By introducing covariates, we are able to isolate the effect of the program itself and determine whether it has a significant positive effect beyond that of the control or comparison group. By using this method of analysis (i.e., the use of covariates), these programs meet ESSA's requirements for Strong evidence. Other research groups, such as the Best Evidence Encyclopedia (BEE) and the What Works Clearinghouse (WWC), conduct analyses on the unadjusted means (students' mean assessment scores) to determine if a program has a significant positive effect greater than the control or comparison group.
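To illustrate the difference between unadjusted and covariate-adjusted comparisons, here is a minimal ANCOVA-style sketch in plain Python. The data, group names, and scores are hypothetical, invented purely for illustration; they do not come from any Savvas study or report.

```python
# Minimal ANCOVA-style adjusted-means sketch (hypothetical data).
# Outcome: post-test score; covariate: pre-test score.

def mean(xs):
    return sum(xs) / len(xs)

def adjusted_means(groups):
    """groups: {name: (covariate_scores, outcome_scores)}.
    Returns {name: covariate-adjusted outcome mean} using a
    pooled within-group regression slope, as in classic ANCOVA."""
    # Pooled within-group slope of outcome on covariate.
    sxy = sxx = 0.0
    for cov, out in groups.values():
        cbar, obar = mean(cov), mean(out)
        sxy += sum((c - cbar) * (o - obar) for c, o in zip(cov, out))
        sxx += sum((c - cbar) ** 2 for c in cov)
    b = sxy / sxx
    # Adjust each group's outcome mean to the grand covariate mean.
    grand = mean([c for cov, _ in groups.values() for c in cov])
    return {name: mean(out) - b * (mean(cov) - grand)
            for name, (cov, out) in groups.items()}

# Hypothetical pre/post scores: the program group happened to start higher.
data = {
    "program":    ([70, 75, 80, 85], [78, 82, 88, 92]),
    "comparison": ([60, 65, 70, 75], [65, 70, 74, 79]),
}

raw_gap = mean(data["program"][1]) - mean(data["comparison"][1])
adj = adjusted_means(data)
adj_gap = adj["program"] - adj["comparison"]
print(f"unadjusted gap: {raw_gap:.1f}")   # 13.0
print(f"adjusted gap:   {adj_gap:.1f}")   # 3.6
```

Here the unadjusted comparison overstates the program's effect because the program group started with higher pre-test scores; adjusting for the covariate removes that head start before the groups are compared.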

 

Resources to research evidence:

Savvas Research Team

Research & Efficacy

At Savvas, we’re committed to ensuring that our products and services deliver positive learner outcomes. Our Research Team conducts formative and summative research that directly informs the development of PreKindergarten through Grade 12 instructional programs; this includes third-party validation research. We work in collaboration with educators, students, authors, and developers to apply research-based principles to both product and user experience design. We measure a program's impact through scientific studies and trials to evaluate how well it meets the needs of users of all abilities and achievement levels.

Research Goals and Direct Applications

  • Educate product developers on the purpose, methods, and impact of research on the design and function of products and services
  • Capture and interpret learner and customer needs, behaviors, and opinions
  • Evaluate the impact of Savvas PreK-12 solutions on learners
  • Facilitate collaboration between our users and developers

Continuous Research on Our Products

At every stage of a product’s lifecycle—from initial idea to the retirement of a product—we embed efficacy and research activities. These activities help us understand, define, and demonstrate how a product impacts learner outcomes.

Product Research Lifecycle
 

Get Involved! Join Savvas' PreK-12 Research Panel

We’re rethinking education at every step. Help us design new educational curricula and courseware by participating in research studies. Let’s improve education together! Sign up now.

 

Student Research

Parent/Guardian: Register your child today for this real-life learning experience.

Educator: Register to share your point of view and impact education.

 
 
