Information on IES-Funded Research
Grant Closed

Using Student Achievement Data to Monitor Progress and Performance: Methodological Challenges Presented by COVID-19

NCER
Program: Unsolicited
Award amount: $748,928
Principal investigator: Jonathan Schweig
Awardee: RAND Corporation
Year: 2020
Award period: 2 years (09/01/2020 - 08/31/2022)
Project type: Other
Award number: R305U200006

Purpose

The RAND Corporation and the Northwest Evaluation Association (NWEA) identified promising analytic methods to help practitioners, policymakers, and researchers adapt and respond to COVID-19-induced disruptions to student assessment programs. These methods are intended to support decisions that are often based on student test scores, including:

  • Day-to-day instructional and other student-related decisions
  • School and teacher accountability systems
  • Applied education research and program evaluation

Project Activities

The project team aimed to identify the key decisions to be made when state assessment data for students are lacking, along with promising analytic methods to help make them. To do this, they reviewed the literature, interviewed researchers and practitioners, surveyed state education agencies (SEAs) and education grantmaking organizations, and hosted a technical advisory group. The researchers then examined the performance of the identified methods using both simulation and application to six years of NWEA MAP (Measures of Academic Progress) Growth test scores in reading and math for 7 million kindergarten through 8th grade (K-8) students in over 17,000 public schools. The simulations and applications tested each method and compared the methods against one another and against those being used by SEAs under a variety of conditions introduced by COVID-19 disruptions. In addition, the project team interviewed 15 investigators conducting IES-funded efficacy studies to understand how researchers conducting effectiveness studies of education programs responded to COVID-19 disruptions.

Structured Abstract

Setting

Public schools across the U.S. serving students in K–8 and using the MAP Growth assessment.

Sample

The project team used assessment data from six school years (2015-16 through 2020-21), collected and stored in NWEA's Growth Research Database. NWEA tests over 7 million K-8 students in 17,000 to 19,000 U.S. public schools each school year (representing 23 to 25 percent of the public schools serving K-8 students in the U.S.). Students were tested in all 50 states as well as the District of Columbia.

Research design and methods

The researchers used simulation studies and empirical applications to NWEA data to compare analytical methods for supporting school and district decision making when testing data were not available. To understand how researchers conducting effectiveness studies of education programs responded to COVID-19 disruptions, they interviewed 15 investigators conducting IES-funded efficacy studies.

Control condition

Not applicable

Key measures

The key measures were NWEA MAP Growth scores in mathematics and English language arts.

Data analytic strategy

For the quantitative analyses, the project team employed parametric simulations and empirical applications to MAP Growth data. The researchers used Monte Carlo simulations to evaluate the performance of proposed methods and, where possible, drew on the MAP Growth data to compare methods using real-world data. Qualitative analyses of the interview data used cross-case meta-matrices to identify the prevalence of and patterns among interview responses.
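
To make the simulation logic concrete, the sketch below is a minimal, hypothetical illustration of a parametric Monte Carlo comparison, not the project's actual code: all distributional parameters and the missingness rate are invented for the example. It generates correlated prior- and current-year scores, masks a share of current-year scores, and measures how well two of the replacement strategies described under Key outcomes (simple mean replacement and regression-based prediction) recover the masked values:

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_once(n=1000, miss_rate=0.3):
        """One replication: generate correlated prior- and current-year
        scores, mask some current-year scores, and measure how well two
        replacement strategies recover the masked values."""
        prior = rng.normal(200, 15, n)                 # prior-year scores
        current = 0.8 * prior + rng.normal(40, 9, n)   # correlated current scores
        missing = rng.random(n) < miss_rate            # scores lost to disruption

        # Simple replacement: substitute the mean of the observed scores.
        simple_fill = np.full(missing.sum(), current[~missing].mean())

        # Regression-based prediction from the prior-year score.
        slope, intercept = np.polyfit(prior[~missing], current[~missing], 1)
        regress_fill = intercept + slope * prior[missing]

        def rmse(fill):
            return np.sqrt(np.mean((fill - current[missing]) ** 2))

        return rmse(simple_fill), rmse(regress_fill)

    results = np.array([simulate_once() for _ in range(2000)])
    print("Mean RMSE -- simple: %.2f, regression: %.2f" % tuple(results.mean(axis=0)))

In a setup like this, regression-based prediction recovers individual scores better than mean replacement because it uses the prior-year score; the project's actual simulations varied the conditions and compared a wider set of methods against SEA practice.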

Key outcomes

Regarding day-to-day instructional and other student-related decisions (Schweig et al. 2021):

  • Missing assessment data complicates course placement processes, in which test scores typically play a key role. Schools and systems used a variety of approaches to replace missing scores, including simple replacement, multiple replacement, and regression-based score prediction (see the sketch after this list).
  • Consistent course placement decisions could be made using the three replacement strategies, although much depends on the district context.
  • Due to variation in school quality, assuming average school quality when using regression-based methods can either overestimate or underestimate some students' future achievement, and this misestimation is problematic for course placement decisions.
  • There is evidence that method performance differs by student race and ethnicity and by school poverty, which can affect regression-based results.
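
The three replacement strategies named above can be sketched in a few lines. The example below is hypothetical (the cut score, sample sizes, and score distributions are all invented, and this is not the project's code); it shows how simple replacement, regression-based prediction, and multiple replacement can produce different placement calls for the same student:

    import numpy as np

    rng = np.random.default_rng(1)
    CUT = 210.0  # hypothetical placement cut score

    # Hypothetical records: prior-year scores known for everyone; current-year
    # scores are missing for three students awaiting a placement decision.
    prior_obs = rng.normal(205, 15, 400)
    curr_obs = 0.8 * prior_obs + rng.normal(45, 9, 400)
    prior_miss = np.array([195.0, 212.0, 230.0])

    # 1) Simple replacement: every student receives the observed mean.
    simple = np.full_like(prior_miss, curr_obs.mean())

    # 2) Regression-based prediction from the prior-year score.
    slope, intercept = np.polyfit(prior_obs, curr_obs, 1)
    regression = intercept + slope * prior_miss

    # 3) Multiple replacement: repeated regression draws with residual noise;
    #    the placement call becomes a share of draws clearing the cut score.
    resid_sd = np.std(curr_obs - (intercept + slope * prior_obs))
    draws = regression + rng.normal(0, resid_sd, (50, prior_miss.size))
    prob_place = (draws >= CUT).mean(axis=0)

    for i, p in enumerate(prior_miss):
        print(f"prior={p:.0f}  simple={'yes' if simple[i] >= CUT else 'no'}  "
              f"regression={'yes' if regression[i] >= CUT else 'no'}  "
              f"multiple: P(place)={prob_place[i]:.2f}")

Unlike a single predicted score, the multiple-replacement column reports a probability of clearing the cut rather than a yes/no call, which keeps the uncertainty in the replacement visible to the placement decision.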

Regarding school and teacher accountability systems, including the use of COVID-19 vulnerability information in resource allocation decisions and the influence of pandemic-era changes in school composition on school-level accountability measures (Schweig et al. 2022a, 2022b):

  • School poverty more strongly predicts performance and progress during the pandemic than pre-COVID-19 academic measures.
  • In elementary schools, pandemic vulnerability independently predicts performance and progress even when conditioning on poverty and pre-pandemic achievement.
  • Of the indicators of poverty, the percentage of students eligible for free and reduced-price lunch is the strongest predictor of performance and progress during the pandemic.
  • Within and among districts, there was wide variability in the percentage of students who attended the same schools and participated in MAP Growth assessments over 2 academic years.
  • Participation in MAP Growth assessments was uneven in 2020–2021. In particular, students of color were less likely to have attended the same schools and participated in MAP Growth assessments over 2 academic years than were White students.
  • Historically higher achieving students who participated in assessments in a given year were generally more likely than their peers to have attended the same schools and participated in MAP Growth assessments over 2 academic years.
  • Schools serving high-poverty communities and communities vulnerable to COVID-19 had systematically fewer students attend the same school and participate in MAP Growth assessments over 2 academic years than other schools.

Regarding applied education research and program evaluation (Bush-Mecenas et al. 2023):

  • Researchers described three practical challenges in conducting efficacy research during the pandemic period: (1) issues with intervention feasibility caused by situational complexity, (2) difficulty with study recruitment, and (3) issues with data availability and concerns about data quality.
  • Researchers needed to modify study timelines.
  • Researchers addressed recruitment challenges by focusing on partnerships and allocating funding to support staffing and incentives.
  • Researchers struggled to strike a balance between the evaluations that were intended and those that could realistically be accomplished.
  • Researchers often made pivots and adaptations that addressed threats to the internal validity of their studies. These pivots and adaptations also had the unintended consequence of raising other threats to validity.
  • Concerns about generalizability and extrapolation were less of a priority for researchers during the pandemic.


People and institutions involved

IES program contact(s)

Allen Ruby

Associate Commissioner for Policy and Systems
NCER

Project contributors

Megan Rebecca Kuhfeld

Co-principal investigator

Andrew McEachin

Co-principal investigator

Louis Mariano

Co-principal investigator

Products and publications

ERIC Citations: Available citations for this award can be found in ERIC.

Select Publications:

Bush-Mecenas, S., Schweig, J. D., Kuhfeld, M., Mariano, L., & Diliberti, M. (2023). Research, interrupted: Addressing practical and methodological challenges under turbulent conditions. Santa Monica, CA: RAND Corporation. WRA1037-1.

Kuhfeld, M., Diliberti, M., McEachin, A., Schweig, J. D., & Mariano, L. T. (2023). Typical learning for whom? Guidelines for selecting benchmarks to calculate months of learning. NWEA Research.

Schweig, J., Kuhfeld, M., Diliberti, M., McEachin, A., & Mariano, L. (2022b). Changes in school composition during the COVID-19 pandemic: Implications for school-average interim test score use. RAND Research Report RR-A1037-2.

Schweig, J., McEachin, A., & Kuhfeld, M. (2020, December 16). Addressing COVID-19's disruption of student assessment. Inside IES Research Blog.

Schweig, J., McEachin, A., Kuhfeld, M., Mariano, L., & Diliberti, M. (2021). Adapting course placement processes in response to COVID-19 disruptions: Guidance for schools and districts. RAND Research Report RR-A1037-1.

Schweig, J., McEachin, A., Kuhfeld, M., Mariano, L., & Diliberti, M. (2022a). Allocating resources for COVID-19 recovery: A comparison of three indicators of school need. Educational Assessment, 27(2), 152-169.

Supplemental information

Issue Examined: The COVID-19 pandemic disrupted state assessment programs, presenting methodological and decision-making challenges for applied education researchers, policymakers, and school administrators, who typically rely on assessment data to monitor student, school, and program progress and performance. The project team examined analytical methods for addressing interruptions in assessment data when testing was not carried out.

Questions about this project?

For answers to additional questions about this project, or to provide feedback, please contact the program officer.

Tags

Mathematics

