How Rapid-Cycle Evaluation Can Help School Districts Understand the Value of Their EdTech

Gathering Feedback on EdTech Tools to Guide Decision Making

Every day, district administrators are researching, vetting, budgeting for, procuring and evaluating edtech for their students and teachers. Over the past few years, the question of what works and what doesn't has become critical. That's a good thing, particularly when educators are asking key questions in their own contexts, like:

  • Are we getting the full value from our district’s purchase of product X?

  • Are my students benefiting academically from spending X amount of time per week using product Y?

  • Is product A improving student outcomes on local assessment B?

  • Is product X or product Y more effective for our students in group Z?

To answer these types of questions, educators can use rapid-cycle evaluation (RCE) to generate practical, relevant evidence and make more data-informed decisions about their edtech tools. Unlike a traditional evaluation, RCE is iterative and formative, and can surface actionable insights for administrators much faster (in as little as a few weeks). The evidence helps districts understand what's working best for which students and teachers in their contexts, and that point is key: while studies do exist on the effectiveness of a given edtech product, they offer more general insights that may or may not apply to a district's students and teachers. By running RCEs, such as with LearnPlatform by Instructure's rapid-cycle evaluation engine, district administrators can more efficiently assess the impact of their digital tools across their own local contexts (e.g., specific grade levels, gender identity, ethnicity, free- and reduced-price lunch status).

Here are the basics of the rapid-cycle evaluation process, illustrated with an example research question:

  1. Formulate a strong research question: “What is the impact of Product X on students’ winter 2020 math scores?”

  2. Identify and collect the data needed to answer it: this could include provider usage data, student achievement data, license costs and more.

  3. Prepare the data for analysis.

  4. Run your rapid-cycle evaluation using an RCE tool (a simplified sketch of this step appears after the list).

  5. Review the findings, considering where Product X seems to work well and where it doesn't, for whom and under what conditions.

  6. Share the results and decide what actions to take based on them. Per our example research question: if we found that Product X was improving outcomes in certain grade levels but not others, we could refine teacher professional development and improve the tool's implementation process.

Then… repeat! You could run the process again with spring 2021 math scores, for example. This is the core of the RCE process: iteration!
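To make the analysis step concrete, here is a minimal sketch in Python of the kind of subgroup comparison an RCE surfaces. The column names (grade, weekly_minutes, score_gain), the 30-minutes-per-week usage threshold and the data values are all hypothetical illustrations, not part of LearnPlatform's engine, which handles this analysis (plus matching and statistical testing) for you.

```python
import pandas as pd

# Hypothetical per-student records: grade level, weekly minutes in Product X,
# and fall-to-winter change in math score. Illustrative values only.
records = pd.DataFrame({
    "grade": [3, 3, 3, 4, 4, 4, 5, 5, 5],
    "weekly_minutes": [60, 5, 50, 45, 0, 8, 90, 10, 55],
    "score_gain": [12, 4, 11, 9, 3, 2, 15, 5, 13],
})

# Label students as "users" of Product X with an assumed 30 min/week threshold.
records["user"] = records["weekly_minutes"] >= 30

# Average score gain by grade level and usage group: the "what works, for
# whom, under what conditions" cut that an RCE is designed to surface.
summary = records.groupby(["grade", "user"])["score_gain"].mean().unstack()
print(summary)
```

A full evaluation would add comparison-group matching and significance testing, but the shape of the question stays the same: compare outcomes between users and non-users across the local contexts that matter to your district.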

RCEs empower educators to better understand which edtech tools are and aren't working in their contexts, digging deeper than usage data alone and putting districts in control rather than reliant on an external evaluator. As edtech tools change and improve at an accelerating pace, this credible, research-backed process of testing and retesting helps districts surface the data and insights that benefit their current students.

Want to learn more about rapid-cycle evaluation and how it offers a pragmatic approach to determining edtech effectiveness? Read the complete white paper.

Click here if you’re ready to see how rapid-cycle evaluations can work for your organization.
