ESSA Update: Easy, Automated Reports for Your EdTech Expenditures and Outcomes
Before summer starts in earnest, end-of-year reporting is one of the last items standing between teachers and some well-deserved time off. And while students head off to camps and teachers get a break, district administrators have their hands full this summer making sure every student succeeds next year.
In fact, administrators have told us that the changes in federal and state programs, including the Every Student Succeeds Act (ESSA), feel like a constantly moving target. Luckily, ESSA actually provides significant flexibility to districts and states in how they use dollars, as long as they show evidence for their decisions.
LearnPlatform’s automated reports and research-driven framework for rapid evaluation directly align with ESSA guidelines for evidence, so administrators in districts of any size can make data-driven decisions AND comply with all federal reporting requirements easily — without exorbitant spending on third-party evaluators.
With Evidence Comes Flexibility
ESSA gives districts flexibility in their expenditures when they provide evidence that supports their instructional technology decisions*. Organizations, including those applying for school improvement grants or using Title I funds, must demonstrate this evidence to be eligible for funds.
There are four levels of evidence highlighted in the guidance:
- Strong: Experimental studies with large sample and positive effect
- Moderate: Quasi-experimental studies with large sample and positive effect
- Promising: Correlational studies with statistical controls and valid sample
- Demonstrates a Rationale: Well-defined logic model with ongoing efforts to determine effects
This is no unfunded mandate: Title I and Title IV explicitly allow, and encourage, up to 15% of funds to be used to manage and measure this evidence. As stated in the Department of Education guidance on the use of funds under ESSA, a Local Education Agency (LEA) can use funds "to purchase or create a system that improves the procurement and evaluation process for identifying solutions and implementations that match the context of the SEA or LEA" (page 35, paragraph 2). The guidance stipulates that funds can be used to support evidence gathering and analysis in order to support educational research and innovation.
Reporting Doesn’t Have to Be Overwhelming
A comprehensive edtech management platform allows learning organizations to manage all facets of their edtech ecosystem and to systematize every stage of management — from discovery to evaluation to contract and portfolio management and impact measurement. Bringing these assets and data into a single system also supports a broader, integrated process for selecting and implementing proven programs, and it equips curriculum and instruction leaders, superintendents and principals with consistent, shareable evidence of what is being used and what is driving learning outcomes for students.
LearnPlatform’s automated research design aligns with the evidentiary standards outlined in ESSA, measuring program effectiveness through IMPACT 2.0 rapid-cycle evaluation.
This reporting also meets state-level requirements, such as California’s LCAP and Utah’s Digital Learning Initiative. Lea(R)n’s research and analytics team works with dozens of organizations across the country, supporting hundreds of analyses of hundreds of edtech products and empowering schools, districts, states and networks to make informed decisions.
The end of the year is the best time for organizations to test-drive IMPACT Analysis on an edtech product. Running an analysis meets ESSA requirements, helps you gauge product effectiveness and quickly produces evidence of outcomes measurement using the data you already have.