The “Magic” of the CERES Network
A central goal of CERES is to produce scientifically grounded interventions that will reduce growing inequalities in children’s learning and development. In addition to unequal access, contemporary children are experiencing a “new digital divide” whereby opportunities, experiences, and personalization of online resources are heavily stratified by family socioeconomic status, geography, and gender.
We now have more data than ever on children’s learning and development in digital spaces, but very little information on whether platforms or spaces track developments in brain and learning science or whether they lead to positive impacts. This is where the “magic” of the CERES network can happen, as leading brain and developmental scientists connect with experts in HCI to optimize the fit between how children learn and the design and evaluation of the digital spaces and platforms supporting them. Real-world data does not yet equal real-world evidence; however, the CERES network can help to close this gap. The network will scale and amplify evidence-based solutions by partnering early with leaders in industry and research to establish a robust evidence base that can be evaluated in real time, with transparent reporting and metrics, and measurable indicators of impact. In addition to collecting and managing robust datasets, the CERES Network will interrogate the appropriate legal and ethical frameworks for data donation for scientific and educational purposes, leading to key contributions beyond the scientific discoveries that such donations are likely to provide.
Karl is an educator, entrepreneur, and advisor who has led four successful education innovation organizations after a career in teaching and education administration in the US and abroad. Most recently, he co-founded and led LearnPlatform, whose groundbreaking work in measuring edtech impact created a market category and an industry standard for evidence. Karl now also leads K-12 strategy for Instructure, the global education technology company committed to elevating student success, amplifying the power of teaching, and inspiring everyone to learn together; LearnPlatform joined Instructure in 2022.
Karl is a graduate of the University of North Carolina at Chapel Hill, where he was a James M. Johnston Scholar. He has been named a BMW Herbert Quandt Transatlantic Fellow, North Carolina Teaching Fellow, and Education Policy Fellow. Karl and his family make their home in Raleigh, North Carolina.
An interview with one of our industry partners: Karl Rectanus of LearnPlatform
So we know LearnPlatform is a mission-driven research organization. We’d love for you to share a little bit about your approach to evidence-based decision-making in Ed Tech.
I would be glad to. So LearnPlatform is a for-benefit research organization. We launched in 2014 to help school districts, their leaders, and their partners figure out which of these different technologies and interventions were actually helping students learn. We launched a platform that districts and states use to manage all of their education technology. It includes a technology we call “IMPACT™,” our rapid-cycle evaluation engine, which integrates usage, student achievement, contract, and other data to create a third-party evaluation—a correlative, comparative, quasi-experimental, or fully experimental analysis—and visualizations in a matter of minutes and hours, instead of months and years.
With IMPACT™ administrators can answer questions such as: “If we’re paying for it, do we use it and is it effective? In which situations and for which student groups does it appear to be having an effect on math scores? On other achievement? On attendance? On whatever metrics of success that we’d like to consider.”
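To make the idea of a rapid-cycle comparative analysis concrete, here is a minimal, hypothetical sketch of the kind of question described above: did students who used an intervention see larger score gains than those who did not, overall and within a subgroup? The function name, data layout, and numbers are all illustrative assumptions, not LearnPlatform’s actual API or data.

```python
# A minimal sketch of a rapid-cycle comparative analysis: compare outcome
# gains for students who used an intervention vs. those who did not,
# overall and within a subgroup. All names and data are illustrative.
from statistics import mean

def comparative_analysis(records, subgroup=None):
    """Return mean score gains for users vs. non-users of an intervention.

    records: list of dicts with keys 'used' (bool), 'gain' (float),
             and 'group' (str). If subgroup is given, filter to it first.
    """
    if subgroup is not None:
        records = [r for r in records if r["group"] == subgroup]
    users = [r["gain"] for r in records if r["used"]]
    non_users = [r["gain"] for r in records if not r["used"]]
    return {
        "users_mean_gain": mean(users) if users else None,
        "non_users_mean_gain": mean(non_users) if non_users else None,
        "n_users": len(users),
        "n_non_users": len(non_users),
    }

# Illustrative data: gains on a quarterly math assessment.
data = [
    {"used": True,  "gain": 8.0, "group": "ELL"},
    {"used": True,  "gain": 7.0, "group": "ELL"},
    {"used": False, "gain": 3.0, "group": "ELL"},
    {"used": True,  "gain": 4.0, "group": "non-ELL"},
    {"used": False, "gain": 4.5, "group": "non-ELL"},
]

overall = comparative_analysis(data)
ell_only = comparative_analysis(data, subgroup="ELL")
```

In practice a real evaluation would add controls, matching, or randomization to move from a correlative comparison toward the quasi-experimental and experimental designs mentioned above; the subgroup filter is what lets an administrator ask “for which student groups is this working?”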
Can you talk about what it means to you to translate research into practice?
Our founding researcher, Dr. Daniel Stanhope, is often quoted as saying we bring “practical rigor” to decision-making. I think traditional research has taken too long, has often been too expensive, and is published in journals that are not accessible to teachers and administrators. That makes it difficult to use traditional research to make decisions. So if we can support the application of rigorous processes at the speed of decision-making—which, granted, is not really that fast in education, but is certainly faster than traditional research—then we have a much higher likelihood of helping the entire market be more equitable.
Can you tell us a bit more about that intersection between evidence and efficiency? Because that seems like such an important piece of what you’re doing.
It really is critical. In almost every other sector of our lives, we are used to having data analytics to help inform decision-making. If I go for a walk in the morning, as I’m walking, I know my heart rate and pace compared to the previous day. And if I want, I can compare that to everyone else in my neighborhood. All of these different systems in our world have that, but education doesn’t. Administrators and teachers are often forced to look at data from last year’s students to inform budgeting or decisions for next year’s kids. That’s literally a three-year lag that we’re talking about. So the massive opportunity is to bring rigor, but also efficiency, to getting that information in the hands of people quickly.
With rigorous and efficient data analytics available, we don’t start making decisions and best guesses for kids that look like these kids—we make better decisions for these kids that impact what’s happening for them in their instructional practice and in their community that same school year! That drives learning in a much more efficient and effective way. And that saves a bunch of money too. So I think the biggest value proposition is to actually start making better decisions for the students we’re teaching right now, not just the students that will look like these kids in a few years.
So on that note, can you talk a little bit about how this evidence-based decision-making that’s done with efficiency helps close those equity gaps and improve outcomes for students? What’s a real life example of how that could work?
Absolutely. So Granite County is one of the top five or six largest districts in Utah. A few years ago, between Christmas and New Year’s, one of their administrators decided to analyze the end of quarter assessments for a number of math interventions. Using LearnPlatform they were able to do subgroup analyses of different cohorts. Over the course of a couple of hours, he was able to see that one of the interventions was having an outsized positive impact for their English language learning students—for their non-native English speakers. And as they came back to the second semester, they shared this data with their math team. So in that second semester, they made three changes.
First, they started implementing that intervention in schools with a high number of English-as-a-second-language students. Second, they deprioritized it and looked at other interventions for students in their highest-achieving band, because that band was not seeing the same effect the English language learners were. Third, they didn’t buy as many licenses the next year, because they only needed that specific intervention for a subset of their students. January and February are budgeting season, so they recognized that they could purchase roughly half as many licenses as before and lower their bill. By the way, the intervention provider was ecstatic with the results, especially since the district put its savings toward professional development, specifically for teachers in schools with high populations of English language learners. Not kids that look like those kids, those kids! There are plenty of other examples on the provider side where, for example, you’re seeing ESSA-aligned, rapid-cycle studies on everything from high-dosage tutoring, to social-emotional learning interventions, to communications technologies for parents and families. All of these studies evaluate if, when, and where the interventions have positive impacts on outcomes that matter for students, teachers, and families. Having clear data in a timely manner allows administrators and teachers to implement tools differently than they have in the past—a past when all that data was opaque.
After the recent acceleration in Ed Tech, where do you see things going next?
Well, learning is now tech-enabled. I think the thing we see coming is a need, and a focus, on building the capacity of both institutions and people to use that technology, and its corresponding evidence, to its full power. Most folks who have been in the field for some time have never had access to the data they could now have access to. And frankly, they’re unprepared.
For example, where you used to host a staff meeting to get feedback on a specific intervention and give everyone two weeks afterward to report back, these days you can have all that information before you even get to the meeting, in a more equitable, extensive, and objective way. It changes the way systems work when you can have usage data and evidence before you get to budgeting conversations. The other thing we’re seeing is the leapfrogging of machine learning and artificial intelligence. If our school systems’ capacity-building doesn’t catch up, they will be overwhelmed by the pace of innovation.