Using evidence to help build and evaluate good ideas in education technology

Freya Lucas
Apr 05, 2020
As researchers, we care that our educational systems improve, support all learners, and are grounded solidly in research evidence. But how do we work with stakeholders like educational technology startups to support effective use of that evidence? This question matters to researchers and practitioners alike, because we care about evaluating and scaling good ideas. By ‘scaling’ we mean adjusting and improving good ideas as they are rolled out and used.
Some common ways that people think about how we build evidence and scale innovations include:
- taking approaches tested in controlled settings and implementing them
- looking for ‘success stories’ and trying to copy lessons from them and
- taking a systematic approach: analysing the context for places to change and evaluating those changes (the improvement method).
Research on how we use evidence in policy and practice (and policy practice) can help inform us when we try to work with startups and other stakeholders on education projects. Paul Cairney, Professor of Politics and Public Policy at the University of Stirling in the UK, compares the three approaches in the table below.

Three approaches to evidence-based policy-making
Emulate Approach
In much work in education, we are looking to implement programs or technologies in contexts using an emulation approach: copying tested interventions. In our teaching, that can also result in coming at research from a top-down perspective, using key studies and methods but with a disconnect from local needs and context.
But these interventions are critiqued as simplistic in the education context because they imply that interventions occur in a vacuum, rather than in a complex setting where lots of other interventions are already going on. We might be evaluating a program that has already been implemented, and often our implementation process doesn’t follow this linear model.
Storytelling Approach
The pushback against the emulation approach is sometimes to focus very heavily instead on local context and storytelling. This approach respects the expertise of professionals - which is important - but can result in key lessons not being distilled and shared, idiosyncratic ‘hit or miss’ practices, and ad hoc improvement cycles that may be driven by particular interests.
In the edtech space, much of the evaluation conducted by providers is based on testimonials. Although these can be useful, they’re typically not going to get at deeper issues of learning or help us evaluate our work.
Improvement Methods
Improvement methods, then, have been adopted in education systems: explicitly, for example, by the Carnegie Foundation, an independent research and policy centre in the US, and arguably in other forms such as research-practice partnerships (collaborative, long-term relationships between researchers and practitioners, designed to address problems of practice in education) and other design-based research approaches. Because these approaches work closely with practitioners to connect theory and real-world problems, they attempt to avoid ‘transmissive’ (one-way) communication of research.
Our UCL EDUCATE project
At UCL (University College London) – which Simon recently visited while on sabbatical – the EDUCATE project has been created to help build a stronger evidence base in the EdTech sector. It takes this kind of improvement-oriented approach, visualised through the ‘golden triangle’ connecting EdTech companies, entrepreneurs and start-ups with first-class business trainers, experts and mentors.

The UCL EDUCATE project worked with 252 small to medium-sized enterprises (a maximum of 250 employees and less than £5m annual turnover) in 12 cohorts between 2017 and 2019. The idea was to get EdTech creators, educators, investors and policy makers working together to understand what “works for learners and how to use technology to serve its users effectively.” As the programme developed, it shifted from general introductions to research methods and established research knowledge, towards greater recognition that the nature of evidence is varied and serves different purposes for enterprises at different stages of development.
The EDUCATE programme avoided the issue of transmissive or emulation-based research by building capacity in educational technology enterprises to conduct their own research, using theories of change to generate practical, robust research. The aim, then, isn't just to translate research into practice, or to implement outcomes from randomised controlled trials (RCTs), but to move from storytelling about products to an improvement mindset.
UTS Implementing Learning Analytics
In the work we've been conducting at the University of Technology Sydney, we’ve taken a kind of improvement-based approach: looking at existing teaching practices and seeking to augment them, rather than simply dropping in a new technology without understanding the context, or requiring a particular type of teaching for it to be used. Our focus is improvement-oriented innovation. This approach is intended to improve adoption and to support existing good practices by learning from them and amplifying them through the technology.
We believe it is important, when we think about the role of new technologies and approaches in education, to consider the way we use evidence. Understanding the different approaches – emulation, storytelling, or improvement – and how they work to achieve impact can be invaluable to all stakeholders.



This article was originally published on EduResearch Matters. Read the original article.


















