BY JOHN BUSH

Imagine you’re the head of a high school math department.  Every year, you face a challenge: a group of students will enter from middle school struggling with basic math skills.  You know that these students won’t just struggle in math but will find it difficult to access the curriculum in other subject areas, too.  If you can help them close the achievement gap with their peers, they’ll be much more likely to have success throughout high school and life. 

Of course, there are any number of programs or approaches you could choose to meet this challenge.  You might know about them from colleagues in your professional network, from the system or association you're part of, or from your online community. How do you know which one to choose?

The value of evidence

School leaders face challenges like this regularly.  To choose wisely in the face of such challenges, they should look to high-quality evidence to support their professional judgement.  Robust evidence can give them confidence that the approaches they select have the best chance of working for their students.  If they don’t look to rigorous evidence, they run the risk of wasting time and resources on approaches that are ineffective or, at worst, may actually do harm. 

For example, one pull-out reading program in the UK raised students’ reading levels over the course of a term, but similar students who just stayed in class actually improved their reading levels more.  Schools using this program would have improved their outcomes by keeping students in class, rather than pulling them out for the program.  Only a well-designed study picked this up, though. 

Further afield, a recent Australian study published in The Lancet showed that a school-based intervention designed to reduce teenage pregnancy rates actually increased them!  So, when educators think about evidence-informed or evidence-based practice, they need to think carefully about the rigour of the evidence they consider.

Evidence resources

One resource that can help them with this is the Teaching and Learning Toolkit, a free online summary of rigorous global research into educational approaches ranging from arts participation and feedback to reducing class size and repeating a year.  The Toolkit was developed by England’s Sutton Trust and Education Endowment Foundation and is now being adapted for many other jurisdictions.  (There may even be an Ontario version soon.)  The organisation I worked for until recently, Evidence for Learning (E4L), hosts the Toolkit in Australia.

E4L hopes the Toolkit provides a good introduction for educators who want to inform their decisions with strong evidence.  For each approach, the Toolkit provides three key pieces of information: (1) how many extra months’ progress students make in a year; (2) how much the approach costs; and (3) how secure the evidence is.  Dedicated pages for each approach give further detail, including key things to think about if a school is considering implementing an approach. 

The Toolkit offers a good guide to average effects, drawn from research around the world.  Unfortunately, there is not always good evidence about the particular programs or approaches schools might be considering.  The developers of a program may have anecdotal data (or great testimonial quotes in their brochure) suggesting that the program works, and the school down the road may say that it worked for their students.  Even the best-evaluated programs may only have before-and-after data or correlational evidence.  Often, this is the best evidence a school leader can find to suggest a program might be effective for their students. 

Addressing the evidence gap

Efforts are underway around the world to address this gap.

E4L is trying to address the gap in Australia through the Learning Impact Fund, which aims to identify promising Australian educational programs, fund them to deliver at greater scale, and commission rigorous, independent evaluations, typically in the form of a randomised controlled trial (RCT).  The power of an RCT lies in allowing researchers to say that a program caused a particular outcome, because random assignment controls for all other factors.  With this level of rigour, school leaders can be confident in programs that show good effects.  (If you would like to read more, I have written about the promise of RCTs, and their appropriate limits, in “Systems that Learn,” a piece for the Social Ventures Australia Quarterly.)

The Learning Impact Fund builds on work that has been happening internationally for the last ten to fifteen years.  In that time, both the US and England have undertaken national programs of RCTs, with publicly available results.  The US Department of Education’s Institute of Education Sciences publishes reports on specific programs through its What Works Clearinghouse.  Despite the seeming bluntness implied in ‘what works,’ the site allows filtering across a number of demographic categories, so that school and district leaders can see how well the research base might apply in their own contexts.  Similarly, in England, the Education Endowment Foundation has committed to publishing plain-English summaries of reports on the trials it has commissioned.  It categorises the evidence on its site according to ‘school themes’ chosen in conjunction with teachers.

Of course, resources like these are not a panacea.  But they can help teachers and school leaders make decisions informed by the best available evidence about what has worked and what hasn’t.  If you’re that head of a math department, looking to the evidence can give you more confidence that you’re giving those struggling kids the best chance to succeed in your school and in life.

 

ABOUT THE AUTHOR

John Bush recently became the General Manager, Education at the Paul Ramsay Foundation. He was formerly the Associate Director of Education at Social Ventures Australia and part of the leadership team of Evidence for Learning. In this role, he managed the Learning Impact Fund, a new fund that builds rigorous evidence about Australian educational programs. John has more than 15 years’ experience in the education and non-profit sectors as a classroom teacher and organisational leader. He managed the SVA Education Dialogue in 2014 and 2015 and the Growing Great Teachers project. 

From 2010 to 2014, he was Director of Learning and Development at High Resolves, with accountability for curriculum design, recruitment and development of teaching staff, and evaluation. During those years, High Resolves grew from working with 15 schools to working with more than 100 schools and 20,000 students annually in every state and territory.