February 2014: Over the past six years the Centre for Effective Education (CEE) at Queen’s University Belfast has been developing expertise in the use of randomised controlled trials (RCTs) to evaluate educational programmes. As you may be aware, the CEE used this methodology to evaluate two of CDI’s programmes in Tallaght: Doodle Den and Mate Tricks.
Although RCTs may be more familiar to you in a medical context (e.g. drug trials), their use with educational programmes has a long history; in fact, the first educational trials can be traced back to the 1920s. Experimental methods fell out of fashion among the educational research community from around the 1960s onwards, but in recent years RCTs have once again come to be seen as a useful way of rigorously evaluating educational interventions. While RCTs still have their critics, their renewed use in education has been particularly prominent in the USA. Influenced by these developments across the Atlantic, The Atlantic Philanthropies, which commissioned the Doodle Den and Mate Tricks programmes, encouraged the use of RCTs to evaluate interventions in Ireland. As a result, Ireland has to a large extent led the way in the renewed focus on this type of evaluation in European educational research.
When we began the research in Tallaght over five years ago, the CEE as a research team had considerable experience and many of the underlying skills required to complete such evaluations, but applying an experimental approach within a community setting like Tallaght was quite new to us. Reflecting on the key issues and experiences, we now understand the central importance of working closely with all stakeholders (pupils, parents, teachers, service providers, etc.) in the successful evaluation of an educational programme.
Educational RCTs are primarily concerned with the quantitative measurement of change in children’s lives. These changes are generally referred to as outcomes, and their appropriate measurement is another key issue. It is crucially important that we measure the right things: the particular outcomes the programme is trying to affect, whether that is children’s reading, writing, behaviour, school attendance or attitudes. Ideally, those who developed the programme will have specified clear and realistic outcomes that it hopes to change, although we often find this is not the case, and it is important to spend time at the start of an evaluation working through these in detail with stakeholders. This involves thinking clearly through the process of change and how the programme can deliver the intended outcomes. Once these outcomes have been established, it is then our job as evaluators to find or create appropriate tests for measuring them.
Randomised trials, if carefully implemented, are particularly good at establishing whether a programme works (or not), but they tell us little about why outcomes may have changed. This aspect of the work is sometimes neglected, yet it is important for gaining a deeper understanding of the process of change. It can be addressed through detailed observations of the programme in action and by exploring a range of stakeholders’ views.
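For readers curious about the basic logic behind such a comparison, the sketch below is purely illustrative: it is not the analysis used for Doodle Den or Mate Tricks, and the data, group sizes and score scale are invented. It simply shows children being assigned at random to an intervention or control group, with the difference in a post-test outcome score summarised as an effect size.

```python
import random
import statistics

# Illustrative sketch only: invented data, not the CEE's actual analysis.
# Each child receives a post-test literacy score; allocation to the
# programme (intervention) or to the control group is decided at random.

random.seed(42)

children = [f"child_{i}" for i in range(200)]
random.shuffle(children)
intervention, control = children[:100], children[100:]

# Hypothetical post-test scores: the intervention group is simulated with a
# slightly higher mean purely to show how a positive effect would appear.
scores = {c: random.gauss(105, 15) for c in intervention}
scores.update({c: random.gauss(100, 15) for c in control})

mean_i = statistics.mean(scores[c] for c in intervention)
mean_c = statistics.mean(scores[c] for c in control)
sd_i = statistics.stdev(scores[c] for c in intervention)
sd_c = statistics.stdev(scores[c] for c in control)

# Pooled standard deviation and Cohen's d: a common way of expressing the
# difference between groups in standard-deviation units.
pooled_sd = ((sd_i ** 2 + sd_c ** 2) / 2) ** 0.5
effect_size = (mean_i - mean_c) / pooled_sd

print(f"Intervention mean: {mean_i:.1f}, control mean: {mean_c:.1f}")
print(f"Effect size (Cohen's d): {effect_size:.2f}")
```

Because allocation is random, any systematic difference between the two groups on the chosen outcome can more confidently be attributed to the programme rather than to pre-existing differences between the children.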
Ultimately, our work with CDI in Tallaght has highlighted the real improvements in children’s literacy delivered by the Doodle Den programme. This strong evidence is currently being used to promote the use of Doodle Den for children elsewhere in Ireland, and there has been considerable interest in the programme from further afield.
By Centre for Effective Education (CEE), Queen’s University Belfast.