Goldacre has recently turned his critical scientific eye to educational practices in Britain. The British Department for Education asked him to comment on the use of scientific data in education and on the current state of affairs there. You can download the report here.
So what does Goldacre say?
He offers an analogy between education and medicine: the former can benefit from the application of scientific methods, just as the latter has.
Goldacre touts the potential of randomized controlled trials (RCTs). You take a group of students, randomly assign them to two groups, administer an intervention (a new instructional method for long division, say) to one group and not to the other, and then compare how the two groups did.
Goldacre also speculates on what institutions would need to do to make the British education system as a whole more research-minded. He names two significant changes:
- There would need to be an institution that communicates the findings of scientific research (similar to the American What Works Clearinghouse).
- British teachers would need a better appreciation of scientific research, so that they would understand why a particular practice was touted as superior and could themselves evaluate the evidence for the claim.
I offer no criticisms of what Goldacre suggests, but would like to add three points.
First, I agree with Goldacre that randomized trials allow the strongest conclusions. But I don't think that we should emphasize RCTs to the exclusion of all other sources of data. After all, if we continue with Goldacre's analogy to medicine, I think he would agree that epidemiology has proven useful.
As a matter of tactics, note that the What Works Clearinghouse emphasized RCTs to the near exclusion of all other types of evidence, and that came to be seen as a problem. If you exclude other types of studies, the available data will likely be thin. RCTs are simply hard to pull off: they are expensive, and they require permission from lots of people. Hence, the What Works Clearinghouse ended up being agnostic about many interventions--"no randomized controlled trials yet." Its impact has been minimal.
Other sources of data can be useful: smaller-scale studies and, especially, basic scientific work that bears on the underpinnings of an intervention.
We must also remember that each RCT--strictly interpreted--offers pretty narrow information: method A is better than method B (for these kids, as implemented by these teachers, etc.). Allowing other sources of data into the picture potentially offers a richer interpretation.
As a simple example, shouldn't laboratory studies showing the importance of phonemic awareness influence our interpretation of RCTs in preschool interventions that teach phonemic awareness skills?
Second, I don't know about Britain, but this sort of basic scientific knowledge is not taught in most American schools of education. I wrote a book about cognitive principles that might apply to education. The most common remark I hear from teachers is surprise (and often, anger) that they were not taught these principles when they trained.
Elsewhere I've suggested we need not just a "what works" clearinghouse to evaluate interventions, but a "what's known" clearinghouse for basic scientific knowledge that might apply to education.
Third, I'm uneasy about the medicine analogy. It too easily leads to the perception that science aims to prescribe what teachers must do, that science will identify one set of "best practices" which all must follow. Goldacre makes clear on the very first page of the report that's NOT what he's suggesting, but the non-doctors among us tend to see medicine this way: I go to my doctor, she diagnoses what's wrong, and there is a standard way (established by scientific method) to treat the disease.
That perception may be in error, but I think it's common.
RCTs play a different role. They provide proof that a standard solution to a common problem is useful. For example, architects routinely face the problem of ensuring that a wall doesn't collapse when a large window is placed in it, and there are standard solutions to this problem. Likewise, educators face common problems, and RCTs hold the promise of providing proven solutions. Just as the architect doesn't have to use any of the standard methods, the teacher needn't use a method proven by an RCT. But the architect needs to be sure that the wall stays up, and the teacher needs to be sure that the child learns.
I made one of my garage-band-quality videos on this topic.
There's more to this topic--what it will mean to train teachers to evaluate scientific evidence, the role of schools of education, and so on. Indeed, there's more in Goldacre's report, and I urge you to read it. Longer term, I'd ask you to consider why we wouldn't want better use of science in educational practice.