Daniel Willingham--Science & Education
Hypothesis non fingo

A New Push for Science in Education in Britain

3/26/2013

 
Ben Goldacre is a British physician and academic, and is the author of Bad Science, an exposé of bad medical practice based on wrong-headed science. For the last decade he has written a terrific column of the same name for the Guardian.

Goldacre has recently turned his critical scientific eye to educational practices in Britain. He was asked by the British Department for Education to comment on the use of scientific data in education and on the current state of affairs in Britain. You can download the report here.

So what does Goldacre say?

He offers an analogy between education and medicine: the former can benefit from the application of scientific methods, just as the latter has.

Goldacre touts the potential of randomised controlled trials (RCTs). You randomly assign students to two groups, administer an intervention (a new instructional method for long division, say) to one group and not the other, and then compare how the groups did.
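As a toy illustration of that logic (not from Goldacre's report; the students, scores, and treatment effect are all invented), here is what random assignment plus a simple permutation test might look like, sketched in Python with only the standard library:

```python
import random
import statistics

random.seed(0)

# 40 hypothetical students; shuffling is the random-assignment step.
students = list(range(40))
random.shuffle(students)
treatment, control = students[:20], students[20:]

# Simulated post-test scores: pretend the new method adds ~5 points
# on top of a baseline of 70, with noise (all numbers made up).
def score(s):
    return random.gauss(70 + (5 if s in treatment else 0), 10)

scores = {s: score(s) for s in students}

observed_diff = (statistics.mean(scores[s] for s in treatment)
                 - statistics.mean(scores[s] for s in control))

# Permutation test: how often does a random relabeling of the groups
# produce a difference at least as large as the observed one?
n_perm = 5000
count = 0
for _ in range(n_perm):
    random.shuffle(students)
    fake_t, fake_c = students[:20], students[20:]
    diff = (statistics.mean(scores[s] for s in fake_t)
            - statistics.mean(scores[s] for s in fake_c))
    if diff >= observed_diff:
        count += 1

print(f"observed difference: {observed_diff:.1f} points")
print(f"permutation p-value: {count / n_perm:.3f}")
```

The random assignment is what does the work: because neither prior ability nor anything else determines who gets the new method, a reliably large difference between the groups can be credited to the intervention itself.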

Goldacre also speculates on what institutions would need to do to make the British education system as a whole more research-minded. He names two significant changes:
  • There would need to be an institution that communicates the findings of scientific research (similar to the American "What Works Clearinghouse").
  • British teachers would need a better appreciation of scientific research, so that they would understand why a particular practice was touted as superior and could evaluate the evidence for such claims themselves.
I'm a booster of science in education
As someone who has written shorter and book-length treatments of the role that scientific research might play in education, I'm very excited that Goldacre has made this thoughtful and spirited contribution.

I offer no criticisms of what Goldacre suggests, but would like to add three points.

First, I agree with Goldacre that randomized trials allow the strongest conclusions. But I don't think that we should emphasize RCTs to the exclusion of all other sources of data. After all, if we continue with Goldacre's analogy to medicine, I think he would agree that epidemiology has proven useful.

As a matter of tactics, note that the What Works Clearinghouse emphasized RCTs to the near exclusion of all other types of evidence, and that came to be seen as a problem. If you exclude other types of studies, the available data will likely be thin. RCTs are simply hard to pull off: they are expensive, and they require permission from lots of people. Hence, the What Works Clearinghouse ended up being agnostic about many interventions--"no randomized controlled trials yet." Its impact has been minimal.

Other sources of data can be useful: smaller-scale studies and, especially, basic scientific work that bears on the underpinnings of an intervention.

We must also remember that each RCT--strictly interpreted--offers pretty narrow information: method A is better than method B (for these kids, as implemented by these teachers, etc.). Allowing other sources of data into the picture potentially offers a richer interpretation.

As a simple example, shouldn't laboratory studies showing the importance of phonemic awareness influence our interpretation of RCTs in preschool interventions that teach phonemic awareness skills?

Second, basic scientific knowledge gleaned from cognitive and developmental psychology (and other fields) can do more than help us interpret the results of randomized trials; that knowledge can be useful to teachers on its own. Just as a physician uses her knowledge of human physiology to diagnose a case, a teacher can use her knowledge of cognition to "diagnose" how best to teach a particular concept to a particular child.

I don't know about Britain, but this information is not taught in most American schools of education. I wrote a book about cognitive principles that might apply to education. The most common reaction I hear from teachers is surprise (and often anger) that they were not taught these principles when they trained.

Elsewhere I've suggested we need not just a "what works" clearinghouse to evaluate interventions, but a "what's known" clearinghouse for basic scientific knowledge that might apply to education.

Third, I'm uneasy about the medicine analogy. It too easily leads to the perception that science aims to prescribe what teachers must do, that science will identify one set of "best practices" which all must follow. Goldacre makes clear on the very first page of the report that's NOT what he's suggesting, but the non-doctors among us see medicine this way: I go to my doctor, she diagnoses what's wrong, and there is a standard way (established by scientific method) to treat the disease.

That perception may be in error, but I think it's common.

I've suggested a different analogy: architecture. When building a house, an architect must respect certain basic facts set out by science. Physics and materials science loom large for the architect; for educators it might be psychology, sociology, and other fields. These rules represent limiting conditions, but so long as you stay within those boundaries there are lots of ways to get it right. Just as physics doesn't tell the architect what the house must look like, cognitive psychology doesn't tell teachers how they must teach.

RCTs play a different role. They provide proof that a standard solution to a common problem is useful. For example, architects routinely face the problem of ensuring that a wall doesn't collapse when a large window is placed in it, and there are standard solutions to this problem. Likewise, educators face common problems, and RCTs hold the promise of providing proven solutions. Just as the architect doesn't have to use any of the standard methods, the teacher needn't use a method proven by an RCT. But the architect needs to be sure that the wall stays up, and the teacher needs to be sure that the child learns.

I made one of my garage-band-quality videos on this topic.

There's more to this topic--what it will mean to train teachers to evaluate scientific evidence, the role of schools of education. Indeed, there's more in Goldacre's report and I urge you to read it. Longer term, I urge you to consider why we wouldn't want better use of science in educational practice.

Cone of learning or cone of shame?

2/25/2013

 
A math teacher and Twitter friend from Scotland asked me about this figure.
[Figure: a version of the "learning pyramid"]
I'm sure you've seen a figure like this. It is variously called the "learning pyramid," the "cone of learning," "the cone of experience," and others. It's often attributed to the National Training Laboratory, or to educator Edgar Dale.

You won't be surprised to learn that there are different versions out there, with different percentages and some minor variations in the ordering of activities.

Certainly, some mental activities are better for learning than others. And the ordering offered here doesn't seem crazy. Most people who have taught agree that long-term contemplation of how to help others understand complicated ideas is a marvelous way to improve one's own understanding of those ideas--certainly better than just reading them--although the estimate of 10% retention of what one reads seems kind of low, doesn't it?

If you enter "cone of experience" in Google scholar the first page offers a few papers that critique the idea, e.g., this one and this one, but you'll also see papers that cite it as if it's reliable.

It's not.

So many variables affect memory retrieval that you can't assign specific percentages of recall without specifying many more of them:
  • what material is recalled (gazing out the window of a car is an audiovisual experience just like watching an action movie, but your memory for these two audiovisual experiences will not be equivalent)
  • the age of the subjects
  • the delay between study and test (obviously, the percent recalled usually drops with delay)
  • what were subjects instructed to do as they read, demonstrated, taught, etc. (you can boost memory considerably for a reading task by asking subjects to summarize as they read)
  • how memory was tested (percent recalled is almost always much higher for recognition tests than for recall tests)
  • what subjects know about the to-be-remembered material (if you already know something about the subject, memory will be much better)
This is just an off-the-top-of-my-head list of factors that affect memory retrieval. These factors not only make it clear that the percentages suggested by the cone can't be counted on, but also that the ordering of the activities could shift, depending on the specifics.



The cone of learning may not be reliable, but that doesn't mean that memory researchers have nothing to offer educators. For example, a monograph published in January offers an extensive review of the experimental research on different study techniques. If you prefer something briefer, I'm ready to stand by the one-sentence summary I suggested in Why Don't Students Like School?: it's usually a good bet to try to think about material at study in the same way that you anticipate you will need to think about it later.

And while I'm flacking my books, I'll mention that When Can You Trust the Experts? was written to help you evaluate the research basis of educational claims, cone-shaped or otherwise.



    Purpose

    The goal of this blog is to provide pointers to scientific findings that are applicable to education that I think ought to receive more attention.
