Daniel Willingham--Science & Education
Hypothesis non fingo

Kristof on marginalized professors--he's partly right

2/16/2014

 
Today in the New York Times, Nick Kristof writes that university professors have “marginalized themselves.” We have done so, he suggests, by concerning ourselves with very specialized topics that are removed from practical realities. (Old joke: the secret to academic success is to dig an intellectual trench so narrow and so deep that there is only room for one.)

The second part of the problem, Kristof suggests, is that academics write inaccessible prose, isolating their knowledge from others. He approvingly quotes Harvard historian Jill Lepore, who says academics have created “a great, heaping mountain of exquisite knowledge surrounded by a vast moat of dreadful prose.”

Kristof’s solution is that academics do more writing for the public, on topics of practical concern. To make this change possible, universities would need to change the systems by which they evaluate faculty for promotion.

I think he’s partly right.

Kristof did not distinguish between faculty in Arts & Sciences and those in professional schools such as law, medicine, education, and engineering. The latter have practical application embedded in their mission, and I think they are therefore more vulnerable to his charges.

I started writing about the application of cognitive science for teachers exactly because I thought that too many teachers were not learning this information in their training at schools of education.

But for typical Arts & Sciences faculty, application is not part of the mission. I think two factors make Kristof’s suggestion that it become part of the mission impractical. I’m a scientist, so I’ll write from that perspective, and won’t claim that the following applies to the humanities.

Universities and the professors they employ are best seen as part of a larger system that includes government and private industry. The seminal document envisioning that system was written by Vannevar Bush in 1945. Bush was the director of the Office of Scientific Research and Development during World War II, through which virtually all of the scientific research for the war effort was funneled.

It was plain to all that science had played a lead role in the war. The Federal government had funded scientific research at an unprecedented scale, but what was the government role to be in the coming peace? President Roosevelt asked Bush to write a report on the matter.

Bush argued that research can be either basic (“pure science,” which boils down to describing the world as it is) or applied (research in service of some practical goal). He argued for two points: First, that basic research lies behind the success of much practical research; e.g., the Manhattan Project was a grand practical application made possible by advances in basic physics research. Second, that applied research would inevitably crowd out basic research for funding because it offers short-term gains.

Bush concluded that the Federal Government should continue its funding of science in peacetime, and that it should focus on basic research. Industry could fund research and development for application, and it was reasonable to expect that industry would do so. The federal investment was justifiable via the pay-off in economic productivity. That’s how the National Science Foundation was born.

Basic research has been housed primarily in the university system. That’s our role in the system. We’re not really here to work on applied problems. If a drug company wants to know the latest findings from molecular biology, they should hire a molecular biologist who will do the translation.

This arrangement actually makes a lot more sense than expecting academics to do the translation themselves.

Translation is more than explaining technical matters in everyday terms. It requires knowing how to exploit the technical findings in a way that serves the practical goal. For example, in education you can’t just take findings from cognitive science and pop them into the classroom, expecting kids will learn better. You need to know something about classrooms to understand how the application might work.

It makes more sense for the translators to be close to the site of the application because application can take so many forms. Cognitive science has applications throughout industry, the military, health care, education, and beyond. You really need to be embedded in the locale to understand the problem that the basic science is meant to solve.

So that’s why so many academics, when asked why they don’t make their work more accessible to the general public, say “that’s not my job.” We might add (and this is relevant to Kristof’s second point) that most of us are not very good at describing what we do in non-technical terms. It's a different skill set. Adding “writes well” to the criteria for promotion won’t get much traction among scientists. (The technical language that comes with any specialization adds to the problem, of course.)

But again, I think Kristof’s blade is much sharper when applied to university schools that claim a mission which includes practical application. Schools of Ed., I’m looking at you.

Teachers shouldn't need to learn neuroscience

6/4/2012

 
This article from Education Week suggests that teachers ought to learn neuroscience.

That strikes me as a colossal waste of teachers' time.

The offered justification is that a high percentage of teachers hold false beliefs about the brain, and thus ought to be "armed" to evaluate claims that they encounter in professional development sessions, the media, etc.

But it takes an awful lot of work for any individual to become knowledgeable enough about neuroscience to evaluate new ideas. And why would it stop at neuroscience? One could make the same case for cognitive psychology, developmental psychology, social psychology, sociology, cultural studies, and economics, among other fields.

Further, this suggestion seems like unnecessary duplication of effort. What's really needed is for a few trusted educators to evaluate new ideas, and to periodically bring their colleagues up to date.

In fact, that's how the system is set up. But it's not working.

First, the neuro-myths mentioned in the article ought to be defused during teacher training. Some programs do so, I'm sure, but most appear not to be doing a good enough job. It's certainly true that textbooks aimed at teachers don't do enough in this regard. Learning styles, for example, go unmentioned, or perhaps get a paragraph in which the theory is (accurately) said to be lacking evidence. Given the pervasiveness of these myths, schools of education ought to address the problem with more vigor.

Second, there is virtually always someone in the district central office who is meant to be the resource person for professional development: is this PD session likely to be legit, or is this person selling snake oil?  If teachers are exposed to PD with sham science, the right response, it seems to me, is not to suggest that teachers learn some neuroscience. The right response is outrage directed at the person who brought the knucklehead in there to do the PD session.

Third, it would make perfect sense if professional groups helped out in this regard. The Department of Education has tried with the What Works Clearinghouse and with its various practice guides. These have had limited success. It might be time for teachers to take a try at this themselves.

Teachers don't need to learn neuroscience, or better put, teachers shouldn't need to learn neuroscience--not to be protected from charlatans. Teachers need to learn things that will directly help their practice. Charlatan protection ought to come from institutions: from schools of education, from district central offices, and (potentially) from institutions of teachers' own creation.

Here's why education research looks silly, Rick Hess

3/28/2012

 
Every year as the AERA convention approaches, Rick Hess writes a column poking fun at some silly-sounding titles in the program. Hess's point seems to be "Is any of this really going to help kids learn better?" (That's my summary, not his.)

I respect Hess, but I think he misses the more interesting point here. Hess's real beef, I suggest, is not with the AERA, but with schools of education, and with all education researchers.

Putting researchers from very different disciplines--history, critical theory, economics, psychology, etc.--in one school because they all study "education" sounds like a good idea. The problem is that it doesn't lead to a beautiful flowering of interdisciplinary research. Researchers ignore one another.

Why? Because these researchers start with different assumptions. They set different goals for education. They have different standards of evidence. They even have different senses of what it means to "know" something. So mostly they don't conduct interdisciplinary research. Mostly they ignore one another.

No, the Foucault crowd is not going to improve science education in the next ten years. The wheel of the humanities turns much more slowly and less visibly than the cycle of the sciences. I admit I only dimly understand what they are up to, but I nevertheless believe they have a contribution to make.

But the fault lies not just with schools of education for sticking these varied researchers in one building.

Perhaps a more significant problem is that there is little sense among education researchers that their particular training leads to expertise well suited to addressing certain problems and ill-suited to others. I think education researchers would be smart to stake out their territory: "We have methods that will help solve these problems."

Too often we forget our limitations. (I named this blog "Science and Education" to remind myself that, although I'll be tempted, I should not start mouthing off about policy, but should leave that to people like Rick who understand it much more deeply.) When the charter school affiliated with Stanford was in trouble a year or two ago, how many education researchers lacked an opinion? And how many of those opinions were really well informed?

Education research would look less silly if all of us made clear what we were up to, and stuck to it.
