Daniel Willingham--Science & Education
Hypothesis non fingo

Why job interviews don't work

10/21/2013

 
My colleague, Tim Wilson, has long advocated that the psychology department at the University of Virginia stop interviewing potential graduate students or job applicants.
We conduct unstructured interviews, as most departments do, meaning the candidate meets with an individual for twenty or thirty minutes and chats.

You do end up feeling as though you have a richer impression of the person than you would glean from the stark facts on a resume. But there's no evidence that interviews prompt better decisions (e.g., Huffcutt & Arthur, 1994).

A new study (Dana, Dawes, & Peterson, 2013) gives us some understanding of why.

The information on a resume is limited but mostly valuable: it reliably predicts future job performance. The information in an interview is abundant--too abundant, actually. Some of it will have to be ignored. So the question is whether people can ignore the irrelevant information and pick out the useful. The hypothesis that they can't is called dilution: the useful information is diluted by noise.
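The dilution idea is at bottom statistical: averaging a valid predictor together with uninformative noise produces worse predictions than the valid predictor alone. A toy simulation makes the point; the model and numbers here are illustrative assumptions, not from the paper:

```python
import random
import statistics

random.seed(0)

# Toy model: next-semester GPA is prior GPA plus a little random variation.
prior_gpas = [round(random.uniform(2.0, 4.0), 2) for _ in range(1000)]
next_gpas = [min(4.0, max(0.0, g + random.gauss(0, 0.3))) for g in prior_gpas]

# Predictor A: prior GPA alone.
err_prior = statistics.mean(abs(p - n) for p, n in zip(prior_gpas, next_gpas))

# Predictor B: prior GPA "diluted" by averaging it with an uninformative
# interview impression (a random number in the same range).
impressions = [random.uniform(2.0, 4.0) for _ in prior_gpas]
diluted = [(p + i) / 2 for p, i in zip(prior_gpas, impressions)]
err_diluted = statistics.mean(abs(d - n) for d, n in zip(diluted, next_gpas))

print(err_prior, err_diluted)  # the diluted predictor has a larger average error
```

Even though the diluted predictor still contains all the valid information, mixing in noise degrades its accuracy--which is the pattern the dilution hypothesis predicts for interviewers.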

Dana and colleagues also examined a second possible mechanism. Given people's general propensity for sense-making, they thought that interviewers might have a tendency to try to weave all information into a coherent story, rather than to discard what was quirky or incoherent.

Three experiments supported both hypothesized mechanisms.

The general method was this. Seventy-six students at Carnegie Mellon University served as interviewers. They were shown the academic record of a fellow student whom they would then interview. (The same five students served as interviewees throughout the experiment.)

The interviewers were to try to gain information through the interview to help them predict the grade point average of the interviewee in the next semester. The actual GPA was available, so the dependent measure in the experiment was the accuracy of interviewers' predictions.

The interviewers were constrained to asking yes-or-no questions. The interviewee either answered accurately or randomly. (There was an algorithm to produce random "yeses" or "nos" on the fly.) Would interviewers do a better job with valid information than random information?

It's possible that limiting the interview to yes-or-no questions made the interview artificial, so a third condition without that constraint was added for comparison. This was called the natural condition.

The results? There was evidence for both dilution and for sense-making.

Dilution, because interviewers were worse at predicting GPA than they would have been if they had used prior GPA alone. So the added information from the interview diluted the useful statistical information.

Sense-making, because ratings made after the interview showed that interviewers generally agreed with the statement "From the interview, I got information that was valuable in making a GPA prediction."

There were no differences among the accurate, random, and natural conditions on these measures.

It's possible that the effect is due, at least in part, to the fact that interviewers themselves pose the questions. That makes them feel that answers confirm their theories about the interviewee.

So in a second experiment, researchers had subjects watch a video of one of the interviews conducted for the first experiment and use that as the basis of their GPA prediction. All of the results replicated.

Keep in mind that what's new in this experiment is not the finding that unstructured interviews are not valid. That has long been known. What's new is some evidence as to the mechanisms: dilution and sense-making.

And sense-making in particular gives us insight into why my colleagues in the psychology department have never taken Tim Wilson's suggestion seriously.

Reference:
Dana, J., Dawes, R., & Peterson, N. (2013). Belief in the unstructured interview: The persistence of an illusion. Judgment and Decision Making, 8, 512-520.

Huffcutt, A. I., & Arthur, W., Jr. (1994). Hunter and Hunter (1984) revisited: Interview validity for entry-level jobs. Journal of Applied Psychology, 79, 184-190.

Hal
10/21/2013 04:03:18 am

One thing this discussion overlooks is that many people do job interviews not to predict how successful the candidate would be at the job if hired/admitted, but because they quite rationally want to predict how much they (the interviewer) would like hanging around with and interacting with the candidate over the years. I suspect that interviews are decent predictors of that outcome variable, but whether or not they are, this whole literature does not speak to this question at all.

dan willingham
10/21/2013 10:25:10 pm

well. . . the extent to which those sorts of initial impressions are valid is contested, see e.g.
Gray, H. M. (2008). To what extent, and under what conditions, are first impressions valid. First impressions, 106-128.
I have to say that's always what *I* thought interviews might be good for, especially if someone is off-the-scale weird.

Turadg Aleahmad
10/21/2013 04:07:13 am

Doesn't seem fair to use as a baseline an actual objective measure of what's being predicted. Job interviews would be much less important if the candidate were to be doing the same thing as they did in their previous jobs. Do the authors believe that one's prior job performances are as observable and predictive as an average of past course grades?

Rotokan Imanguadela
10/21/2013 05:51:29 am

So, on the basis of some interviews with undergrads, we toss away the benefits of interviewing? I'd love to see some of the personalities in your department, should you adopt this model (granted, I know nothing of the current personalities).

dan willingham
10/21/2013 10:30:25 pm

Rotokan--as mentioned briefly above, the unreliability of interviews is not just from studies with undergrads. The main benefit of interviews seems to be that they make people feel better by virtue of having happened.

Sherwood Botsford
10/21/2013 06:04:46 am

I question the opening statement, "The information on a resume is limited but mostly valuable: it reliably predicts future job performance."

Possibly correct if one is filling a standardized job from a pool of experienced practitioners, but for filling entry level jobs, or jobs where the general job requirements are not heavy on office and keyboard skills, I would disagree.

Example: Other than telling me the frequency of job changes, a resume for a derrick hand, a carpenter, or a farm labourer isn't going to tell me much. For those, I'm going to phone his references. Does he take drugs? Does he get along? Does he have a good work ethic? Can he tell a good joke?

I hire high school kids to help on my farm. I no longer interview them. I bring them to the farm in batches of 3-5, give them a tour explaining what sorts of things we do. They are invited back the next afternoon after school to work. I pay them for that time. Some don't show. The remainder work with me on some task for two hours. At the end of that time I tell them which ones can come back.

I can see how resume-writing skill and GPA would correlate well. But I don't find that GPA correlates with work ethic, willingness to get dirty, or resourcefulness when confronted with 500 feet of baling string wrapped around the rototiller.

Dan Willingham
10/21/2013 10:31:53 pm

Sherwood: no argument. That claim is too broad. More appropriate would be "a resume has at least some chance of containing reliable predictors."

