Extra
May 20, 2015

Canvassers Study Has Been Retracted

Ira writes:

Last month we did a story about canvassers who’d invented a way to go door to door and, in a 22-minute conversation, change people’s minds on issues like same-sex marriage and abortion rights. We did the story because there was solid scientific data, published in the journal Science, proving that the canvassers were really having an effect. Yesterday one of the authors of that study, Donald Green, asked Science to retract it. Some of the data gathered by his co-author seems to have been faked.

Our original story was based on what was known at the time. Obviously the facts have changed. We’ll update today as we learn more.

The apparent fakery was discovered by researchers at UC Berkeley and Stanford who tried to replicate the findings in the original study. How they figured it out is a great story in itself.

The person who did the bulk of the work on the original study is a graduate student at UCLA named Michael LaCour. He needed a way to figure out people’s attitudes before canvassers came to their doors and for many months afterwards.

So he got them to sign up for an online survey. Or anyway, that’s what he said he did. In the study’s supplementary materials, he wrote that he paid them:

Respondents were paid $10 upon initial enrollment .... Individuals were offered $2 (per referral) to refer their friends and family to participate in the survey panel. In order to encourage participation in follow up surveys, respondents were paid $5 per follow-up survey.

It was a lot of people. 9,507. They’d answer fifty questions on a variety of issues. Only two concerned same-sex marriage. That way, they wouldn’t suspect that’s what the survey was about.
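Do the back-of-the-envelope math on those payments and the scale is striking: enrolling 9,507 people at $10 apiece comes to roughly $95,000, before a single referral fee or follow-up payment.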

Then LaCour told the canvassers at the Leadership LAB at the Los Angeles LGBT Center, okay, these people are in the survey, go to their houses and talk to them.

So. The UC Berkeley and Stanford researchers wanted to replicate what LaCour did. They started their research two weeks ago and of course, like LaCour, they started with an online survey.

But they noticed something funny. They couldn’t get as many people to respond to their online survey as LaCour had. Nowhere near as many. They were getting a fourth of what he’d gotten.

What were they doing wrong? They wanted to figure that out. This is from the report they wrote about their investigation:

Hoping we could harness the same procedures that produced the original study’s high claimed response rate, we attempt to contact the survey firm LaCour had privately claimed to have worked with. We ask to speak to the staffer at the firm with whom LaCour had claimed to have spoken in May 2013 during the planning of Study 1 in LaCour and Green (2014). The survey firm claimed they had never worked with LaCour and that they had never had an employee with the name of the staffer we believe LaCour had worked with. The firm also denied having the capabilities to perform many aspects of the recruitment procedures LaCour had indicated they had performed.

Then these researchers – their names are David Broockman and Joshua Kalla – went back to the data and noticed “irregularities.” Irregularities in this case means: the data was just too good. The changes in people’s attitudes, for instance, were distributed perfectly normally among the respondents, in a way that looked fishy. “Not one respondent out of thousands provided a response that meaningfully deviated from this distribution.”
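To make “too good” concrete: here’s a minimal sketch, in Python, of the kind of check that flags data with no messiness in it. The numbers are made up, and this is not Broockman and Kalla’s actual analysis – it just illustrates why a total absence of outliers is a red flag. Real survey data tends to contain careless and extreme responders, so a handful of people always deviate from the fitted bell curve; pure simulated noise doesn’t.

    # A toy illustration with made-up numbers, not Broockman and Kalla's analysis.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    N = 9507  # the number of respondents the study claimed

    # Hypothetical "fabricated" attitude changes: nothing but clean normal noise.
    fabricated = rng.normal(loc=0.5, scale=1.0, size=N)

    # Hypothetical "real" attitude changes: the same noise, plus a small share
    # of messy respondents giving extreme answers, as real surveys have.
    real = fabricated.copy()
    messy = rng.choice(N, size=150, replace=False)
    real[messy] = rng.uniform(-6.0, 6.0, size=150)

    for name, x in [("fabricated", fabricated), ("real-ish", real)]:
        # How well does a normal curve fitted to the data describe it, and how
        # many respondents meaningfully deviate from it (beyond 4 sd)?
        _, p = stats.kstest(x, "norm", args=(x.mean(), x.std()))
        deviants = int(np.sum(np.abs(x - x.mean()) > 4 * x.std()))
        print(f"{name}: normal-fit p-value = {p:.3f}, big deviations = {deviants}")

Run something like this and the all-noise column reports essentially zero big deviations out of 9,507 – which, across thousands of real people, is exactly the kind of tell Broockman and Kalla describe.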

But if LaCour hadn’t actually done surveys, where did the numbers come from? Broockman and Kalla found studies that seemed like they might be the source, and ran some numbers to show how LaCour might’ve arrived at his results.

On Sunday, they informed LaCour’s co-author Donald Green of what they found. LaCour is a grad student but Donald Green is kind of a big deal. A Columbia professor. Meticulous and respected. One professor told us, “I trust anything Don Green publishes.”

Green read what Broockman and Kalla found and agreed that unless LaCour could show them other evidence, a retraction was in order.

So LaCour was confronted with the allegations. Green says LaCour denied falsifying data, but couldn’t produce the original surveys. LaCour said he accidentally deleted them. Green says LaCour did admit that he never got grant money to pay survey respondents and never paid anyone.

Green told me today that if there was no survey data, what’s incredible is that LaCour produced all sorts of conclusions and evaluations of data that didn’t exist. For instance, he had “a finding comparing what people said at the door to canvassers to what they said on the survey,” according to Green. “This is the thing I want to convey somehow. There was an incredible mountain of fabrications with the most baroque and ornate ornamentation. There were stories, there were anecdotes. My Dropbox is filled with graphs and charts. You’d think no one would do this except to explore a very real data set.”

“All that effort that went into confecting the data, you could’ve gotten the data,” says Green.

I reached out to LaCour who emailed back: “I’m gathering evidence and relevant information so I can provide a single comprehensive response. I will do so at my earliest opportunity.”

Green says that as of yesterday LaCour still claimed the data was real.

Green says one thing that seemed promising about working with LaCour was that LaCour seemed to be lavishly funded; he thanked his funders in the study’s supplementary materials. What’s a shame, Green says, is that lying about this was unnecessary. “Frankly it’s difficult for a graduate student to raise the necessary funding for a study, but if we knew he didn’t have the funding for the study we [his faculty advisors] would’ve raised it ourselves.”

As for the canvassers at the Leadership LAB at the Los Angeles LGBT Center, they say they were blindsided by the news of Green’s retraction. “This is a complete and utter shock to us and we’re still trying to figure out which way is up,” said Steve Deline, one of the organizers of the canvassing. “We had no idea Mike was fabricating data.”

Deline told me how it worked: LaCour gave them lists of people he claimed to have signed up for the online survey. Then canvassers did their jobs and went to those houses. This took hundreds of hours. If LaCour was lying, what a waste.

“Maybe the thing to convey in your blog post,” Green told me, “would be something to the effect that, just because the data don’t exist to demonstrate the effectiveness of this method of changing minds, doesn’t mean the hypothesis is false. And now the real work begins.”

He thinks the canvassers should go out and do the study again, for real this time.