This post originally appeared on the TechTank blog.
In the course of my graduate work at Harvard University, I paid hundreds of Americans living in poverty the equivalent of about $2 an hour. It was perfectly legal for me to do so, and my research had the approval of my university’s ethics board. I was not alone, or even unusual, in basing Ivy League research on less-than-Walmart wages; literally thousands of academic research projects pay the same substandard rates. Social scientists cannot pretend that the system is anything but exploitative. It is time for meaningful reform of crowdsourced research.
This is what crowdsourced research looks like. I posted a survey using Mechanical Turk (MTurk), a website run by Amazon.com. Across the country, hundreds of MTurk workers (“turkers”) agreed to fill out the survey in exchange for about 20 cents apiece, and within a few days I had my survey results. The process was easy, and above all, cheap. No wonder it is increasingly popular with academics; a search on Google Scholar returns thousands of academic papers citing MTurk, increasing from 173 in 2008 to 5,490 in 2014.
Mechanical Turk is a bargain for researchers, but not for workers. A survey typically takes a couple of minutes per person, so a payment of about 20 cents works out to a very low hourly rate. This might be acceptable if all turkers were people with other jobs, for whom the payment was incidental. But scholars have known for years that the vast majority of MTurk tasks are completed by a small set of workers who spend long hours on the website, and that many of those workers are very poor. Here are the sobering facts:
- About 80 percent of tasks on MTurk are completed by about 20 percent of participants, who spend more than 15 hours a week working on the site. MTurk works not because it has many hobbyists, but because it has dedicated people who treat the tasks like a job.
- About one in five turkers earns less than $20,000 a year.
- A third of U.S. turkers call MTurk an important source of income, and more than one in ten say they use MTurk money to make basic ends meet.
[Figure: Journal articles that refer to Mechanical Turk. Source: PS: Political Science and Politics]
It is easy to forget that these statistics represent real people, so let me introduce you to one of them. “Marjorie” is a 53-year-old woman from Indiana who had jobs in a grocery store and as a substitute teacher before a bad fall left her unable to work. Now, she says, “I sit there for probably eight hours a day answering surveys. I’ve done over 8,000 surveys.” For these full days of work, Marjorie estimates that she makes “$100 per month” from MTurk, which supplements the $189 she receives in food stamps. Asked about her economic situation, Marjorie simply says that she is “poverty stricken.”
I heard similar stories from other MTurk workers: very poor people, often elderly or disabled, working tremendous hours online just to keep themselves and their families afloat. I spoke to a woman who never got back on her feet after losing her home in Hurricane Rita, and another who had barely escaped foreclosure. A mother of two was working multiple jobs, plus her time on MTurk, to keep her family off government assistance. Job options are few for many turkers, especially those who are disabled, and MTurk provides resources they might not otherwise have. But these workers, who work anonymously from home, are isolated and have few avenues to organize for higher wages or other employment protections.
Once I realized how poorly paid my respondents were, I went back and gave every one of my over 1,400 participants a “bonus” to raise their pay to the equivalent of a $10 hourly wage. (I paid an additional $15 to respondents who participated in an interview.) This cost me more money, but less than you might imagine. For a 3-minute survey of 800 people, going from a 20-cent to a 50-cent payment costs an additional $240. But if every researcher paid an ethical wage, it would really add up for people like Marjorie. In fact, it would likely double her monthly income from MTurk.
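The bonus arithmetic above is simple enough to check with a few lines of code. This is just an illustrative sketch: the function name is my own, and it works in integer cents to avoid floating-point rounding; the figures are the ones from the text (a 3-minute survey, 800 respondents, 20 cents raised to a $10 hourly equivalent).

```python
def bonus_cost_cents(n_respondents, minutes_per_survey,
                     base_pay_cents, target_hourly_cents):
    """Extra cost (in cents) of topping up each respondent to a target hourly wage."""
    # Fair pay per survey at the target hourly rate, prorated by survey length.
    fair_pay = target_hourly_cents * minutes_per_survey // 60
    top_up = max(fair_pay - base_pay_cents, 0)  # bonus owed per respondent
    return n_respondents * top_up

# 3-minute survey, 800 people, raising a 20-cent payment to a $10/hour rate:
extra = bonus_cost_cents(800, 3, 20, 1000)
print(f"${extra / 100:.2f}")  # $240.00
```

The same calculation makes it easy to see how modest the cost of an ethical wage is relative to a typical research budget, even for larger samples.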
Raising wages is a start, but it should not be up to individual researchers to impose workplace standards. In this month’s PS: Political Science and Politics, a peer-reviewed journal published for the American Political Science Association, I have called for new standards for crowdsourced research to be implemented not only by individual researchers, but also by universities, journals, and grantmakers. For instance, journal editors should commit to publishing only those articles that pay respondents an ethical rate, and university ethics boards should create guidelines for use of crowdsourcing that consider wages and also crowdsourcers’ lack of access to basic employment protections.
The alternative is continuing to pay below-minimum-wage rates to a substantial number of poor people who rely on this income for their basic needs. This is simply no alternative at all.