A penny for your bots
MTurk, Amazon’s crowdsourcing marketplace, relies on an army of digital workers to perform ‘micro tasks’ for companies and academic institutions. But a former worker says the pay can be so low it’s exploitative
From the privacy of her own home, Kristy Milland has taught smart cars to dodge children, trained drones to fire at humans and tagged graphic videos made by terrorist groups — all through tech giant Amazon’s “crowdsourcing marketplace.”
At its best, Amazon Mechanical Turk is a low-barrier source of income that lets you earn from your living room. At its worst, it’s a penny-paying platform fuelled by a workforce with little protection against poor compensation, wage theft or trauma on the job.
Since Milland began working through MTurk 15 years ago, the platform’s pool of workers has grown to an estimated 100,000 individuals. It’s an increasingly important tool for everyone from tech giants to academic institutions — including Canadian universities.
“It’s good and it’s bad. It’s good because, OK, here’s an opportunity for (workers) to work in a way that is not illegal, that is not impossible, that is not physically demanding,” Milland says.
“Yet at the same time that allows requesters to say, well, then we can exploit them.”
Like its growing field of competitors, MTurk connects employers to a disparate pool of digital workers, nicknamed Turkers, who are paid to perform online “micro tasks.” The gigs, called Human Intelligence Tasks (HITs), include everything from tagging photos to completing surveys.
Requesters set the pay rate they see fit; most tasks pay under a dime, while a minority can earn a worker a few dollars. Some take seconds to complete, others more than an hour. The common thread: humans are still better at performing these tasks than machines. The platform derives its name from an 18th-century chess-playing “machine” that was later revealed to be a hoax; the device was in fact being operated by a human.
For academic researchers, MTurk provides easy access to “participant pools” that allow them to conduct surveys. Ethically, researchers can’t coerce participants into joining a research study; compensation of some form is common, but guidelines on what constitutes fair, ethical payment vary. (For example, a general study of health research practices conducted last year by the Toronto-based Wellesley Institute found that the median hourly amount provided to participants was $25.)
Milland, who is now a law student at the University of Toronto, co-authored a report in 2017 with the University of Oxford, Singapore Management University and Carnegie Mellon University that analyzed millions of tasks and found that Turkers earned a median hourly wage of about $2.35.
Amazon did not respond to the Star’s request for comment.
Like most in the gig economy, Turkers are independent contractors and therefore excluded from employment protections — including minimum-wage standards.
Milland’s own experience with the platform is mixed. It became a significant source of income for her family around 2010 when her husband lost his job. At that time, it was fairly lucrative; using scripts — tools that help perform MTurk work faster — Milland says she was able to pull in around $66,000 a year.
Crowdwork can also be an opportunity for workers who face discrimination in the job market or have physical restrictions. But Milland says the platform has changed significantly in recent years. For one, programmers began creating software that scoops up the more lucrative requests, leaving only low-paid tasks for most workers.
Even as a highly experienced worker, Milland says she would now be lucky to make $20 to $30 a day.
Some of the work, she adds, can be traumatizing. She has tagged videos made by the fundamentalist Islamic State, and reviewed content that showed animals being tortured.
“It was still really bad and actually I had to stop very quickly because I couldn’t handle doing it,” she says. “For other people who don’t have that ability to get out of it, they may just have to do that until they break down and can’t do it anymore. That’s horrifying.”
Much of the work on MTurk, Milland says, would in the past have been performed by contractors hired by tech companies. MTurk itself notes that it hosts gigs that have traditionally “been accomplished by hiring a large temporary workforce, which is time-consuming, expensive and difficult to scale.”
Some of the HITs posted by universities even replace tasks traditionally done by research assistants, according to Milland, such as updating citations.
“There are an abundance of people who are well-educated on the platform and able to do these tasks, and academics do leverage that,” she said.
Currently, Canadian workers represent about two per cent of MTurk’s overall workforce, said New York University data science professor Panos Ipeirotis.
The Star asked the University of Toronto, Ryerson University and York University whether they tracked their academics’ usage of MTurk or have any guidelines around the use of the platform. All three institutions said researchers must abide by their employers’ research ethics standards, but none had developed MTurk-specific policies or tracked its use.
According to one site that compiles reviews and self-reported data from MTurk workers, the average hourly wage paid by one researcher who identified themselves as a University of Toronto affiliate was $9.14 — $5 below Ontario’s minimum wage. Another set of requests by an account called “University of Toronto Soc. Studies” paid almost $18 an hour. One account called “Active Vision Lab York University Canada” paid around $10 an hour.
In response to questions from the Star, a spokesperson for the University of Toronto said all research using human subjects sanctioned by the university is reviewed by the Research Ethics Board (REB). “The REB ensures any compensation is appropriate for the time involvement,” the spokesperson said.
Janice Walls, York University’s acting director of media relations, told the Star a “fundamental principle related to participation in research is that it is voluntary.”
“Many researchers choose to compensate participants for their time and effort, however, participation in a research study is not meant to be employment and as such compensation is not a rate of pay,” she said.
While ethics standards forbid research subjects from being coerced into participating, Milland says academic institutions are ignoring a basic reality about using MTurk.
“This is a workplace,” she says. “(Workers) are not doing it for fun.”
The Star signed up to work on MTurk using an existing Amazon account. The terms of service with the platform require workers to use their “human intelligence and independent judgment to perform tasks in a competent and workmanlike manner.” They also forbid workers from using “robots, scripts, or other automated methods” as a substitute for independent judgment.
Over the course of an hour’s work on MTurk, the Star filled out a survey about shopping habits; examined various car dealerships’ websites and extracted contact information; and analyzed academic charts to write summaries of them. This work added up to a value of $2.02, but to date, only seven of 14 submissions have been approved by their requesters. One was rejected and did not receive payment. So far, the Star has earned 61 cents.
That’s not indicative of how much professional Turkers are earning: No scripts were used to speed up tasks and, as a newcomer to the site, it took time to find higher-paying HITs. The best-paying available post paid $28 to transcribe 58 minutes of audio, but it required qualifications — meaning not every Turker is eligible to complete the work.
Working through MTurk involves a significant amount of unpaid labour looking for HITs, Milland’s research shows. That, she says, “bottoms out” potential earnings. And, her study found, another source of lost income stems from MTurk requesters rejecting work that’s already been completed.
“Once a worker submits completed work, the employer can choose whether to pay for it. This discretion allows employers to reject work that does not meet their needs, but also enables wage theft,” says a 2015 study by University of California, San Diego professor Lilly Irani.
Rejections also affect Turkers’ approval ratings and ability to access work.
Six Silberman, an engineer and programmer who now works for the German union IG Metall, helped set up a website called Turkopticon that “helps the people in the ‘crowd’ of crowdsourcing watch out for each other.”
Silberman says that, with the possible exception of California — where laws around classifying workers as independent contractors are stricter — there is little argument that the contractor designation applies to Turkers. But, he adds, crowdsourcing platforms and the companies that use them could do much more to ensure fairer pay rates and provide better mechanisms to resolve disputes.
“The requester is in a position to resolve some of these problems. And there are some problems that only the platform operators are in a position to solve.”
Some institutions, like the University of Waterloo and the University of California, Berkeley, have developed guidelines for researchers using MTurk. Waterloo’s guidelines note that “in the spirit of fairness,” researchers should provide compensation comparable to that offered for tasks of “similar length and difficulty.”
Meanwhile, other responses are emerging. U.S.-based academic survey consulting company MTurk Data, for example, helps universities make requests on MTurk’s platform.
Part of the guarantee: the group will pay participants an average wage of $16 an hour.