Crowdsourcing Wage Pledge

Project Lead

Jamie Woodcock – Open University

Contact: jamie.woodcock@open.ac.uk 

Supporting Partner(s) 

Six Silberman – IG Metall
Hannah Johnston – Independent

Challenge

Crowdworking platforms such as Amazon Mechanical Turk often pay workers around two dollars (USD) per hour, and most requesters pay less than five dollars per hour. Despite these low wages, a 2018 survey conducted by the research team found that many academic requesters already set wage targets and would be willing to commit to them publicly.

This project developed a ‘Crowdsourcing Wage Pledge’ to improve wages for crowdworkers and thus help ensure fairer futures for digital workers in the platform economy.

Working with crowdworkers and stakeholders

More than 100 people responded to a second-round survey designed to better understand how the pledge could be instituted internationally, to identify respondents’ needs and concerns, and to help draft the wage pledge. Respondents were mainly from the US, but the team also recruited respondents from the UK, Japan, Mexico, Canada, and the Middle East.

From these respondents, the team also recruited individuals (all academics involved in crowdsourcing) to take part in consultative meetings and workshops, where draft versions of the website, the pledge tool, and the conflict resolution mechanism were presented.

Outcomes

The project published the Crowdsourcing Wage Pledge website and tool, where people can sign up to take the pledge. The team has also received correspondence from academics involved in the consultation – both workshop participants and survey respondents – saying they intend to use the wage pledge in their future projects.

Insights

The project demonstrated that academic requesters are overwhelmingly willing to commit to paying workers the minimum wage, but that the institutional guidance available to researchers (whether from universities, academic journals, or ethics boards), where it exists at all, has been largely insufficient on the questions of how much to pay workers and how researchers can make their wage commitments publicly known. This confirms the need for a mechanism like the ‘Crowdsourcing Wage Pledge’.

On wage levels, the survey revealed that academic requesters are willing to commit to minimum wages corresponding to the regulations where their academic institutions are located: over 70 percent of respondents agreed or strongly agreed with this statement, while fewer than 10 percent disagreed or strongly disagreed.

The team therefore developed guidance on what it believes requesters should pay crowdworkers and posted this in the FAQ section of the site, along with the reasoning behind the recommended wage level.

They recommend a target wage of $16.54 per hour, the living wage in the United States as calculated in March 2020. The US living wage was chosen because many Mechanical Turk workers are based in the US, and many research requesters restrict their tasks to US-based workers.
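As a rough illustration of how a requester might apply this target in practice – this sketch is not part of the project’s published guidance, and the function name and round-up behaviour are assumptions – converting the hourly rate into a per-task reward from a piloted completion time could look like this:

```python
import math

# Recommended target from the pledge: the US living wage as of March 2020.
TARGET_HOURLY_WAGE = 16.54  # USD per hour

def task_reward(estimated_minutes: float) -> float:
    """Hypothetical helper: per-task payment (USD) for a task whose piloted
    completion time is `estimated_minutes`, rounded up to the nearest cent
    so that estimation error never pushes effective pay below the target."""
    raw = TARGET_HOURLY_WAGE * estimated_minutes / 60
    return math.ceil(raw * 100) / 100

# A task piloted at roughly 5 minutes should pay at least $1.38.
print(task_reward(5))  # 1.38
```

Rounding up, rather than to the nearest cent, is one simple way to keep the effective hourly rate at or above the target even when completion-time estimates are slightly optimistic.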

The wage guidance has already been used by academic requesters in the UK as a reference in discussions with their university boards about establishing policies on the ethical use of crowdsourcing.

Through the workshops, the team found that multiple peer reviewers at academic journals are unwilling to accept submissions where crowdworkers are poorly paid. This suggests the pledge may offer academics a way to signal that they pay fair rates, a point also highlighted on the FAQ page.

Future Directions

The team now plans to focus on raising awareness of the Crowdsourcing Wage Pledge and recruiting signatories.