The Very Human Labor That Powers Artificial Intelligence

In 2015, Caroline Sinders was working as a design researcher at IBM when she began to have questions about how Watson, the company's artificial intelligence system, was being developed. AI systems like Watson must be trained with data sets: for example, a system is given a large batch of confirmed photographs of stop signs from different angles, in different lighting, and of different quality so that it can learn to recognize stop signs on its own. Sinders was curious about these data sets: The process of correctly categorizing millions of data points seemed like a herculean task in its own right; where, exactly, was all this data coming from?

"A lot of my coworkers were like, 'I don't know why you're asking us these questions, we're just supposed to build this system out,'" she recalls.

While Sinders's coworkers may have been able to push the question aside, finding out where the data sets necessary to train artificial intelligence systems come from eventually led her to the world of crowd-working platforms. Amazon's Mechanical Turk and other platforms such as Fiverr, Clickworker, Microworker, and Crowdcloud allow employers to offer workers repetitive, task-based assignments at flat rates. Because all the work is digital, workers from around the world perform tasks on these platforms, and given that crowd-workers are considered independent contractors, minimum wage laws don't apply.

Artificial intelligence systems, Sinders discovered, still depend on very human grunt work for their raw material. To illustrate the human cost of artificial intelligence, the millions of hours of work that have gone, and continue to go, into making the most profitable AI possible, Sinders created the Technically Responsible Knowledge Wage Calculator.

As its name suggests, the TRK Wage Calculator allows users to calculate earnings for crowd-working assignments. The calculator features three sliders: number of images to label (labeling data is a common task in artificial intelligence development), price per task, and time per task. The results are bleak. According to the calculator, an average assignment, labeling 3,200 images for 32 cents each at a speed of two minutes per image, would yield a worker less than $63 per day, which, the calculator points out, works out to less than minimum wage in Washington, the state in which Amazon is headquartered.
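The calculator's exact formula isn't published in the article, but the underlying arithmetic is straightforward. The Python sketch below is a minimal approximation under assumed conditions: a straight eight-hour workday with no unpaid time between tasks. The function name and the workday default are hypothetical, not taken from Sinders's tool.

```python
# A minimal sketch of the arithmetic a crowd-work wage calculator performs.
# Assumes an eight-hour workday and no unpaid time between tasks; the real
# TRK Wage Calculator's formula is not specified in the article.

def daily_earnings(num_images: int, price_per_task: float, minutes_per_task: float,
                   workday_minutes: float = 8 * 60) -> dict:
    """Estimate pay for a flat-rate labeling assignment, per hour and per day."""
    total_minutes = num_images * minutes_per_task
    total_pay = num_images * price_per_task
    return {
        "days_to_finish": total_minutes / workday_minutes,
        "hourly_rate": total_pay / (total_minutes / 60),
        "pay_per_day": (workday_minutes / minutes_per_task) * price_per_task,
    }

# The scenario from the article: 3,200 images at $0.32 each, two minutes per image.
result = daily_earnings(3200, 0.32, 2)
print(f"Hourly rate: ${result['hourly_rate']:.2f}")             # $9.60 per hour
print(f"Pay per eight-hour day: ${result['pay_per_day']:.2f}")  # $76.80
print(f"Days to finish: {result['days_to_finish']:.1f}")        # 13.3
```

Even this generous estimate, which ignores fatigue and breaks entirely, comes in well below Washington's minimum wage of $13.50 an hour in 2020 (about $108 for an eight-hour day); the calculator's lower daily figure suggests it also folds in unpaid overhead the article doesn't spell out.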

The TRK Wage Calculator does not capture the full complexity of crowd-working, nor is it intended to. While most workers on Mechanical Turk reside in the United States (there are more than a million workers worldwide), it goes without saying that they aren't all found in Washington. Sinders selected the location because it is both the state with the highest minimum wage in the United States and home to Amazon. These details make the TRK Wage Calculator a mathematical provocation rather than a genuine tool: an artful way of calling out Amazon for allowing the ruthless exploitation of workers, which it could easily prevent by setting parameters for pricing. (Amazon did not respond to multiple requests for comment for this article.)

Sinders created the TRK Wage Calculator as part of her residency with the Mozilla Foundation. The foundation is responsible for the development of the Firefox web browser and, more generally, is dedicated to keeping the Internet open and accessible. Sinders's prompt during her residency was to explore artificial intelligence, which she decided to pursue because of lingering questions from her time at IBM and her own ongoing project, Feminist Data Set, examining how the development of artificial intelligence can integrate feminist values like fair compensation.


"Using intersectional feminism as a way to interrogate machine learning, that's an interrogation of how you label and frame data," she explains. "So then you have to think about labor, right?"

For further perspective on the development of artificial intelligence, Sinders joined Mechanical Turk as a worker and interviewed other crowd-workers in the United States and India. (After the United States, India is home to the second-largest share of workers on Mechanical Turk.) The experience and interviews revealed how even an apparently fruitful crowd-working assignment can become a morass of additional tasks.

"What I realized was that time isn't really listed a lot in the interfaces, but time is a major component of how you do work," Sinders explains.

With crowd-work priced per assignment or per task, the amount of time involved is obscured. A $50 assignment might involve six hours of work, or it might involve 16. The former is roughly one day of work at the federal minimum wage in the United States; the latter is two days of work well below the minimum wage (or one especially hellish day of overtime at the same rate). Even well-meaning employers on crowd-working platforms may underestimate the amount of time necessary to complete their assignments, especially if they lack experience completing the tasks themselves and fail to account for realities like fatigue, to say nothing of break time for meals or bathroom visits. Such genuine misestimation is compounded by outright lowballing, resulting in Mechanical Turk workers earning an estimated $2 per hour on average.
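As a concrete illustration of the comparison above, the short sketch below (a hypothetical helper, not part of any platform's tooling) computes the effective hourly rate of that $50 assignment under both time estimates.

```python
# A minimal sketch: the same flat fee yields very different effective hourly
# rates depending on how long the work actually takes.

FEDERAL_MINIMUM_WAGE = 7.25  # US federal minimum wage, dollars per hour

def effective_hourly_rate(flat_fee: float, hours_worked: float) -> float:
    """Pay per hour once the real time cost of a flat-rate assignment is known."""
    return flat_fee / hours_worked

for hours in (6, 16):
    rate = effective_hourly_rate(50.00, hours)
    comparison = "above" if rate >= FEDERAL_MINIMUM_WAGE else "below"
    print(f"$50 over {hours} hours -> ${rate:.2f}/hour ({comparison} the federal minimum wage)")
```

Six hours works out to about $8.33 an hour; 16 hours works out to about $3.13, far below the federal floor, and the worker has no way of knowing which case they are signing up for.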


"When you don't think about time, it doesn't account for how you can be paid radically differently," says Sinders. "If someone were like, 'Please label these 1,000 images and here's $50,' that may sound really good at first, except if you ask how long it takes."

Besides Mechanical Turk, one of the other crowd-working platforms that Sinders focused on when creating the TRK Wage Calculator was CrowdFlower, which was acquired by Appen in 2019. Appen now provides artificial-intelligence-related services to Amazon, Microsoft, Adobe, and other tech companies.

"Appen collects and labels images, text, speech, audio, video, and other data used as training data to build and continuously improve artificial intelligence systems," explains Brian Reavey, a director at the company. "Our platform allows clients to specify tasks needed for their specific project. For example, a company working on voice commands to improve a vehicle's infotainment system might request spoken words or phrases that are commonly used while driving a vehicle in a variety of languages and dialects."

"We have over 1 million crowd-workers around the world," says Reavey, "and our Crowd Code of Ethics is our commitment to pay fair wages to all members of the crowd."

While Reavey touts Appen's code of ethics, the commitment regarding fair pay is vague, stating only that it is the company's goal to pay above minimum wage. Appen says that it tries to comply with local minimum wage standards in the 130-plus countries where it operates by utilizing AI to predict how long specific tasks should take, which only brings us full circle back to the black box of AI development, and Sinders's first question of where IBM's data sets were coming from. Yet whether Appen is as rigorous in its determination of minimum wage as the TRK Wage Calculator is exactly the sort of question that Sinders hoped to elicit with her project.
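Appen does not describe the mechanics in any detail, but the logic it gestures at, predicting how long a task should take and pricing it against a local wage floor, reduces to a simple calculation. The sketch below is a hypothetical illustration, not Appen's implementation; the predicted-minutes input stands in for whatever model the company actually uses.

```python
# Hypothetical illustration of pricing a task against a local minimum wage,
# given a predicted completion time. Not Appen's actual method; the time
# prediction is assumed to come from some external model.

def minimum_fair_price(predicted_minutes: float, local_minimum_wage_per_hour: float) -> float:
    """Lowest per-task price that keeps pace with the local hourly wage floor."""
    return round(predicted_minutes / 60 * local_minimum_wage_per_hour, 2)

# A task predicted to take 3 minutes in a region with a $13.50/hour minimum
# wage would need to pay at least:
print(minimum_fair_price(3, 13.50))  # 0.68
```

The catch, as the paragraph above suggests, is that the output is only as fair as the time prediction behind it: underestimate the minutes and the "fair" price falls below the wage floor it was meant to guarantee.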

"It can be used by workers themselves as an advocacy tool," she explains, "but I was more interested in trying to have smaller start-ups, research labs, artists, and individuals who use things like Mechanical Turk, Fiverr, and CrowdFlower really understand that when they offer to pay for work, if they don't have an understanding of time and how much that work really costs, they are complicit in underpaying."
