“Platform work changes the very nature of the employment relationship”
An interview with Veena Dubal (University of California)
“A few weeks ago, a San Francisco police report described an incident involving an e-bike and a Zoox, one of Amazon’s self-driving robo-taxis. Let’s call the rider Raja—a name that represents the hundreds of workers I’ve spoken to over the past fifteen years while studying on-demand labor. According to witnesses, Raja got up, gathered his belongings, and fled. He refused help and didn’t wait for the police.
The once-bustling commercial area along Market Street in San Francisco is now largely empty but serves as a gathering place for hundreds of workers like Raja waiting for work—mostly undocumented immigrants from Nepal. Instead of waiting for a contractor to pick them up, they obsessively check their smartphones, hoping a machine-learning system will find them. They aim to earn $10–12 an hour, about half San Francisco’s minimum wage. Many rent their bikes, taking on financial risk in the event of an accident.
The robo-taxi had passed by Raja and his colleagues several times a day for months. They viewed it as a technological marvel—and as a machine meant to replace them. But it is also a surveillance device, recording the movements of anyone nearby. With its sensors active, the robo-taxi almost certainly recorded Raja—who, perhaps from a convalescent bed, wondered whether the police department would use the footage to track him down and report him to immigration authorities. At the same time, across town, the customer who had ordered the food spilled during the incident got tired of waiting. They logged back into the app, filed a complaint via chatbot, and placed a new order. The complaint triggered an automated decision-making process, which may have resulted in a temporary or permanent suspension of Raja’s ability to work as a rider”.
Veena Dubal teaches at the University of California and studies the impact of technology on workers’ rights. California and San Francisco are good places to do so, and for more than a decade, Dubal has conducted hundreds of interviews. She was a guest at a conference on platform labor rules and organization, held by the Political and Social Sciences Class of the Scuola Normale Superiore. We spoke with her about what she calls “algorithmic wage discrimination.”
In your work you argue that the way platform work is regulated and compensated changes the very nature of the employment relationship.
“Algorithmic wages were first introduced in what we call the gig economy—companies that pretend their workers are independent contractors. Initially, in food delivery, workers earned a fixed percentage of the delivery cost—a very low but predictable payment. That percentage was the same for everyone. Suddenly, things changed, and people started receiving different pay for the same job. Two workers would be offered the same task with different compensation. This mirrors the price discrimination long applied to consumers—airfare that varies based on your location, your shopping habits, or whatever else the algorithms use to estimate your willingness to pay. The same logic is now used to set wages in the platform economy.
The emergence of hourly wages during the Industrial Revolution transformed how people understood time, but it also became the basis for the eight-hour workday, overtime pay, and the idea of a fair wage—a connection between time and pay that workers could not only predict but negotiate. Algorithmic wage discrimination breaks that link between time and compensation. You don’t know how long you’ll have to work to earn a certain amount. It also undermines the principle of equal pay for equal work—the idea that you and I, with the same skills doing the same task for the same duration, should earn about the same. This isn’t just about driving wages down and increasing precarity—it’s an anti-union strategy. Differentiated pay is a way to divide workers”.
How do workers experience this precarity? Are they aware of their situation?
“I studied a group called Rideshare Drivers United, an unofficial union with over 20,000 members. I’d often hear: ‘I just want to go back to the system where we got a percentage per ride,’ and I used to wonder why they’d want to return to such minimal pay. Over time, I realized it was about certainty. Today, they receive a price and must decide whether to accept it—without knowing how that price is determined. Their feelings weren’t just frustration or anger—they often described the job as a gamble. A good day encourages you to keep playing. A bad one makes you want to play to make up for losses. It’s more than a metaphor—platform companies use the same psychological insights as online gambling firms. There’s uncertainty, excitement—just like gambling”.
Let’s talk about rewards, incentives, and similar systems—these too seem deliberately opaque.
“One new practice involves paying workers through a debit card linked to the app. Sometimes they’ll say, ‘Looks like you’re struggling. Need a loan? You can repay it from your wages.’ But wages are algorithmically set, and while repaying the loan, the pay per delivery might drop—so it takes more trips to repay it. Or there’s an incentive to go to an area with fewer workers, only to discover there’s little work there too, so even with higher rates, you earn less overall. If you were an employee, you’d have a stable wage, and they’d simply say, ‘Go cover that neighborhood.’ Then there are bonuses, used to retain workers since having a large pool available is crucial for efficiency. A bonus might say: ‘Do 50 deliveries and get an extra $100.’ But as you approach 50, the flow of orders slows down—so you stay online longer instead of quitting early after hitting the target”.
Are these practices spreading beyond the gig economy?
“Peter Thiel, one of Silicon Valley’s billionaires and a Trump supporter, once said technology was a way to change the world without consent or politics. And while tech has transformed our world, it hasn’t eliminated politics—it’s pushing it in an authoritarian direction. AI is being used to scan surveillance videos to identify protesters’ locations or undocumented workers—or to find out which foreign students have said or written the ‘wrong’ things. In the workplace, technologies refined to hire, fire, evaluate, manipulate, and pay gig workers have now spread and been embedded in AI software systems purchased by employers across industries. These systems are already being used to hire, evaluate, and fire healthcare workers, logistics staff, IT professionals, and factory workers. It’s hard to imagine a more dystopian system”.
(picture: Cjp24, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons)