In India’s competitive marketplace, a plan to protect gig workers
In the past few weeks, anonymous Twitter accounts such as Swiggy DE and Deliverybhoy have made allegations regarding issues faced by delivery partners of food delivery apps. These include low payouts, opaque payout calculations and alleged cheating, unexplained differences in surge rates, order clubbing and assignments to avoid incentive pay, and zone extensions to avoid return bonuses. Swiggy and Zomato, which offer delivery work to more than 360,000 gig workers, have insisted that earnings per order are much higher than alleged, and that full-time personnel earn over ₹20,000 per month.
India’s gig economy is among the few sectors offering flexible work to millions and is recognised as a growth sector. It is important, therefore, to examine these grievances and design policy mechanisms that protect worker rights.
Many of the grievances arise from a trust deficit between gig workers and the platforms. India has traditionally protected workers through heavy-handed industrial regulation and archaic labour laws, which suit the factory floor. Such laws are irrelevant, insufficient, and ineffective in addressing disputes that originate on these platforms.
With the apparent oversupply of gig workers, the platform’s incentive is to deliver orders at the lowest marginal cost (a large component of which is gig worker fees) while keeping the customer happy. This task is assigned to algorithms. An analysis of the grievances suggests that many are linked to the way gig work is assigned (denial of high-profit surge or incentive-linked orders), performed (clubbing orders, zone boundaries), and rewarded (complex, multi-factor payment calculations).
There are several factors in each of these algorithmic decisions. Work allocation can be based on weather, restaurant and customer locations, traffic, prevailing wages, and the available worker pool. The algorithms that make these decisions are flexible, learning algorithms that can account for the constantly changing inputs. Machine learning techniques support millions of orders every day.
Crucially, most of the inner workings of these techniques are unknowable, even to the engineers who design them. Such algorithms usually include biases based on their training data. For example, a profit-maximising algorithm may deny orders to gig workers that are eligible for incentives, even without being programmed to do so.
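To illustrate how such a bias can emerge without anyone programming it, consider this minimal, entirely hypothetical sketch (the names, fees, and threshold are invented, not any platform's actual logic): a dispatcher that only minimises expected payout ends up avoiding workers who are one order away from a daily bonus.

```python
# Hypothetical sketch: a greedy, cost-minimising dispatcher that is never
# told about incentives, yet systematically passes over workers who are
# close to an incentive threshold. All numbers are invented.

from dataclasses import dataclass

@dataclass
class Worker:
    name: str
    base_fee: float    # fee the platform pays this worker per order
    orders_today: int  # orders completed so far today

INCENTIVE_THRESHOLD = 10  # assumed: flat bonus after 10 orders a day
INCENTIVE_BONUS = 50.0    # assumed bonus amount (rupees)

def expected_cost(worker: Worker) -> float:
    """Platform's expected payout if this worker takes the next order."""
    cost = worker.base_fee
    # A cost model fitted on historical payouts "learns" that workers near
    # the threshold end up costing more, so it prices them higher...
    if worker.orders_today + 1 >= INCENTIVE_THRESHOLD:
        cost += INCENTIVE_BONUS
    return cost

def assign_order(pool: list[Worker]) -> Worker:
    # ...and the dispatcher simply picks the cheapest worker.
    return min(pool, key=expected_cost)

pool = [Worker("A", 30.0, 9), Worker("B", 35.0, 2)]
chosen = assign_order(pool)
print(chosen.name)  # prints "B": worker A, one order from the bonus, is denied
```

Nothing in `assign_order` mentions incentives; the denial is an emergent side effect of cost minimisation, which is exactly why such behaviour is hard to detect from the outside.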
However, outdated, static mechanisms such as grievance redressal officers or onerous labour laws cannot keep pace with the gig economy. Instead, the power of technology must be harnessed to improve trust between platforms and gig workers.
Algorithm audits are one such technique, where an auditor gets access to the algorithms and examines the results they produce. Suitably qualified auditors can uncover implicit or explicit biases or other shortcomings of such algorithms. Another technique is the use of “sock puppets”, where researchers use computer programmes to impersonate user accounts. Auditors can use these accounts to identify instances where the platform algorithms produce undesirable results. Other auditing techniques can also be used.
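A sock-puppet audit can be sketched in a few lines. In this hypothetical example (the function `quote_fee` merely simulates a platform, with a bias deliberately injected so the audit has something to find), several scripted accounts request the same order and the auditor flags any account quoted well below the median:

```python
# Hypothetical "sock puppet" audit sketch. Several scripted accounts place
# an identical test order; the auditor compares the fee each is offered.
# quote_fee() stands in for the platform under audit and is simulated here
# with an injected bias (lower pay in "zone-X").

import statistics

def quote_fee(account: dict, order: dict) -> float:
    # Simulated platform: quietly pays less to accounts in zone-X.
    fee = 40.0
    if account["zone"] == "zone-X":
        fee -= 8.0
    return fee

def audit(accounts: list[dict], order: dict, tolerance: float = 1.0) -> list[dict]:
    """Flag accounts quoted noticeably below the median for the same order."""
    quotes = {a["id"]: quote_fee(a, order) for a in accounts}
    median = statistics.median(quotes.values())
    return [a for a in accounts if median - quotes[a["id"]] > tolerance]

puppets = [
    {"id": "p1", "zone": "zone-X"},
    {"id": "p2", "zone": "zone-Y"},
    {"id": "p3", "zone": "zone-Y"},
]
order = {"pickup": "restaurant-1", "drop": "customer-1"}
flagged = audit(puppets, order)
print([a["id"] for a in flagged])  # prints ['p1']: the zone-X account is underpaid
```

The design choice here is that the auditor never needs to see the platform's internals: identical requests from controlled accounts make differential treatment directly observable from the outputs alone.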
In a competitive marketplace, informed consumers can prioritise ordering from platforms that subject themselves to such audits. Workers may also choose to work for more transparent platforms. Regulators can examine work conditions as a function of the work allocation, performance, and rewards related to each gig, and mandate transparency related to each of these.
If successful, this approach can be replicated in other industries. The divide between algorithm makers, platform creators, the investors that support them, and gig workers is real. Policymaking that mandates transparency can improve trust and ensure the welfare of gig workers without impeding the growth of the gig economy.