
Down by Algorithms? Siphoning Rents, Exploiting Biases and Shaping Preferences – The Dark Side of Personalized Transactions

Author(s)

Gerhard Wagner
Chair for Private Law, Commercial Law and Law and Economics at the Law Faculty of Humboldt University, Berlin

Horst Eidenmueller
Freshfields Professor of Commercial Law, University of Oxford


Time to read

4 Minutes

The rise of big data and artificial intelligence creates novel opportunities for B2C transactions. Businesses assemble, or otherwise gain access to, comprehensive sets of data on consumer preferences, behaviour and resources. Based on the analysis of that data, they profile consumers. This leads to stark and novel information asymmetries: businesses know at least as much about consumers as consumers know about themselves, and sometimes even more. Businesses use smart sales algorithms to market their products and services, microtargeting idiosyncratic consumer preferences with highly personalized offers. As a consequence, it appears that firms have enormous leverage to shape private transactions: knowledge is power.

Much of the existing literature on the effects of these technological developments on B2C transactions assumes that businesses will make “benign” use of the opportunities created by big data and artificial intelligence. It has been suggested, for example, that big data analytics can improve the customer experience and increase satisfaction and loyalty. More generally, personalized online shopping promises a cure for the decision-making paralysis caused by an abundance of options: the more options are available, the less able consumers are to exercise judgment and make a choice. It is here that algorithmic shopping offers real progress: by limiting the options presented to consumers, these algorithms facilitate choice and thus the optimal satisfaction of “real” consumer preferences. This is good news especially for risk-averse consumers: recommendations based on what similar consumers have liked or bought offer a carefree road into new territory. Hence, the bright side of personalized transactions holds the promise of meeting consumer demand more precisely and at lower transaction costs. Scholars have also investigated the potential of data-based personalization of contract law default rules. Businesses, it is hoped, will increasingly offer contract terms that best match consumers’ preferences. All in all, personalizing B2C transactions appears to promise increased efficiency, both at the micro (transactional) and at the macro (societal) level.

By contrast, in our article we seek to systematically explore and understand crucial aspects of a potential dark side of personalized transactions. Big data and artificial intelligence may enable businesses with access to the data and the required technology to effectively personalize their interactions with consumers in order to exploit informational asymmetries and/or consumer biases in novel ways and on an unprecedented scale. Incentives to take advantage of “naïve” or biased consumers certainly exist, and competitive pressures may force businesses to engage in exploitative practices. At first sight, this poses stark challenges both to market efficiency and to individual autonomy and agency.

Our aims are threefold. First, we seek to identify the various effects that personalization has, or may have, on B2C transactions in terms of efficiency, the distribution of rents (in a microeconomic sense), and individual autonomy and agency. Our focus here is on the contract formation stage, although the effects will surely also be felt in contract enforcement and dispute resolution. Second, we analyse whether, and to what extent, there is a regulatory need to counteract identified detrimental effects, by reference to specific regulatory objectives such as efficiency or fairness. Third, we examine the regulatory tools that might be employed to this end and assess their comparative merits. Any regulatory intervention must carefully balance the goal of correcting identified negative effects of personalized B2C transactions against the goal of not stifling innovation and the beneficial effects of big data and artificial intelligence for businesses, consumers, and society as a whole. Our focus here is on contract law. At the same time, the available regulatory tools include self-help remedies: increasingly, consumers are gearing up in a technological and intelligence arms race to protect their private sphere and their bargaining power vis-à-vis businesses.

We identify three aspects of the dark side of personalized B2C transactions as particular areas of concern. First, businesses increasingly engage in first-degree price discrimination, siphoning rents from consumers. Second, firms systematically exploit well-known behavioural biases of consumers, such as their inability to correctly assess the long-term effects of complex transactions or their insufficient will-power. And third, businesses use microtargeted ads and recommendations to shape consumers’ preferences and steer them into a particular consumption pattern, effectively locking them into a lifestyle determined by their past choices and those of like-minded fellows.

At first sight, siphoning rents, exploiting biases and shaping preferences appear to be relatively distinct phenomena. On closer inspection, however, they share a common underlying theme: the potential exploitation of consumers, or at least an impoverishment of their lives, by firms that apply novel and sophisticated technological means to maximize profits. Hence, the dark side of personalized B2C transactions may be characterized as consumers being “brought down by algorithms”: losing transaction surplus, entering into welfare-reducing transactions, and increasingly being trapped in a narrower life.
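A stylized numerical example (the numbers are hypothetical, chosen only for illustration) shows what is at stake with the first of these phenomena. Suppose three consumers value a product at 10, 7 and 4, and the firm’s marginal cost is 3. At a uniform price of 7, consumers 1 and 2 buy: consumer surplus is (10 − 7) + (7 − 7) = 3, producer surplus is 2 × (7 − 3) = 8, and consumer 3 is priced out of the market. Under perfect, first-degree price discrimination, the algorithm quotes each consumer exactly her willingness to pay: all three buy, total surplus rises from 11 to (10 − 3) + (7 − 3) + (4 − 3) = 12, yet consumer surplus falls to zero. The pie may grow, but the firm captures all of it.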

It is unclear whether first-degree price discrimination creates an efficiency problem, but it surely raises concerns of distributive justice. We propose that it should be addressed by a clear and simple warning to the consumer that she is being offered a personalized price and, in addition, by a right to indicate that she does not want to participate in a personalized pricing scheme. Similarly, behavioural biases may or may not lead consumers to conclude inefficient transactions. But it appears that consumers should be given an opportunity to reflect on their choices if these have been induced by firms applying exploitative algorithmic sales techniques. Hence, we propose that consumers should have a right to withdraw from a contract concluded under such conditions; indeed, in many jurisdictions they already have such a right today. Finally, shaping consumers’ preferences through microtargeted ads and recommendations prevents them from experimenting and leading a multifaceted life. Consumers should have a right to opt out of the technological steering mechanisms that firms create and deploy and that impoverish their lives.

An important concern regarding the features of the dark side of personalized transactions discussed in this article arises from the reinforcement of prejudices and the associated discriminatory practices. This concern informs our analysis, but it is not our focus. Siphoning rents, exploiting biases and shaping preferences are problematic partly, or even primarily, for reasons other than their effects on inequality and prejudice in a society. In other words, discriminatory effects might well make these problems worse, but siphoning rents, exploiting biases and shaping preferences would raise problems even if they had no discriminatory effects.

Gerhard Wagner is the Chair for Private Law, Commercial Law and Law and Economics at the Law Faculty of Humboldt University, Berlin.

Horst Eidenmueller is the Freshfields Professor of Commercial Law at the University of Oxford.

