
Tech Platforms as Tribunals

Author(s)

Rory Van Loo
Professor of Law, Boston University


This post is part of a special series including contributions to the OBLB Annual Conference 2022 on ‘Personalized Law—Law by Algorithm’, held in Oxford on 16 June 2022. This post comes from Rory Van Loo, who participated on the panel on ‘Law by Algorithm’.

‘Amazon is the judge, the jury, and the executioner,’ according to one merchant, referring to stories of Amazon causing small businesses to fail by suddenly delisting them. The world’s largest social network has been accused of putting hundreds of women in ‘Facebook jail’ by suspending their accounts after they posted ‘men are scum.’ And Google’s denial of one law student’s request to stop putting accusations that she slept her way into Yale Law School at the top of search results led to experiences like a law firm partner telling her in an interview, ‘Well, you're certainly the most Googleable candidate we've ever had.’ The platform ecosystem’s dispute resolution conjures images of a judicial system wielding meaningful sanctions.

These examples and observations are pulled from my articles exploring the analogy of platforms acting as courthouses and proposing procedural rules enforced by regulators. But they underscore the importance of the broader analogical project of ‘law by algorithm.’ And while Eidenmüller and Wagner admirably avoid either dystopian or utopian narratives, their discussion of tech companies’ dispute resolution, and of the broader challenges of an algorithmic society, ultimately builds the case for some optimism about the possibility of governmental accountability for these sometimes high-stakes processes that rely so heavily on algorithmic adjudication.

Some have found it strange to propose regulating companies’ dispute resolution processes because those processes are internal to the platform. But in other contexts where businesses wield significant influence as intermediaries, the law regulates dispute resolution. For example, credit reports can determine whether someone gets a job, receives a loan, or rents an apartment. In light of those stakes, and the costs of errors, providers of credit reports have a legally imposed duty to investigate when consumers report that information submitted by third parties was inaccurate, among other requirements. Credit card companies must also comply with dispute resolution laws requiring them to follow specific procedures when a customer disputes a charge. And cable companies are required to maintain certain minimums of customer service, such as having a real person available on the phone during certain hours.

Of course, simply because something is done does not make it right. It is difficult to know the counterfactuals and to rigorously measure the costs and benefits of such legal interventions. But at a minimum, these other instances show that it is not abnormal for the law to impose dispute resolution mandates on companies. Moreover, these other industries show that companies subjected to mandated dispute resolution can continue to innovate and earn high profits.

To be clear, designing procedural regulation is more complicated for tech companies because of their pace of innovation and the scale of disputes they handle. There are also limits to the judicial analogy. Observers will have differing preferences in balancing the economic, social, and moral considerations required in designing private dispute resolution.

In light of these challenges, while some procedural innovations from the public courts, such as class actions, could be adapted, some automation and creativity will be necessary to apply them at scale. The answer is surely not to take formal procedural rules from public courts and apply them en masse to tech platforms. And not all platforms will require the same rules. For instance, those subject to intense competition are less likely to need rules imposed.

These limitations help underscore the importance of administrative agency involvement. The agency would adapt the procedural rules, including by eliminating regulations that prove too cumbersome or quickly become outdated.

Another source of pushback I have experienced is that dispute resolution by itself will do little to address platforms’ harms to society. I agree, and in various articles I have explored antitrust breakups, administrative agency monitoring, mandated data sharing, a technology-focused agency, and greater network liability. But there are risks in analytically and legislatively tackling these problems in a piecemeal manner, and there is much more to the task beyond these topics. Eidenmüller and Wagner have given us a more comprehensive and integrated treatment to inform the ongoing construction of a new legal architecture for an era of law by algorithm.

Rory Van Loo is a Professor of Law at Boston University.
