Faculty of law blogs / UNIVERSITY OF OXFORD

Create Law or Facts? Smart Cars and Smart Compliance Systems


Sabine Gless
Professor of Law, University of Basel
Emily Silverman
Senior Researcher at the Max Planck Institute for the Study of Crime, Security and Law



Poor driving behaviour has attracted regulation ever since motor vehicles hit the streets, and the increasing prevalence of AI could lead to even more supervision of delinquent drivers. AI currently facilitates a vast array of smart compliance systems that interact with human drivers as they manoeuvre their vehicles from starting point to destination. These systems differ in the degree to which they intervene in the driving process. So-called nudging structures—such as intelligent speed assistance and drowsiness warning systems—populate the lower end of the intervention spectrum; their function is to monitor and/or enhance the performance of human drivers. At the other end of the spectrum are the so-called impossibility structures, such as alcohol interlock devices. These structures are more intrusive; indeed, they may intercede to prevent a human from driving at all. While the primary aim of compliance systems is to promote road safety, they also produce secondary effects, some of which may be profound. If, for example, consumer products such as those embedded in cars warn users of non-compliance but at the same time generate evidence that can be used against them in criminal and/or administrative proceedings, it is not unlikely that the average person’s day-to-day conduct and approach to criminal justice will change.

  1. How far can compliance systems go?

One question connected to the use of smart compliance systems in cars involves the extent to which these devices create new ‘facts on the ground’ that should supplement—or perhaps even replace—traffic rules and law enforcement.

    1. Leaving a Choice to Humans (Nudging structures)

Situated on the low end of the spectrum of smart compliance systems, nudging structures leave to humans the choice of whether or not to obey the legal ‘ought’. If the human in question chooses not to obey, the system’s 100% enforcement policy is activated. For example, a ‘section speed control system’ embedded in traffic infrastructure can autonomously read a vehicle’s licence plate as the vehicle enters a tunnel, read it again when the vehicle exits the tunnel, and subsequently calculate the vehicle’s average speed. If the vehicle’s average speed exceeds the speed limit, a fine is automatically issued by the competent authorities.
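The underlying calculation is simple distance-over-time arithmetic. The following is a minimal sketch of that logic; the function names, the section length, and the 80 km/h limit are illustrative assumptions, not drawn from any actual system:

```python
# Sketch of section speed control: average speed between two camera points.
# All names and values here are illustrative assumptions, not a real system's API.

def average_speed_kmh(section_km: float, entry_s: float, exit_s: float) -> float:
    """Average speed over a tunnel section, from entry/exit timestamps in seconds."""
    hours = (exit_s - entry_s) / 3600.0
    return section_km / hours

def should_fine(section_km: float, entry_s: float, exit_s: float,
                limit_kmh: float = 80.0) -> bool:
    """True if the vehicle's average speed over the section exceeds the limit."""
    return average_speed_kmh(section_km, entry_s, exit_s) > limit_kmh

# A 2 km tunnel crossed in 72 seconds means a 100 km/h average speed,
# which exceeds an assumed 80 km/h limit.
print(should_fine(2.0, 0.0, 72.0))  # -> True
```

Because only the average over the section matters, briefly slowing down for a camera does not help a speeding driver, which is precisely what makes the 100% enforcement policy credible.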

Obviously, today’s cars are not reliant on external compliance systems to determine their speed. A contemporary car could easily measure its own speed and, if the speed proves to be excessive, alert the driver by indicating the speed limit and/or the vehicle’s current speed (as many cars already do). Increasingly serious measures, culminating in the issuance of a speeding ticket, could be taken if the driver ignores the warnings issued by the car.
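Such an escalation could be modelled as a simple ladder from warning to sanction. The thresholds and the response ladder below are purely illustrative assumptions, intended only to make the idea of ‘increasingly serious measures’ concrete:

```python
# Sketch of in-car escalation after repeated ignored speed warnings.
# The thresholds and the response ladder are illustrative assumptions.

def enforcement_response(ignored_warnings: int) -> str:
    """Map the number of ignored speed-limit warnings to an escalating response."""
    if ignored_warnings == 0:
        return "display speed limit"
    elif ignored_warnings < 3:
        return "audible warning"
    elif ignored_warnings < 5:
        return "log event to data recorder"
    else:
        return "issue speeding ticket (pending officer verification)"

print(enforcement_response(0))  # -> display speed limit
print(enforcement_response(6))  # -> issue speeding ticket (pending officer verification)
```

Where exactly on this ladder a system stops is the legal question: everything up to logging is a nudging structure, while automatic ticketing shades into enforcement.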

    2. Legality by Design (Impossibility structures)

Impossibility structures are located at the high end of the spectrum of smart compliance systems (for further information see Michael L. Rich, ‘Should We Make Crime Impossible’, and Christina M. Mulligan, ‘Perfect Enforcement of Law: When to Limit and When to Use Technology’). These structures, which include alcohol interlock devices, bypass human control by harnessing their mechanical capabilities to preclude violations of the legal ‘ought’ entirely. They operate on the basis of a so-called legality-by-design strategy (it should be noted that legality by design is not a novel option available only to smart compliance systems).

Alcohol interlock devices work by measuring a potential driver’s breath alcohol concentration. If the device detects an alcohol concentration in excess of a certain predetermined level, it activates the vehicle’s immobilizer and prevents the engine from starting. In the past, the installation of alcohol interlocks for all vehicles was required by law in only a few European countries (Sweden, Finland, and France); other countries (Belgium, Poland, and the Netherlands) treated alcohol interlocks as an administrative sanction that could be imposed on drivers convicted of drunk driving. (Regulation in the United States differs from state to state.) Today, Art 6 (1)(b) of the Regulation (EU) 2019/2144 requires all motor vehicles type-approved in the EU to be equipped with alcohol interlock installation facilitation.
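At its core, the interlock’s legality-by-design logic is a threshold check performed before ignition. The sketch below is a simplified illustration; the 0.25 mg/L limit and all names are assumptions, as real devices are calibrated to national limits and follow certified procedures:

```python
# Sketch of an alcohol interlock's core check. The threshold and all names
# are illustrative assumptions; real devices follow national calibration rules.

BREATH_ALCOHOL_LIMIT_MG_L = 0.25  # assumed limit: mg of alcohol per litre of breath

def engine_start_permitted(breath_alcohol_mg_l: float,
                           limit: float = BREATH_ALCOHOL_LIMIT_MG_L) -> bool:
    """Return False (immobilizer stays engaged) if the sample exceeds the limit."""
    return breath_alcohol_mg_l <= limit

print(engine_start_permitted(0.10))  # under the limit -> True, engine may start
print(engine_start_permitted(0.40))  # over the limit  -> False, engine stays locked
```

The point of the example is the structure, not the numbers: unlike a nudging system, the check leaves the driver no opportunity to ignore it.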

    3. Law Enforcement by Vehicles?

Given the existence of smart compliance systems and the emergence of public-private partnerships, an obvious question to ask is why governments are not outsourcing ‘speed ticketing’. Some businesses are already offering automated traffic enforcement services, in which the entire process, from camera capture to ticketing, is automated and requires only ‘police officer verification’.

The next logical step would be for cars to measure their own speed and to issue tickets accordingly (still, of course, requiring police officer verification). However silly at first blush, or dangerously invasive, this may appear, it would seem to be feasible, given that the stages preceding ticketing are already handled by smart cars and that the necessary systems, such as intelligent speed assistance, are mandatory in new models.

This shows that driving automation, although associated with greater safety on public streets, also poses many new risks, including safety risks associated with human-machine interactions; these, in turn, can have many consequences (foreseen as well as unforeseen). For example, steering a vehicle requires close cooperation between human drivers and assistance systems, but a driver might overestimate the abilities of such systems and, as a consequence, be unprepared to respond immediately to a takeover request, should such a request be issued. This is one reason why Art 6 of Regulation (EU) 2019/2144 requires cars to be fitted with safety-enhancing vehicle systems, including a driver drowsiness and attention warning system and an event data recorder.

Another tangible result of these systems being installed in new cars is a huge new pool of autonomously gathered evidence, including drowsiness alerts recorded in data storage devices. These data storage devices can be made accessible to law enforcement agencies (including both administrative and criminal justice personnel), as has been done in Germany (see sec 63a, para 2 of the German Traffic Law).

  2. What to Consider on the Road Ahead

Smart compliance systems potentially represent new ways of bypassing the human choice of whether to comply with the law and of transferring decision-making authority to machines. This leads to many possible legal challenges, which—depending on the smart compliance system at issue—vary in their effect on the criminal justice system as it is known today.

Regarding impossibility structures, questions arise concerning a possible human right to freedom of choice. Arguably, there is even a right to choose to engage in wrongdoing.

Regarding punishment potentially being handed down by a car, questions arise concerning various aspects of the right to a fair trial, the right to due process, and (possibly) the right to adjudication by a human judge.

Regarding constant monitoring (with the aim of gathering evidence), questions arise concerning several different rights, including the right to privacy (right to be let alone), the right to remain silent (nemo tenetur), the presumption of innocence (in dubio pro reo), and the right to confront incriminating evidence (see Gless, ‘AI in the Courtroom’).

Above all, the right to privacy must be taken into consideration. Traditionally, privacy rights have no place in criminal proceedings. But certain smart compliance systems are so clearly reminiscent of George Orwell’s 1984, where telescreens, hidden cameras, and hidden microphones were used to spy on Oceania’s citizens, that the importance of privacy rights must be acknowledged. After all, the clocks may not yet have struck thirteen, but AI has left us with many issues to resolve, and time is ticking.

Sabine Gless is a Professor of Criminal Law and Criminal Proceedings at the University of Basel.

Emily Silverman is a Senior Researcher at the Max Planck Institute for the Study of Crime, Security and Law.

This post is published as part of the series ‘Smart Compliance Systems in the AI Era: Combining Criminal and Administrative Measures’ and is a contribution from the symposium ‘Smart Compliance Systems in the AI Era: Combining Criminal and Administrative Measures’ co-organised by Bar-Ilan Lab for Law, Data-Science and Digital Ethics and Ono Academic College in December 2022.


With the support of