Faculty of Law Blogs / University of Oxford

Regulatory Technology: Eight Policy Recommendations


Eva Micheler
Professor of Law, LSE Law School
Johannes Jiang
PhD Candidate in Law and Research Assistant at Freie Universität Berlin



Recent advances in computer science make it possible to develop technology to facilitate the delivery of regulatory requirements. This technology is referred to as ‘regulatory technology’ or ‘RegTech’. Both the UK Financial Conduct Authority and the Bank of England are currently investigating the use of RegTech. This contribution is based on E Micheler and A Whaley, ‘Regulatory Technology: Replacing Law with Computer Code’ (2019) 20 EBOR 1. It identifies eight policy recommendations to inform the development of RegTech.

The UK Financial Conduct Authority (FCA) and the Bank of England (BoE) have focused their work on RegTech in relation to digital regulatory reporting (DRR). At present each regulated entity delivers reports of certain data points to the regulators. This involves a significant number of manual processes. Both regulators are currently participating in computer science experiments exploring ways of automating the reporting of these data points. In February 2019 the FCA launched the most recent phase of the project. RegTech could, in the future, make use of further automated functionalities, such as smart contracts. Machine learning could be used to help regulators and regulated entities identify risk. Software could be trained to autonomously identify risk as it emerges.
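To make the idea of digital regulatory reporting concrete, the sketch below shows how a reporting obligation might be expressed as machine-readable code so that the relevant data points flow to the regulator without manual processes. All names, fields and the threshold are illustrative assumptions; they are not the rules used in the FCA/BoE pilot.

```python
# A minimal, hypothetical sketch of digital regulatory reporting (DRR):
# a reporting rule expressed as code rather than applied manually.
from dataclasses import dataclass


@dataclass
class Position:
    firm: str
    instrument: str
    notional_gbp: float


# Illustrative rule: positions above this notional must be reported.
# The figure is an assumption made for this example only.
REPORTING_THRESHOLD_GBP = 1_000_000


def build_report(positions):
    """Select the data points a regulator would receive automatically."""
    return [
        {"firm": p.firm, "instrument": p.instrument, "notional_gbp": p.notional_gbp}
        for p in positions
        if p.notional_gbp > REPORTING_THRESHOLD_GBP
    ]


positions = [
    Position("Bank A", "IRS-10Y", 2_500_000.0),
    Position("Bank A", "FX-FWD", 400_000.0),
]
report = build_report(positions)
# Only the first position exceeds the threshold, so one row is reported.
```

The point of the sketch is that once the rule is code, reporting becomes a by-product of the firm's data rather than a separate manual exercise.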

Eight Policy Recommendations

Regulatory technology is only starting to emerge. It is nevertheless possible and useful to identify some of the big-picture issues regulators face in this context and to devise recommendations accordingly.

1. Precision is not an end in itself. Financial regulation currently operates through a combination of high-level flexible principles and granular technical rules. Software, however, cannot at present operate as flexibly as natural language. The FCA/BoE experiments revealed that even rules currently seen as clear and unambiguous may need to be reformulated before they can be converted into code. This may change as technology advances. For now, it means that some high-level standards may not be suitable for translation into code. Policymakers thus need to determine in which contexts high levels of precision are desirable. Even if general principles can be broken down into computer code, such a translation would inevitably remove semantic ambiguities and produce greater precision. In the long term, the use of regulatory technology could well shift regulatory design towards more granular rules overall. This may be desirable, but it should in any case be the result of a deliberate decision by the policymaker rather than an unintended effect.

2. Avoid short-termism. One way in which RegTech systems may make regulation more efficient is by supplying the regulator with real-time information. This could make the regulator nimbler, but it could also have unintended consequences. It could invite short-termism on the part of regulated entities, which could become too focused on real-time reporting, orient their business models accordingly and inadvertently overlook longer-term risks.

3. Know the limits of data-based analyses. Data-based analyses depend on an appropriate choice of underlying data. The performance of machine learning and AI applications may be skewed when biases present in the underlying data sets are converted into automated biases and lead to algorithmic discrimination. There is also a risk that the underlying data does not fit the task. Diagnostic tools based on data collected from male patients can, for instance, cause doctors to overlook female patients presenting with heart attacks. Even with an appropriate choice of data, regulators should be wary of over-reliance on AI and machine learning outcomes, particularly the erroneous assumption that using such technologies automatically yields an objective analysis. Software errors can produce false results even with appropriate data sets. RegTech should be deployed in a way that enables and encourages decision-makers to understand the scope of the data on which an analysis is based and preserves their ability to exercise independent and accountable judgement.

4. Be aware of systemic risk. A particular concern about employing regulatory technology in financial regulation is systemic risk. At present, each regulated entity develops its own compliance approach. The current rules allow for a variety of equally lawful interpretations and compliance strategies, which facilitates diversity in business models within the financial services industry. If the same regulatory technology were used across the entire population of regulated entities, compliance approaches would become standardised, which in turn facilitates herding and increases systemic risk.

5. Technology providers are gatekeepers. Both regulators and regulated entities engage third-party technology providers to assist them in creating regulatory technology. This introduces a new participant into the regulatory space. Those creating and maintaining the technology assume the role of gatekeepers. They have business interests of their own that do not necessarily align with the public interest. Over time this may justify regulatory intervention aimed at safeguarding their integrity. Overseeing these gatekeepers seems straightforward now: regulatory technology is developed by a vibrant market of small start-ups. In the background, however, a small number of large companies dominate the market for data analysis and AI. They are potentially also interested in serving the financial services industry, and one of their growth strategies is to acquire successful small technology firms. Their market dominance makes it all the more important, but also very difficult, to subject them to regulatory oversight.

6. Beware of regulatory capture. There are good reasons for regulators to have their RegTech teams work closely with regulated entities and the start-up community in developing technology. Collaboration gives regulators access to important expertise. It may, however, lead to regulatory capture. Such problems are normally addressed by allocating decision-making away from those who directly or closely engage with regulated entities. In the context of regulatory technology, regulators need to ensure that those who are removed from engagement with regulated entities and take the policy decisions have access to independent sources of technological expertise. This is all the more important where policy decisions are inseparably connected with questions of software design.

7. Preserve democratic legitimacy and accountability. Translating legal language into computer code involves making policy decisions. These need to remain subject to democratic scrutiny and accountability. Regulators need to ensure that these requirements are preserved when they integrate work from third-party providers.

8. Technology is not a regulatory strategy. The financial crisis has shown the limitations of self-regulatory approaches, which rely on regulated entities aligning their business model with the public interest. After the financial crisis a more involved regulatory style has emerged. There may well be reasons to conclude that this new regime is overly burdensome and does not achieve its intended goals. The advent of regulatory technology itself is not a sufficient reason to support a change of approach. Technology serves those who develop it. There is no reason to believe that regulated entities equipped with regulatory technology will find it easier to incorporate the public interest into their business model.
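The precision problem raised in recommendation 1 can be illustrated with a short sketch. A granular rule translates into code directly, whereas a high-level principle has no unique code equivalent: any translation must hard-code a concrete proxy, and choosing that proxy is a policy decision, not a technical one. The fee cap below is an invented figure used purely for illustration.

```python
# Hypothetical sketch: granular rules convert to code cleanly,
# high-level principles do not.

def breaches_granular_rule(exposure_gbp: float, limit_gbp: float) -> bool:
    """A precise rule ('exposure must not exceed the limit') converts
    into code without losing meaning."""
    return exposure_gbp > limit_gbp


def breaches_fairness_principle(fee_gbp: float) -> bool:
    """'Treat customers fairly' has no unique code equivalent. Encoding
    it requires picking a concrete proxy; here an assumed fee cap.
    Fixing that proxy removes the principle's deliberate flexibility,
    which is itself a policy decision."""
    ASSUMED_FAIR_FEE_CAP_GBP = 50.0  # illustrative assumption, not a real rule
    return fee_gbp > ASSUMED_FAIR_FEE_CAP_GBP
```

The first function captures its rule completely; the second only captures one possible reading of the principle, which is why translating principles into code changes the regulation itself.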

Eva Micheler is an Associate Professor (Reader) in Law at the Department of Law, London School of Economics and Political Science (LSE).

Johannes Jiang is an LLM candidate at the Department of Law, London School of Economics and Political Science, and a Research Assistant at the LSE Systemic Risk Centre.

