Legal standards: an engineer’s perspective

Richard Parnham
Postdoctoral Research Fellow


Early in the spring term, Oxford University’s AI for English Law research team enjoyed a talk delivered by one of Google’s first ten employees. Harvard University’s Ron Dolin, who is a qualified lawyer and holds a PhD in computer science, teaches on the impact of technology on the legal sector. He also invests in lawtech startup companies.

A key element of Dr Dolin’s talk focused on the benefits of applying engineering principles – including quality standards – to the legal sector: a challenging concept, he said, given that such metrics were “horribly lacking”.

One example of an industry-wide standard that would benefit the legal sector, Dr Dolin suggested, would be an agreed set of XML mark-up tags for contracts and court judgments. The failure of existing stakeholders to make use of such tags meant that millions of dollars had to be spent capturing data from these documents retrospectively. “That’s outrageous,” he said.
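Purely by way of illustration (the tag names below are hypothetical and are not drawn from Dr Dolin’s talk or from any existing standard), an agreed mark-up vocabulary would allow key terms to be read straight out of a contract, rather than extracted after the fact:

```python
# A minimal sketch of what an agreed contract mark-up vocabulary might enable.
# The tags used here (contract, party, effective_date, governing_law) are
# hypothetical illustrations, not part of any existing or proposed standard.
import xml.etree.ElementTree as ET

# A contract tagged consistently at the drafting stage...
contract_xml = """
<contract>
  <party role="supplier">Acme Ltd</party>
  <party role="customer">Example LLP</party>
  <effective_date>2020-01-01</effective_date>
  <governing_law>England and Wales</governing_law>
</contract>
"""

# ...can be queried directly, with no retrospective data-capture effort.
root = ET.fromstring(contract_xml)
parties = [(p.get("role"), p.text) for p in root.findall("party")]
print(parties)                         # [('supplier', 'Acme Ltd'), ('customer', 'Example LLP')]
print(root.findtext("governing_law"))  # England and Wales
```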

Moving on to the quality of legal services, Dr Dolin expressed scepticism about the often-suggested trade-off between automated legal services and legal services delivered by lawyers. “First of all, are you measuring how good the human [lawyer] is?” he asked, pointedly. Automated legal services, he suggested, should not be judged against the quality of the reasonable lawyer. At the very least, he suggested, such services should be judged against the quality of services offered by the “worst allowable lawyer.” Indeed, he then went further, suggesting that the quality floor for technology-enabled legal advice should be anything that was better than no advice at all.

Dr Dolin stressed that he was not advocating the complete abolition of quality standards for legal services. Rather, he suggested a level playing field between humans and machines in relation to the minimum quality levels that should be acceptable. Under his suggested “e-certification” regime, online legal service providers would be required to sit the equivalent of a bar exam in whatever area of law they were effectively offering.

Turning to the eDiscovery sector, Dr Dolin expressed his regret that solutions vendors appeared to have ceased competing against each other by reference to definable quality metrics, even though it was “completely do-able” for them to do so. Previously, such comparisons had been undertaken in the US by the National Institute of Standards and Technology (NIST) using the publicly available Enron litigation dataset, he recalled.

Standardised performance benchmarking could also help the expansion of the contracts analytics market, Dr Dolin then suggested. “It’s holding up the market that we don’t have those benchmarks... barriers to adoption just shatter in the face of quality metrics,” he said.

Rounding off his talk, Dr Dolin restated his long-standing call for an independent organisation to step forward and take the lead in developing and – crucially – measuring the ROI impact of quality standards across the legal sector. An organisation such as NIST might be an appropriate vehicle for undertaking this work, he said, because it already has the necessary infrastructure in place. However, the lawtech industry would need to take the lead in requesting that such testing should occur. “They’re not going to come to you,” he added.

Ron Dolin’s paper, Measuring Legal Quality, is available on SSRN.
