When Good Incentives Are Not Enough: The Quest for Financial Data Standardization
Data standardization—the process of developing and implementing rules for how data are described and recorded—offers significant benefits for financial firms and regulators. Nonetheless, the topic has received scant attention from academics, and regulatory efforts to promote standardization lag far behind optimal levels. In a new paper, 'The Data Standardization Challenge,' we examine the myriad public and private benefits of standardization and the many frictions favoring the status quo. We further use data standardization as a case study that illuminates the banal but meaningful forces that often lead to significant gaps between optimal and actual financial regulatory policies.
Standards are ubiquitous in our lives. For example, consumers and industry alike routinely use scanners enabled by standard bar-code technology to identify items for purchase, assembly, inventory management, or shipping. By making identification unambiguous, that technology not only improves accuracy and speed and cuts costs; it has also helped to revolutionize supply-chain management.
In finance, data standardization takes many forms. One involves the use of entity identifiers to establish with precision who is who, who owns whom, and who owns what. The Legal Entity Identifier, or LEI, is a data standard—like a bar code for precisely identifying parties to financial transactions. Other standards help to identify financial instruments—what is owned or borrowed. Examples include the International Securities Identification Number (ISIN) for securities, and the Unique Trade Identifier (UTI) and Unique Product Identifier (UPI) for derivatives transactions.
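The bar-code analogy is quite literal. Under ISO 17442, an LEI is a 20-character alphanumeric code whose final two characters are check digits computed under the ISO/IEC 7064 MOD 97-10 scheme (the same checksum family used for IBANs), so a machine can catch a mistyped identifier before it pollutes a dataset. The sketch below is ours, not part of the standard's reference tooling; the function name is our own, and the test value is Apple Inc.'s widely published LEI.

```python
# Minimal sketch: verifying the checksum of a Legal Entity Identifier (LEI).
# Per ISO 17442, an LEI is 20 alphanumeric characters; the final two are
# check digits under ISO/IEC 7064 MOD 97-10.

def is_valid_lei(lei: str) -> bool:
    """Return True if `lei` has the right shape and a valid checksum."""
    lei = lei.strip().upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Map each character to its base-36 value (0-9 stay 0-9, A-Z become
    # 10-35), concatenate the decimal values, and require the resulting
    # integer to equal 1 modulo 97.
    as_digits = "".join(str(int(ch, 36)) for ch in lei)
    return int(as_digits) % 97 == 1

assert is_valid_lei("HWUPKR0MPOU8FGXBT394")      # a published LEI passes
assert not is_valid_lei("HWUPKR0MPOU8FGXBT395")  # a one-character typo fails
```

The Global Legal Entity Identifier Foundation (GLEIF) publishes the full index of issued LEIs and their reference data, which is what makes this kind of automated validation and entity lookup feasible at scale.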
Standardization is essential to compare and aggregate data and can enable firms and regulators to produce more accurate and timely information about a host of issues at lower expense. Firms benefit by better understanding their clients' needs and by pricing and managing risk more effectively. Regulators benefit by being better able to identify trends across firms and markets, and potentially to assess the local and systemic risks associated with those trends. Standardization also facilitates data sharing, so that firms and regulators can use and understand the same data within and across jurisdictions, reducing redundancy and misunderstanding.
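A toy sketch, using made-up records and placeholder identifiers, shows why a shared identifier matters for aggregation: two reporting streams that spell an entity's name differently can still be combined exactly, because they carry the same LEI.

```python
from collections import defaultdict

# Made-up reports from two reporting streams. The entity names are spelled
# inconsistently, but matching records carry the same (placeholder) LEI.
swap_reports = [
    {"lei": "EXAMPLELEI0000000001", "name": "Alpha Bank N.A.", "notional_usd": 250.0},
    {"lei": "EXAMPLELEI0000000002", "name": "Beta Holdings", "notional_usd": 100.0},
]
securities_reports = [
    {"lei": "EXAMPLELEI0000000001", "name": "ALPHA BANK NATL ASSOC", "exposure_usd": 75.0},
]

# Keying on the name would split "Alpha Bank N.A." and "ALPHA BANK NATL ASSOC"
# into two entities; keying on the LEI aggregates them correctly.
exposure_by_entity = defaultdict(float)
for report in swap_reports:
    exposure_by_entity[report["lei"]] += report["notional_usd"]
for report in securities_reports:
    exposure_by_entity[report["lei"]] += report["exposure_usd"]

print(dict(exposure_by_entity))
# {'EXAMPLELEI0000000001': 325.0, 'EXAMPLELEI0000000002': 100.0}
```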
At its best, data standardization can serve as a crucial ingredient in building and sustaining the trust, accountability, adaptability, and efficiency that are essential for finance. And such standards can be the building blocks for a revolution in regulation and regulatory reporting, just as the bar code has been for industry and consumers.
So if data standardization offers so many benefits, why is it so hard to achieve? Four obstacles stand out. First, the costs of implementation are borne by a few up front, while the benefits are spread widely over time. Standardizing data is not a free good. The costs of developing standards, testing them, retooling firm and regulatory systems to use them, and working out the kinks in implementation are considerable, and they are incurred early on. The benefits follow with a lag, and they are not restricted to those who bear the bulk of the costs. Second, implementing data standards requires coordination among diverse parties, not only government and market actors but also various groups within those broad categories. The problem is thus not only a collective action problem of the kind Mancur Olson identified decades ago, but also a cultural one.
Third, many firms lack the technology and the enterprise-wide data management and governance practices needed to use the data to better identify and manage their risks. Without them, the benefits of data standardization can seem far smaller than the costs. Fourth, in the United States, the balkanized regulatory structure is an obstacle to the adoption of data standards. Regulators are free to specify any standard, or none at all, when requiring data reporting, despite the obvious benefits of a common standard. Adding to the challenge, no higher authority can compel agencies to use a particular standard.
We strongly believe that solving these collective action and related problems is a role for government. Designing and implementing appropriate data standards is an important mechanism through which government can fulfill its dual roles of enhancing the efficiency of private activity and obtaining the high-quality information needed to serve the public at large.
US regulators and industry agree. The Treasury's Office of Financial Research has promoted the use of data standards, and the LEI in particular, since its inception. The Commodity Futures Trading Commission was early to adopt the LEI in swap reporting and is taking the lead in promoting appropriate standards for financial instruments. Since the 'Linchpin' paper was published in 2011, industry has strongly supported standards initiatives and provided leadership in this area. The Data Coalition, a non-profit supported by industry, has advocated for data standards both in financial services and across the federal government. The very recent signing (on January 14) of H.R. 4174, the Foundations for Evidence-Based Policymaking Act of 2018, which includes the OPEN Government Data Act, is evidence that, even in divided government, persistence can persuade the authorities to do the right thing.
In Europe, the EU, ESMA, and other pan-European authorities have required the use of the LEI in all but a handful of reporting requirements, despite the fragmentation that comes with 28 sovereign member states. And even post-Brexit, it seems likely that the UK will retain those requirements. That Europe has moved more quickly than the United States to impose a broadly applicable LEI requirement highlights both the drag of entrenched interests and the way newcomers can leapfrog incumbents. More specifically, Europe is trying to create a Capital Markets Union. In contrast to the United States, where capital markets have long provided roughly two-thirds of all intermediation, Europe remains more reliant on banks, and cross-border capital flows thus remain lower than would seem optimal. This creates the motivation to adopt new rules, at least some of which, like the broad LEI requirement, are closer to the current state of the art than those in the United States.
More broadly, we believe that data standardization provides a lens into some of the less appreciated challenges impeding effective financial regulation. Alongside capture and other common explanations for regulatory failures, we think the example of standards illustrates that coordination problems, delayed benefits, and other banal, but perhaps no less intractable, challenges are often the real impediments to better financial regulation.
As finance moves at ever more dizzying speed, the value of high-quality information and the threats posed by information gaps continue to grow. Given the myriad frictions that stand in the way of optimal policy, leadership, creativity, and a willingness to look to the future and to work across firm, industry, and national boundaries are critical to success. Some progress has been made already. More is needed, and we think possible. Vision and leadership are the only missing ingredients.
Richard Berner is an adjunct professor at the NYU Stern School of Business, and Kathryn Judge is a professor at Columbia Law School.
This post first appeared on the Columbia Law School Blue Sky Blog.