FRTB: BANKS MUST LEARN IT’S ABOUT THE DATA & IT’S GOING TO COST

That FRTB should have happened 20 years ago, and would have ameliorated some of the effects of the 21st century’s financial disasters, highlights its necessity. Still, better late than never.

On 22 March 2018 the Basel Committee finally acted and wisely delayed the introduction of the Fundamental Review of the Trading Book (FRTB), postponing it to 1 January 2022.

The original go-live date of 1 January 2020 was unrealistic, given the MiFID II-sized black hole created for financial institutions in terms of resources and expertise.


The data elements of FRTB are critical to its success as a regulation, yet this appears not to be fully understood in terms of its enterprise-wide impact and application. With FRTB and other regulations, the regulators have shifted the data benchmarks.

Before: Regulators would set a benchmark of what standards must be met.

After: Regulators now define the criteria for how these benchmarks must be met.

This dramatically raises the bar on data sources’ integrity and quality assurance which means resources, effort, and costs must be devoted to understanding what is required and how to meet these requirements.

This means we need to ask exactly what a Bank’s data requirements are going to be. Here we can only scratch the surface.

Question. What does FRTB require in real world data terms?

There are 5 market data aspects that require consideration.

In addition, there are 8 critical features driving the data requirements of FRTB.

Question. What are FRTB’s direct data impacts and consequences?

Question. What are the fundamental data requirements?

Based on the questions asked, there are 4 sets of parameters for data.

Question. What does this mean?

Banks need to address their data requirements from the strategic level down to the desk level. This also requires Banks to understand that data is not only a cost but also a very important resource. New approaches to data sourcing and management are needed; in other words, the existing frameworks for data management across the business are becoming obsolete.

This is driven not only by more intrusive regulatory data requirements, such as FRTB, but also by the advent of ‘The Cloud’, ‘Big Data’, and access to new data sources, in the forms of traditional ‘structured data’ and unconventional ‘unstructured data’.

Question. What needs to be done?

There has been insufficient communication globally about what FRTB means and what is required to implement it. IOSCO and global regulators need to promote FRTB better, especially outside the US, UK, and EU, in markets where there is less knowledge of, and even less understanding of, FRTB.

There is still a lack of preparedness within the financial community, especially in evaluating the data requirements, so there needs to be greater education about what it takes to become FRTB compliant.

Greater awareness of the data impacts of FRTB is needed, especially when benchmarking the data required to be FRTB compliant.

The danger is that Banks, especially those most at risk, will subscribe to cheaper data to populate their FRTB models as a cost-cutting exercise, ignoring the principle of ‘rubbish in, rubbish out’.

Banks must implement a comprehensive, coherent data strategy that avoids over-complication. Emphasis needs to be placed on data sourcing and effective data quality assurance with continuous validation.
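To make the ‘rubbish in, rubbish out’ point concrete, continuous validation of sourced market data can be as simple as automated sanity checks run before a price series reaches a risk model. The sketch below is purely illustrative: the function name, checks, and thresholds are hypothetical choices, not part of the FRTB standard or any vendor’s API.

```python
# Illustrative sketch of continuous data quality checks on a market data
# feed before it is used in a risk model. All names and thresholds here
# are hypothetical, chosen for the example only.

from datetime import date

def validate_price_series(prices: dict, as_of: date,
                          max_staleness_days: int = 5) -> list:
    """Return a list of data-quality issues found in a dated price series."""
    issues = []
    if not prices:
        return ["series is empty"]
    # Stale data: the feed has not updated recently enough.
    last_obs = max(prices)
    if (as_of - last_obs).days > max_staleness_days:
        issues.append("stale: last observation %s" % last_obs)
    # Implausible values: zero or negative prices.
    for d, p in sorted(prices.items()):
        if p <= 0:
            issues.append("non-positive price %s on %s" % (p, d))
    # Flat-lining: a long run of identical prices can indicate a dead feed.
    if len(prices) > 3 and len(set(prices.values())) == 1:
        issues.append("constant series: possible dead feed")
    return issues

# A clean five-day series passes; the same series checked a month later
# is flagged as stale.
series = {date(2021, 1, d): 100.0 + d for d in range(1, 6)}
print(validate_price_series(series, as_of=date(2021, 1, 7)))
print(validate_price_series(series, as_of=date(2021, 2, 1)))
```

Checks like these are cheap to run on every feed delivery, which is what makes validation “continuous” rather than a one-off onboarding exercise.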

Keiren Harris

knharris@datacompliancellc.com

www.datacompliancellc.com

www.marketdata.guru
