Finance firms and banks have explored various ways to improve internal data management, and some have now resorted to creating new entities to scrub data and provide tailored feeds. Data centralization makes sense any way you look at it, but perhaps the solution needs to move even further outside their walls.
Data centralization is always a good idea
Irrespective of industry, many companies struggle with their data management activities. Information systems have become more sophisticated, and collecting and sorting data from disparate feeds and systems is no easy task. According to Effective Database Management, having a centralized system that joins together all these feeds and datasets in one location is usually the first step in the right direction. There are numerous benefits to streamlining data management under one system.
“Data is only as useful as its accuracy.”
The first and foremost consideration should be data integrity. Data is only as useful as its accuracy and the ease with which it can be accessed. When organizations receive information from multiple feeds, data integrity issues come to light. In the finance industry, asset managers face this dilemma when receiving benchmark and index data from more than one provider. As a result, companies are required to scrub, sort and manage the data internally, which opens up the possibility for all sorts of errors and inefficiencies. Data centralization, on the other hand, lends itself to easier data access, use, verification and reconciliation. And with all eyes on one system, should data management processes require amendment, that effort is much easier to make on a single supported platform.
Finance firms are beginning to understand
In the financial industry, the need for better data management is obvious. Giving credence to this notion is the fact that investment banks Morgan Stanley, JPMorgan Chase and Goldman Sachs are currently working together to create their own data management company. The purpose of the new entity will be to gather and clean reference data for more effective use. The Wall Street Journal pointed out that in addition to improving the access and use of financial data, the new company will also significantly reduce the amount of money these firms spend on internal data management.
This news speaks to the need today for better control of data in the financial industry. Asset managers already have to manage immense index and benchmark datasets internally, taking attention away from value-added activities. Now, in the face of increasing data governance regulation, accuracy and transparency are more important than ever. Additionally, given the high cost of internal data management, especially at a time when finance firms should focus on staying lean, finding alternative solutions is warranted.
The Wall Street Journal noted that the project, called Securities Product Reference Data (SPRD), should be fully operational in the next six to 12 months. The new entity will focus on providing reference data on financial instruments, starting with equity, derivatives and fixed income data. Taking on a function that firms and banks have typically performed in-house, SPRD aims to improve the process by which data is gathered, scrubbed and made consistent for use across an organization.
Is there an even easier way?
Since the financial crisis of 2008, finance firms have banded together on several occasions to create new entities, save money and meet regulatory requirements. The Wall Street Journal pointed out that data management spending has typically run in the tens of millions for finance firms, and up to hundreds of millions for large banks. SPRD is meant to help bring some of those costs down.
SPRD will create tailored data feeds for clients using the same sources of data that firms and banks already rely on. Essentially, the new entity will take over the scrubbing activities for the investment banks involved, and reference data will be delivered ready to use. That data should allow banks to price trades more accurately and avoid the risk and compliance issues that can result from inconsistent information. U.S. and European regulators have already put increasing pressure on banks to collaborate and improve data accuracy, the news source noted.
However, it is important to point out that creating new entities to scrub and manage data for the purpose of receiving tailored datasets is not the only option out there. Certain companies already specialize in managed data services and are able to provide finance firms and banks with tailored data feeds. Outsourcing data management makes a lot of sense in the financial industry because organizations want to become leaner. As such, asset managers should consider relinquishing control of index and benchmark data to third-party entities, given that these companies already exist and are good at what they do.
Ultimately, data management should put accuracy first, and cost second. If outsourced providers are able to deliver reliable data at reduced cost, why not rely on their services?