The effective and efficient management of index and benchmark data has become a key operational challenge for investment managers worldwide.
Growing concerns over the complexity, volume and rising cost of benchmark provision are focusing efforts on striking a balance between service quality and cost control, and on ensuring that benchmark data is used in accordance with licensing agreements. Additional external pressure comes from an increasing regulatory focus on the quality of data used to support key processes and reporting, and on the overall control environment. In combination, these factors are forcing managers to review their approach to managing index and benchmark data.
In June 2011 Investit published a research report on industry issues and trends in the area of index and benchmark management. The report highlighted three key findings:
- There is a broad consensus among managers that it is essential to have systems and processes that ensure the quality and consistency of data, a managed daily delivery service and automatic formatting of data from multiple sources.
- There is serious concern over the cost of benchmark data management; however, the focus is on the external cost of data licenses, deflecting attention from the internal cost of resources and technology, which is typically a multiple of the license costs.
- There is a need for an objective mechanism to measure the maturity of managers' processes, replacing subjective internal perspectives that distort the true picture.
A second study followed in February 2012 involving 18 investment management firms in the US, UK and Europe. In this study we developed the ideas presented in the original report into a Best Practice Framework for benchmark data management, including a description of how current market practice compares with best practice.
This study revealed a general weakness in the area of data governance, in particular around Usage Management and Decommissioning: an absence of robust business processes to monitor and rationalize the use of benchmarks.
Overall, these weaknesses expose a general lack of control over the acquisition, usage and retention of benchmarks. The likely result is that benchmarks no longer in use continue to be maintained and that benchmarks are duplicated, with a corresponding impact on operational efficiency. The consequence is unnecessary data license fees and internal processing costs; in other words, the Total Cost of Ownership of benchmark data can be significantly higher than it needs to be. Additionally, without appropriate controls, benchmarks may be used outside the terms of their licensing agreements, potentially exposing firms to additional license fees. Furthermore, it becomes harder to demonstrate compliance with regulatory obligations.