The RIMES Forum held in Toronto last September addressed two key issues identified in a recent Deloitte study of data management: benchmark/index data management and its related costs.
The forum was an opportunity to address both the challenges and the opportunities facing firms with respect to validation and governance. It also provided a platform for RIMES to share case studies and its growing range of services designed to improve management efficiency and reduce costs.
‘. . . we’ve reached a critical milestone . . . moving from the theoretical, we’ve embarked on a lot of research, undertaken empirical analysis and met with clients to come up with more effective management tools, giving you an opportunity to take more control of the data management process.’ Toronto Forum attendee
Data governance is one of the key pillars emerging from the recent Deloitte study. At a time when index prices are rising, benchmarks are becoming increasingly complex and data usage is becoming more constrained under partner licensing, many firms struggle to track and integrate these changes. Regulators are also paying closer attention, an observation that stimulated considerable debate among delegates concerned with the costs of benchmarking performance and the ways certain firms seek to allocate or discount those costs. While firms are good at buying data, managing its usage is more challenging: ensuring that they pay only for the benchmarks they should, and decommissioning those that no longer support funds or mandates. The challenge is to monitor and review the information one holds and to establish a clear locus of responsibility and authority.
The research showed that 42% of tier-one participants did not monitor or control their use of data. Data usage can change over time, with different teams drawing on the same data for different reasons and ownership becoming more obscure or remote. In some firms, depending on size and structure, data management and processing reside in the same team, and delegates pointed out that while a number of teams might be using benchmark data, there was often no clear record of which teams these were. It was also observed that firms could be paying for the same data multiple times, and that a centralized IT function was needed to integrate this data and reduce usage anomalies, in the interests of both cost and compliance. Best practice demands that data usage complies with contractual obligations and that redundant data is removed. To achieve this, there must be ‘buy-in’ at the top: recognition that compliance is an imperative and that costs can be reduced.
To address these best-practice challenges, RIMES has recently launched the RIMES Data Governance Service, which comprises three elements. Once a client’s data has been checked and normalized, the Repository stores the client’s files, while the Directory is the catalogue, containing the metadata that can, for example, be interrogated when responding to a regulator. The Governor provides information reports and online tools, developed with the client, to access the Directory and determine how the data is to be governed: how it is monitored and used, and by whom it is owned. The facility is built in-house, hosted in the cloud alongside the client’s FTP folder, and allows the client to do its own reporting or have RIMES report on its behalf.
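To make the Repository/Directory distinction concrete, here is a minimal illustrative sketch of a metadata catalogue that can be interrogated, say, for a regulatory query or a decommissioning review. All class names, fields and data are hypothetical and do not reflect RIMES’s actual schema or service.

```python
from dataclasses import dataclass

# Illustrative only: models the concepts described above (a Directory of
# governance metadata over files held in a Repository), not RIMES's product.

@dataclass
class BenchmarkRecord:
    """One catalogued benchmark feed and its governance metadata."""
    benchmark_id: str
    source: str          # data partner, e.g. an index provider
    owner_team: str      # who is accountable for this feed
    in_use: bool = True  # flag feeds that no longer support a fund/mandate

class Directory:
    """Metadata catalogue over the files held in the Repository."""
    def __init__(self):
        self._records = {}

    def register(self, record: BenchmarkRecord):
        self._records[record.benchmark_id] = record

    def query(self, **criteria):
        """Interrogate the catalogue, e.g. when responding to a regulator."""
        return [r for r in self._records.values()
                if all(getattr(r, k) == v for k, v in criteria.items())]

# Example: list decommission candidates (feeds no longer in use).
directory = Directory()
directory.register(BenchmarkRecord("IDX-1", "PartnerA", "Equities"))
directory.register(BenchmarkRecord("IDX-2", "PartnerB", "Fixed Income", in_use=False))
print([r.benchmark_id for r in directory.query(in_use=False)])  # ['IDX-2']
```

The point of the sketch is the separation of concerns the article describes: the files themselves live in one place, while a queryable layer of metadata (source, ownership, usage status) sits above them.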
‘. . . there has been an emphasis for RIMES in particular to come up with more effective data management tools, giving you an opportunity to take more control of the data management process.’ RIMES
The Governor works like an intelligence centre: a reporting tool that allows clients to tap into their Directory, using it to generate reports or as a dashboard. A Master Report highlights the data source and partner, represents the family breakdown and, most importantly, distinguishes index-level from constituent-level benchmarks, across both the standard benchmarks consumed and any custom indexes. A Usage Count Manager identifies which fund manager is using which indexes, tracking the data flow in detail. The Geographical Usage Report provides a broader perspective on index usage by location, whether local, national or international.
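The two usage reports described above amount to aggregations over a log of who used which index, and where. The following sketch shows that idea in miniature; the log format, function names and sample data are all hypothetical and not part of any RIMES interface.

```python
from collections import Counter

# Hypothetical usage log: (fund_manager, index, location) tuples.
# The report names echo the article's descriptions; the logic is illustrative.
usage_log = [
    ("ManagerA", "IDX-1", "UK"),
    ("ManagerA", "IDX-2", "UK"),
    ("ManagerB", "IDX-1", "US"),
    ("ManagerB", "IDX-1", "US"),
]

def usage_count_report(log):
    """Which fund manager is using which indexes, and how often."""
    return Counter((manager, index) for manager, index, _ in log)

def geographical_usage_report(log):
    """Broader view: index usage broken down by location."""
    return Counter((index, location) for _, index, location in log)

print(usage_count_report(usage_log))
print(geographical_usage_report(usage_log))
```

Counting usage per manager is what makes it possible to spot redundant subscriptions and decommission candidates; the geographical roll-up supports the local/national/international view the article mentions.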
In conclusion, forum attendees were encouraged to work towards four critical milestones on this journey:
- to adopt an enterprise-wide view in addressing both data processing and governance;
- to build a strategic operating platform, able to cope with the increasing complexity and volume of data sets;
- to establish a clearly delineated and transparent locus of responsibility, whether an individual or group;
- to establish a ‘mind-set’ around governance that is embedded, culturally as well as structurally, within the organization.
There were signs that organizations were not only taking the issue of governance more seriously but that it was playing a more central and substantial role within organizational hierarchies.
‘. . . we’re seeing very much a formalization of this enterprise-wide data governance operation . . . firms are starting to hire data governance officers tasked specifically to achieve maximum data governance.’ RIMES