Volumes, complexity and regulation of data are growing fast on the buy-side, creating significant challenges for all firms. The recent RIMES Data Governance Forums in Munich and Frankfurt were designed to explore the need for, and the benefits of, ‘best practice’ data management. They showed firms how the buy-side industry currently fares when benchmarked against best practice standards, and what the road towards best practice looks like. Crucially, they showed how better data governance can improve business performance while meeting regulatory requirements.
An important new resource, given to all those taking part, was the new RIMES Data Governance Best Practice Handbook, launched in late 2013 following joint research by RIMES and Investit. Sixteen investment managers in the US, UK and Europe took part, giving insights into both current and best practice.
Forum participants made it clear that within many buy-side firms at present, benchmark data quality is often managed by the performance team. There is a tension between cost reduction and data efficiency, a lack of coherent data ownership, and fragmented or siloed management by data managers who are divorced from core business priorities. There are also licensing issues: many people do not understand that the authority to use data does not extend across every department. They wrongly assume that having the data within the company means it is free for anyone to use.
There was also general consensus that the buy-side’s current level of data governance maturity lies somewhere between ‘emergent’ and ‘work in progress’. Around half of firms are focusing cost management efforts on data governance. Overall, the drivers for better data governance that emerged were threefold:
- participants felt that their main driver was pressure from data vendors, rather than from the regulators or the current regulatory landscape
- there is a constant drive for further cost reductions
- firms want to mitigate the risk of fines for misuse of data
Complete compliance with data licenses was seen as somewhat unrealistic by the attendees; however, data vendors are increasingly coming into firms to audit data usage against contract terms. Improved data governance is an effective way to ensure that audits do not uncover problems, and it also allows firms to better manage and reduce data-related costs.
Many steps are involved in data governance and processing, and asset managers tend to spend less time on governance and more on processing. Revealingly, 42% of Tier 1 participants do not monitor or control the use of data in a systematic way.
The forums confirmed that some firms are on the way to establishing a central global team for data governance, but the practicalities of implementation are demanding and there is a need for detailed guidance. Data management teams would like to be closer to the new business development and sales teams; too often a new mandate comes in without other factors having been considered and without an answer to a key question: ‘Is it financially viable to take on board this new client?’
Forum attendees felt their companies had too many ‘strategic’ clients, and the new business development teams are often not concerned with where data comes from: they simply want to acquire the mandate. There is often little or no notice that new data is required – it needs to be onboarded immediately. There was a concern that this leaves data management teams no time to negotiate contracts, which puts them at the mercy of the data vendor.
Decommissioning of data is also often neglected. Firms are good at onboarding new data, but can struggle with decommissioning when data is no longer required. As one participant commented, the most drastic measure is simply to turn off a data feed and see who complains, at which point a firm finds out who is using data and which feeds are redundant.
Benchmark Data Governance
The forums outlined many key strategies that need to be in place to ensure best practice data governance. Examples included the following:
Organization: A central global team with support from the appropriate executive authority. Consensus on where responsibility lies and authority to enforce governance.
Procurement: A formal process to justify acquisition, with a business case and a TCO approach to cost management. A firm-wide cost allocation model.
Usage: A decommissioning process for redundant feeds. Regular and formal vendor liaison and a process to monitor redundant, duplicate or inconsistent feeds.
Management Information and Compliance: Processes to confirm that use is aligned with vendor contracts. A maintained directory of benchmark usage.
Decommissioning: Process to decommission benchmarks identified as no longer required.
RIMES Data Governance Service
The RIMES Data Governance Service (RIMES DGS) offers a well-defined route to best practice index and benchmark data governance within an organization, and its key features were outlined during the round-table discussion. In conjunction with RIMES DGS, the Best Practice Handbook defines three key steps to implementing best practice across the organization:
- Taking stock – establishing the starting point and building an accurate and complete view of benchmark usage and costs.
- Getting control – achieving the cleansing stage: reviewing details and usage to identify duplication and redundancy, and ensuring compliance with data originators.
- Moving forward – establishing and institutionalizing ongoing processes for the procurement, usage and decommissioning of benchmarks.
In combination with RIMES DGS, the RIMES Benchmark Data Service® (RIMES BDS®) handles all data collection and data transformation, delivering data directly to internal users across the enterprise.