On June 20, RIMES hosted its third Client Conference in Boston. For the first session of the event, Jonathan Hammond, Partner at Sionic (formerly Catalyst), presented a sneak-peek of the findings of the annual RIMES client survey. He was then joined on stage by Mark Rothermel, Lead Manager AVP Performance and Analysis at T. Rowe Price, and Andrew Barnett, Global Head of Product Strategy at RIMES, to discuss the findings. What follows is a summary of the points raised during the debate.
Every year, RIMES interviews clients from across the financial services sector, representing a spread of firms from investment and wealth management through to custodians and asset servicers. This annual poll provides insights into global data management trends. Ahead of this year’s report, Jonathan Hammond previewed the following findings:
- Data management teams are mostly small. Data teams in the Americas tend to be smaller than in other regions, with nearly 70% of data teams comprising just 10 employees, compared with about 45% in Europe, the Middle East and Africa (EMEA).
- Only 8% of data management teams report to a Chief Data Officer (CDO). Just over half of data management teams report to Operations or the Chief Operations Officer (COO), but there are a variety of other reporting lines in existence, such as technology (12%).
- Data quality and cost are key priorities. Approximately 65% of firms have issues with data quality and expect to prioritize this in 2019, while 58% of firms plan to centralize data sourcing and distribution in 2019.
- Data governance is a cause for concern. Most firms surveyed agree that lack of data governance/ownership or inconsistency across data platforms are the top data challenges they face.
- There’s little appetite for blockchain or AI. Half of the firms showed no interest in blockchain whatsoever, and 25% of firms did not consider AI relevant to them. Data analytics/visualization and APIs have the most penetration, with a significant number of firms (45%) interested in their use.
During the discussion that followed, the panellists debated the possible reasons why there is such a wide variation in data operating models around the world. These reasons included different organizational cultures and the different technology lineages in use at companies. It was also noted that as financial data is difficult to centralize, and there are at times no ‘owners’ of the data, it is challenging to put in place any sort of standardized model.
The size of data management teams
With regard to the difference in the size of data management teams in the US compared to EMEA, the panellists agreed that the teams likely perform different functions. In the US, for example, certain data management functions might sit in other parts of the business not covered by the survey.
Another possible reason is cultural. In the UK, which was represented in the EMEA findings, there is a tendency for legacy data management projects to remain in place even as new consolidation projects are launched. This leads to a layering of data functions in the firm, with each requiring staff to operate the model.
Building a data management case
The panel then went on to look at what lessons can be learned for firms’ data management initiatives. The key is to bear in mind that data management isn’t a one-off project and that data managers need to think of ways to prove the value of new projects early on. Data needs to be accessible immediately by the people who can use it to generate value – it’s not acceptable to leave it sitting unused in a data warehouse. One role of the data management team is to ensure data can flow seamlessly across the organization so it can be used quickly.
Data governance comes into view
Another topic raised in the research is the importance of good data governance. The key challenge in formulating a data governance strategy is that it is often unclear who ‘owns’ the data, who the data steward is, and who uses the data. All three must be identified before a good data governance framework can be established.
The good news is that while the business benefits of effective data governance have in the past been difficult to quantify, this is changing. Strong policies and frameworks allow for data federation, which, where good stewardship is in place, can enable self-service, accelerating operations and driving efficiency.
The session concluded with a discussion of data management outsourcing. It was agreed that as many data management processes are non-core, it makes financial and operational sense for firms to outsource them to experts. The more managed data service providers can connect data with other data, the more valuable it becomes to firms. If, for example, a benchmark master can be provided to a firm, packaging the benchmark and reference data together, the firm is then free to focus on other, more valuable tasks.