On Monday, December 3, 2018, RIMES’ COO Diarmuid O’Donovan and Buy-Side Technology’s Victor Anderson sat down for a “fireside chat” on the Waters USA main stage in New York City. The conference, an exclusive industry event, has run annually for more than a decade and is the largest forum for CIOs and CTOs in North America.
O’Donovan and Anderson, joined by an audience of senior technology leaders, technology investment decision-makers and C-level executives from the financial services space, came together to discuss recent findings from the Waters Chief Data Officer (CDO) survey. Following is a summary of their discussion.
Spotlight on ESG Data and the Rise of Alternative Content
A key point to emerge from the CDO survey was the evolution of alternative content, particularly ESG data, and the operational challenges of integrating that data into both investment strategies and investment management operations. Despite the ongoing buzz around ESG content, only two hands were raised when O’Donovan asked the audience how many firms were already using ESG content in aggregate.
There was consensus that, as with all benchmarks, success will come down to good data. O’Donovan stressed that, if not handled correctly, the management of ESG data and other emerging alternative content can slow down operations and rapidly accrue high costs.
To avoid these pitfalls, firms first need to ensure they have access to the best research and ratings on the market, a task that can be challenging given the relative immaturity of that market. Second, firms need to be able to tailor these feeds to fit the needs of their destination systems and business processes.
It should be noted that RIMES has increasingly collaborated with clients to integrate ESG indices into their operations, and we are currently working with many of the market’s preferred ESG providers.
Increasing Data Requirements, Mounting Costs, Depleted Resources
While data and technology spending continues to increase, many firms expect people costs to remain flat or decrease. As a result, opportunities to innovate, and therefore differentiate, are often lost by firms that continue to carry the burden of data management internally.
With this in mind, Anderson questioned the extent to which buy-side and sell-side firms could outsource many of the burdensome data management functions, which would be unlikely to provide them with a competitive advantage if managed in-house.
As RIMES has noted previously, the efficiency and accuracy of data management processes may confer some competitive advantage; the incoming data itself, however, confers none, since it is the same for every firm.
O’Donovan pointed out that it is how the data is used that differentiates one firm from another. He noted that the volume and complexity of required data is tripling every year, and that any firm building or adapting its own in-house enterprise data management (EDM) system is likely locked in a constant and costly cycle of rework. The answer is a solution that provides the efficiencies of the utility model with the control of an in-house solution, together with the agility and flexibility to adapt to an ever-changing data environment.
Innovation and Cross-Enterprise Transparency
With the above challenges in mind, Anderson asked O’Donovan to expand upon the importance of a trusted and transparent relationship between vendor, client and end users (the data consumers). Specifically: the extent to which firms can inform their vendors’ product calendars, and how transparent and flexible vendors should be when looking to improve services.
O’Donovan underscored the importance of collaboration between organizations and their vendors, noting that a true partnership between client and service provider can fuel growth strategies and lead to mutual innovation. This means ongoing transparency between both parties to ensure that the vendors’ services continue to scale with the requirements of the client.
Turning up at the end of the year with a renewal form based on a “stale” model is a recipe for disappointment. Further, conducting business in a bubble, with only the gatekeepers at the table, can significantly limit the benefits of outsourcing and stifle innovation.
As a managed data service provider, RIMES sources, collects, validates, remediates, transforms, formats and distributes data, ensuring on an ongoing basis that inbound content is “fit for purpose” for each use case within the organization. To that end, O’Donovan stressed the importance of having key consumers of data at the table and part of the data strategy process, from discovery through implementation and beyond.
Given the time crunch, there were a number of crucial questions from the audience that were not addressed on the main stage. We are pleased to offer this ex post facto Q&A with Diarmuid O’Donovan, and welcome your own thoughts on these topics.
1. How do you architect access to a centralized data source to ensure consistent performance and availability?
We look at this from the point of view of consistent, performant data availability. With the RIMES Managed Data Service (MDS), we have the technology and expertise to manage and maintain data, including alternative data such as ESG research and analysis, from over 350 providers, delivered as a single, consistent source. What makes the service different is that we can tailor data to meet the specific data requirements of each system or end user across the business. We provide the most efficient and agile solution to the challenge of providing consistent, quality data across the business.
The issue with an in-house centralized data solution is that it becomes very complex and costly to maintain for every field required by each team. Taking ‘Country Code’ as an example, the same code type might be used in different ways by different teams: country of incorporation, country of primary listing, or country of risk. Selecting the wrong field or code is a major commercial, regulatory or reputational risk, and adding a new field inevitably requires expensive and scarce project time.
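The ‘Country Code’ ambiguity above can be sketched in a few lines. This is an illustrative example only; the field and team names are invented for the sketch and are not a RIMES schema.

```python
# Hypothetical illustration: one security, three distinct "country" fields.
# All names here are invented to show why a single generic "Country Code"
# column cannot serve every consuming team.
security = {
    "ticker": "ABC",
    "country_of_incorporation": "IE",   # what the legal/tax team needs
    "country_of_primary_listing": "GB", # what the trading team needs
    "country_of_risk": "US",            # what the risk team needs
}

def country_for(team: str, record: dict) -> str:
    """Map each consuming team to the country field it actually requires."""
    field_by_team = {
        "legal": "country_of_incorporation",
        "trading": "country_of_primary_listing",
        "risk": "country_of_risk",
    }
    return record[field_by_team[team]]

# The same security yields three different answers depending on the consumer:
print(country_for("legal", security))    # IE
print(country_for("trading", security))  # GB
print(country_for("risk", security))     # US
```

Picking the wrong mapping here is exactly the commercial, regulatory or reputational risk described above; delivering each system only the field it needs removes the ambiguity.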
With MDS, we provide the specific fields required by each system, removing the need to maintain a costly, inflexible centralized data source such as an EDM and warehouse solution.
2. If we got rid of Excel, and therefore the worst tool for proliferating multiple, uncontrolled data sources, wouldn’t our data architectures be simplified?
Excel spreadsheets, usually maintained by individuals and outside official monitoring processes, are a risk when used for critical data management processes. We provide a range of services to remove dependency on Excel.
When it comes to alternative data such as ESG research and ratings, our APIs for Python, R and MATLAB provide powerful tools to access and analyze market, alternative and index data from over 350 providers (subject to relevant licenses). We build custom databases to ensure individual end users only have access to the data they want, when they need it, and new data sets or fields can be quickly added or amended.
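As a flavor of the kind of analysis those APIs replace Excel for, the sketch below averages ESG scores by sector. It is a minimal, hypothetical example: the record layout and scores are invented, and in practice the records would arrive via the licensed API rather than a hard-coded stub.

```python
# Minimal sketch, assuming records arrive as dicts with "sector" and
# "esg_score" keys; the data below is an invented stub, not real ratings.
from statistics import mean

def average_esg_by_sector(records):
    """Group ESG scores by sector and return the mean score per sector."""
    by_sector = {}
    for r in records:
        by_sector.setdefault(r["sector"], []).append(r["esg_score"])
    return {sector: mean(scores) for sector, scores in by_sector.items()}

# Stub standing in for an API response:
sample = [
    {"ticker": "AAA", "sector": "Energy", "esg_score": 61.0},
    {"ticker": "BBB", "sector": "Energy", "esg_score": 55.0},
    {"ticker": "CCC", "sector": "Utilities", "esg_score": 72.0},
]
print(average_esg_by_sector(sample))  # {'Energy': 58.0, 'Utilities': 72.0}
```

Doing this in code rather than a personal spreadsheet keeps the logic versioned, reviewable and inside official monitoring processes.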
Another common area where Excel is still used is the management of multi-provider blended or composite indices. We can manage these indices on our clients’ behalf, with the added benefits of robust license checking and expert advice on handling complex requirements, such as which weights or returns to use, hedged components, and exchange holidays.
Anywhere market, index or alternative data is being routinely manipulated in Excel, we can deliver a more robust, cost effective, transparent managed data solution.
3. If data management is a never-ending process excellence exercise, can you ever hope to get a positive ROI from it?
Calculating the cost of data management must include not just data license or data management team costs, but all the costs of:
• managing, maintaining, validating and remediating the data;
• licensing, installing and maintaining data management systems; and
• business team time spent on data management, plus the operational cost of time spent away from their primary roles.
Once all those costs are taken into consideration, any investment that reduces the net cost returns a positive ROI. Improvement in data quality should be inherent in that saving as it reduces time spent on remediation.
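The arithmetic behind that claim can be made concrete. All figures below are invented purely to illustrate the calculation; they are not actual costs or RIMES pricing.

```python
# Illustrative arithmetic only: every figure here is invented to show how
# the total cost of data management is assembled from its components.
def total_data_cost(license_fees, system_costs, ops_team_cost, business_time_cost):
    """Sum the full annual cost of data management, per the bullets above."""
    return license_fees + system_costs + ops_team_cost + business_time_cost

# Hypothetical in-house operation vs. the same firm using a managed service
# (data license fees are assumed unchanged; the other components shrink).
in_house = total_data_cost(1_000_000, 600_000, 800_000, 400_000)  # 2,800,000
managed  = total_data_cost(1_000_000, 150_000, 200_000, 100_000)  # 1,450,000

annual_saving = in_house - managed
print(annual_saving)  # 1350000
```

The point is not the specific numbers but the accounting: ROI looks negative only when the system and business-time components are left off the in-house side of the ledger.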
Where data is used to drive processes or analysis, it will always be a cost. But with RIMES MDS that cost can be minimized, allowing the firm to extract maximum value from its core financial services activities. MDS has been shown to deliver a positive ROI within months of delivery, as the total cost of data management within the firm drops away whilst data quality demonstrably improves.
4. Do you have to undertake that process of “rewiring the bank” and getting data in order before you can make full use of third party vendors in this space?
As the quantity and complexity of data increases, especially alternative data, early movers are finding themselves investing heavily in building the systems and processes to store, maintain and provide access to it. In other words, they are “rewiring the bank”. Whilst the need for part of this data is mandated, a significant proportion is still experimental in terms of its use in increasing investment returns, and the costs sunk into these projects are therefore high risk.
At RIMES, we store data from over 350 providers; our clients can access the data they need, subject to license, directly from customized databases built to their specifications, or via daily feeds into the relevant systems. The investment required to ‘rewire the bank’ can instead be directed to the resources needed to analyze and profit from the data, rather than to storing and maintaining it. And if the data does not provide the returns you hoped for, at least you do not need to write off an investment in the data management systems built to manage it.
5. Do you differentiate between the data quality standards of data viewed as more critical for operations vs. less important? How do you decide?
A challenge for an operations team is that maintaining data quality across all data sets may require the in-depth knowledge of the business teams that consume the data; the business, however, does not want to spend its time remediating data quality issues. At RIMES, our experience and expertise are dedicated to ensuring data quality as the core part of our service. By providing quality data that is fit for purpose and system-ready for each business, we remove the pressure on the operations team to maintain data quality standards and free the business units to concentrate on their core roles.
About Diarmuid O’Donovan
Diarmuid O’Donovan joined RIMES in 2017, having spent his entire career in the securities services and asset management industry. Prior to RIMES, Diarmuid was Chief Data Officer at Legal & General Investment Management (LGIM), responsible for delivering an enterprise-wide information management capability. Before that, he was Global Head of Data at UBS Asset Management, where he was a senior member of the middle- and back-office management team responsible for pricing, benchmarks, corporate actions and data management. He has also held several senior roles during a 10-year career at JPMorgan.
The content provided in these articles is intended solely for general information purposes, and is provided with the understanding that the authors and publishers are not herein engaged in rendering regulatory or other professional advice or services. Consequently, any use of this information should be done only in consultation with qualified legal counsel. The information in these articles was posted with reasonable care and attention. However, it is possible that some information in these articles is incomplete, incorrect, or inapplicable to particular circumstances or conditions. We do not accept liability for direct or indirect losses resulting from using, relying on, or acting upon information in these articles.