Outsourcing has a long pedigree in investment management, having been used for years to service a range of back- and middle-office functions. Today, firms are extending outsourcing into areas such as trading and execution and the development of artificial intelligence operations. However, one area has historically been overlooked as a candidate for outsourcing: operational data management.
In many ways this omission is understandable. To outsource effectively, a business must first understand what good looks like for the function in question. This knowledge is the bedrock of successful outsourcing as it allows firms to design effective Service Level Agreements (SLAs).
In the past, when data was siloed within firms, it was impossible to gain visibility into data usage, which in turn made it impossible to draw up SLAs. That many firms can now do so is largely thanks to a decade's worth of investment in Enterprise Data Management platforms.
These platforms have been used to centralize data governance and unite what was previously a siloed data estate. This work has given Chief Data Officers (CDOs) and the data management function a clear view of what good quality data management processes and systems should look like.
However, over time the platforms used to centralize data governance have proved inflexible, time-consuming to configure and unable to support exponential data growth. Once important enablers, they have become barriers to positive change.
The conditions are therefore ripe for outsourcing data management: CDOs and data managers now know what they need to outsource and how to put effective SLAs in place, and they have a compelling case for doing so.
In a world where data insights increasingly define success, firms need to ensure they have in place a lean, agile, high-quality data foundation – one that improves business outcomes, enhances adaptability and eliminates waste. It’s little wonder that nearly half of asset managers are considering outsourcing their data management within the next two years.
Diarmuid O’Donovan, Chief Operating Officer at RIMES comments: “Combined with the maturity of cloud technologies and the recent embrace of distributed working, the case for sourcing managed data services from specialist strategic partners has never been stronger.
“The next step is for CDOs to make the case to the business. This can be done by delineating core from noncore data processes, weighing the total cost of continuing to invest in technologies that constrain data against a managed service transformation, and focusing on the value-add that comes from partnering with experts in their field.
“What’s clear is that firms that move first in the adoption of lean managed data services will be set to thrive in the new age of data-driven financial services.”
To download a copy of RIMES’ new eBook: “How Chief Data Officers Succeed in the Data Driven Age”, click here.