Asset Managers Struggling to Exploit Full Value from Data, Finds Survey

Responsive and robust data management strategies are proving crucial to asset management success, according to new research findings released by Northern Trust.

High quality data that can provide actionable insights is central to asset managers’ key focus areas: improving investment decisions, assessing and managing risks, meeting client service expectations and satisfying growing regulatory demands.

Yet the survey found many asset managers, “confronted with a deluge of new information arriving in a variety of different formats,” are suffering from an embarrassment of data riches. More than a quarter (28%) of respondents said data inundation made it hard to determine what was useful. Only 13% claimed they successfully capture the full value from their data, while 85% said they do so only fairly or somewhat well.

The root problem, noted Northern Trust, is that “asset managers acquire data from providers that have collected it for various purposes using many different methods.”

And that has a cost: 36% of respondents complained that data required significant scrubbing or processing, while 29% said data was presented in non-compatible formats. As a result, asset managers are forced to devote significant resources to “processing and scrubbing purchased data, dealing with incompatible formats and separating useful from non-useful data.”

Data quality is another problem, with one in five respondents stating that incorrect data is one of their top challenges.

As the volume, velocity and variety of data that asset managers must handle continue to grow, the report notes, a flexible data strategy will be increasingly critical, enabling firms to respond to opportunities and to tackle the challenges of increased regulation, competition and cost pressures.

To this end, Northern Trust sets out a four-step “best practice” solution:

  1. Aggregation: Capturing data from multiple sources, centralizing the data and providing a presentation and data-delivery layer.
  2. Normalization: Translating and formatting data from each source consistently, enriching the data by sourcing missing values.
  3. Verification: Validating the accuracy of received data through reconciliations, overriding valuations where needed, running data-quality checks and resolving any issues or discrepancies.
  4. Integration: Integrating systems, business processes and data-delivery formats to enable seamless delivery to external systems and to make the aggregated data source the “golden copy” of data for counterparties.
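The four steps above can be sketched in code. The snippet below is a minimal illustration of the aggregation, normalization, verification and integration flow; the vendor schemas, field names and price conventions are hypothetical assumptions for the example, not Northern Trust's or any real provider's formats.

```python
# Hypothetical vendor feeds: assumed schemas for illustration only.
# Source A quotes prices as decimal strings with ISO dates; source B
# quotes integer cents with DD/MM/YYYY dates.
SOURCE_A = [{"isin": "US0378331005", "px": "189.84", "dt": "2024-03-28"}]
SOURCE_B = [{"id": "US0378331005", "price_cents": 18984, "date": "28/03/2024"}]

def normalize_a(rec):
    """Translate source A's schema into the common format."""
    return {"isin": rec["isin"], "price": float(rec["px"]), "date": rec["dt"]}

def normalize_b(rec):
    """Translate source B's schema (cents, DD/MM/YYYY) into the common format."""
    d, m, y = rec["date"].split("/")
    return {"isin": rec["id"], "price": rec["price_cents"] / 100,
            "date": f"{y}-{m}-{d}"}

def verify(records, tolerance=0.01):
    """Reconcile prices for the same instrument and date across sources."""
    issues, seen = [], {}
    for rec in records:
        key = (rec["isin"], rec["date"])
        if key in seen and abs(seen[key] - rec["price"]) > tolerance:
            issues.append(key)  # discrepancy to resolve before publishing
        seen.setdefault(key, rec["price"])
    return issues

def build_golden_copy(sources_with_normalizers):
    # 1. Aggregation: pull records from every source into one place.
    normalized = []
    for records, normalize in sources_with_normalizers:
        # 2. Normalization: translate each source into the common schema.
        normalized.extend(normalize(r) for r in records)
    # 3. Verification: run data-quality checks across sources.
    discrepancies = verify(normalized)
    # 4. Integration: keep one record per (isin, date) key -- the "golden copy"
    #    that downstream systems and counterparties consume.
    golden = {(r["isin"], r["date"]): r for r in normalized}
    return golden, discrepancies

golden, issues = build_golden_copy([(SOURCE_A, normalize_a),
                                    (SOURCE_B, normalize_b)])
```

In a real deployment each step would be far richer (enrichment of missing values, valuation overrides, delivery adapters), but the shape is the same: many inconsistent inputs funnel through a common schema into a single verified record per instrument.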

For asset managers, developing such best-practice data management capabilities in-house is no easy task. Efficiently collecting, normalizing, storing and distributing the data they need takes substantial resources – resources that can be more effectively deployed on their true core competencies.

But there is an alternative. By leveraging a professional managed data service, asset managers can reduce the total cost of ownership of their data, while significantly improving both data quality and responsiveness.

At RIMES we can help you measure the potential ROI of a Managed Data Service for your organization. Read more here.

The content provided in these articles is intended solely for general information purposes, and is provided with the understanding that the authors and publishers are not herein engaged in rendering regulatory or other professional advice or services.  Consequently, any use of this information should be done only in consultation with qualified legal counsel.  The information in these articles was posted with reasonable care and attention.  However, it is possible that some information in these articles is incomplete, incorrect, or inapplicable to particular circumstances or conditions. We do not accept liability for direct or indirect losses resulting from using, relying or acting upon information in these articles.
