
RIMES, Invesco, GAM and Catalyst Join Forces to Tame the Data Monster

On September 25, RIMES convened a webinar to discuss what asset managers can do to achieve the Holy Grail of data management: keeping costs in check while maintaining growth. Giles Arbuthnott, Product Manager at RIMES, was joined by Naomi Clarke, Head of Data at GAM; Craig Pfeiffer, Data Integrity Manager at Invesco; and Jonathan Hammond, Partner at research firm Catalyst, to provide insights into how firms can successfully ‘tame the data monster’.

The webinar follows hot on the heels of the 2018 RIMES Client Survey, co-authored by Catalyst, which highlighted that managing costs while maintaining data quality remains the biggest priority for data teams at financial sector firms. During the webinar, attendees were also asked for their views on the most challenging aspect of managing market data. The results of the poll were categorical: 50% of respondents cited licensing costs as their chief bugbear, followed by timely processing and delivery (23%), quality control (18%) and usage monitoring and control (8%).

One of the key topics discussed during the session was the role of product complexity in increasing data management costs. New and complex asset types such as derivatives and swaps demand more varied indices and greater volumes of data. Not only does this have implications for licensing costs, it also increases the risk of data errors and, with them, reputational damage: firms can ill afford to publish incorrect returns.

One of the dangers raised by the panel was that firms might be tempted to add ‘bolt-ons’ to their data management systems and engineer workarounds to solve these challenges. This approach serves only to increase complexity, however, raising risk as organizations find it harder to maintain control of their data estate. And while highly centralized data architectures are good for maintaining control in the short term, they lack the agility required to build a system that is fit for long-term needs.

The panel then looked at the many costs, both obvious and hidden, associated with data management. When it comes to upfront costs, licensing is, of course, a key consideration, but many of the hidden costs of data management are equally important. There are several significant direct costs, too: one is maintaining a large in-house data management team, but there are also lesser-known costs around the work that performance and portfolio management teams often have to undertake to ensure data is fit for purpose.

The panel agreed that the best approach for firms is to rationalize their operating model. This involves reducing the number of systems and the reconciliations between them, while also taking account of the people involved in data projects, as teams established for specific projects can often consume as much resource as those required for ‘business-as-usual’ operations.

The discussion moved on to the relative merits of firms managing their own data versus outsourcing data management. The consensus was that a hybrid model is the best approach, whereby data management and governance are provided through managed services that in-house teams can add value to and oversee. The point was made that it is how data is used, not how it is managed, that provides a competitive advantage; data management itself is therefore well suited to delivery by third parties. In fact, the panel agreed that data vendors are essential in helping firms achieve scale.

Finally, the panel addressed the steps that firms can take to reduce data costs. While organizations are increasingly looking to create their own indices, they also need to reduce the costs associated with data licences. This means ensuring that the projected profits of a new product justify the expense of the requisite data licences, and following good data management practices such as re-using data, preventing incorrect data from reaching front-end systems and ensuring that different teams are not buying the same sets of data (see the sketch below). Outsourcing data management also helps, as firms are able to continually update their data universe without costs spiralling out of control, a benefit that is extending beyond index data to other areas such as reference data and security identifier management.
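To make the panel's last point concrete, here is a minimal, purely illustrative Python sketch of two of those practices: spotting feeds that more than one team is paying for, and holding back implausible index returns before they reach front-end systems. The feed names, team names and the 25% threshold are hypothetical assumptions for the example, not RIMES functionality or recommended settings.

```python
from collections import defaultdict

# Hypothetical (team, data feed) subscriptions, for illustration only.
subscriptions = [
    ("Performance", "Global Equity Index - Daily Returns"),
    ("Portfolio Management", "Global Equity Index - Daily Returns"),
    ("Risk", "Corporate Bond Benchmark - Monthly"),
]

# Flag any feed bought by more than one team, a common source of duplicate licence spend.
teams_by_feed = defaultdict(set)
for team, feed in subscriptions:
    teams_by_feed[feed].add(team)

for feed, teams in teams_by_feed.items():
    if len(teams) > 1:
        print(f"Duplicate subscription: '{feed}' is licensed by {sorted(teams)}")

# Simple quality gate: hold back daily returns outside a plausible band before
# they are published to front-end systems. The 25% band is an arbitrary example.
def is_plausible_daily_return(value: float, limit: float = 0.25) -> bool:
    return -limit <= value <= limit

incoming_returns = {"Global Equity Index": 0.012, "Corporate Bond Benchmark": 1.85}
publish = {k: v for k, v in incoming_returns.items() if is_plausible_daily_return(v)}
review = {k: v for k, v in incoming_returns.items() if not is_plausible_daily_return(v)}
print("Publish:", publish)
print("Hold for review:", review)
```

In practice these checks would sit inside a data governance workflow rather than a standalone script, but even this level of automation shows how re-use and validation can be enforced systematically rather than left to individual teams.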

Watch the full video below.


