Managed data services provider RIMES recently brought its industry round tables to Los Angeles, where buy-side representatives got a chance to give their two cents on the data management challenges they face.
At this round table, individuals working for investment managers in functions such as index solutions, business analysis and market data services provided their input. Some pointed out that many institutions are facing the dual pressures of data quality and high velocity, and these two objectives can easily conflict. Others noted how regulations impact their IT processes, and some weighed in on existing data architectures.
In addition to gathering the input of buy-side participants, the Los Angeles round table also presented key takeaways from a strategic advisory session held in New York City in May by Gartner, one of the world’s leading information technology research and advisory firms. Mary Knox, Gartner’s research director of Banking and Investment Services, covered various matters that are crucial to data management.
Velocity and its impact on data quality
One major concern identified during the round table was the consistent emphasis on velocity, and how this need for speed can compromise data quality. Many participants recounted instances in which coworkers requested information and then promptly stated they needed it yesterday.
This focus on delivering data quickly can create challenges elsewhere, a senior IT executive working for an investment manager noted.
“We moved within the last four to five years to an internally developed security reference data system,” he stated. “At the same time, in the last few years, we have moved to pushing the compliance checks up earlier and earlier in the process – from post-trade to pre-trade, to now actually embedding them into the trading systems to help with allocation and investing decisions.”
“The problem we run into is new instruments. The rest of it is mostly straightforward,” he continued. “Nobody wants to wait to clear a trade, but if we buy a trade with incorrect information, we pay a price for that. So there is this balance between how quickly you release the data to allow people to move on, versus holding up a trade, versus making sure you get it right.”
Regulation a key driver
Los Angeles participants got a chance to comment on how pending regulations – including Solvency II and the Foreign Account Tax Compliance Act – can impact their information decisions.
The senior IT executive who spoke earlier weighed in again, emphasizing that these new restrictions are affecting various facets of his business.
“It’s a driver,” he stated. “It’s not really pushing our demands only in terms of reference data – settlements, operations and trading activity more so than just in market data.”
Data architecture a key consideration
Another key consideration for buy-side firms is what type of data architecture they will use. Figuring this out requires reviewing various approaches, carefully assessing their costs and benefits, and then determining which one is the best fit.
During the round table, the moderator noted that many buy-side firms are moving toward centralization. They want to get all their key information in one place, and then figure out what they will do with it. While this approach may sound like the easiest route, various participants in the LA round table emphasized the associated challenges.
One attendee, who works for the market data services group of a major investment manager, noted the difficulties her firm had in its efforts to develop a single, centralized system.
“We were trying to create something that was going to meet everyone’s needs, and then it turned into a 65-phased build out,” she said. “I think that they realized they needed to chop it up … they wanted something that was going to have all of our reference data, plus index data, plus this, plus that – but once you started looking at everyone’s needs, it didn’t make sense.”
Another participant, who is a business analyst for a mid-sized investment manager, also pointed out the challenges of finding a single solution that fits everyone’s needs. Previously, his company had everything decentralized. However, the buy-side firm has moved toward consolidation since then.
“Even after the past two or three years, we have continued to find little pieces of an Access database here and there, and we have converted these as they came up to a centralized place,” he said. “Just having a centralized solution is not necessarily going to meet everyone’s needs, given everyone’s user input.”
Buy-side firms that are looking for the optimal data architecture might consider using a hybrid setup. This entails working with a multitude of systems – whether they are internal or external – instead of keeping an organization’s information in one place.
By using this approach, buy-side firms can spread their data out, and therefore circumvent the various challenges that can arise by drawing upon a centralized source.
When deciding upon a particular data architecture, institutions have many variables to consider. One way they might be able to expedite this process, and make the most informed decision about their infrastructure, is to get an in-depth evaluation from a professional services firm like RIMES. Buy-side firms interested in conducting the proper due diligence should seriously consider taking this route.