Managed data services provider RIMES recently held a strategic advisory session in New York, where several C-level and other senior executives from some of the world’s largest investment managers discussed industry best practices.
Gartner expert presents key information
Mary Knox, research director of Banking and Investment Services at global consulting firm Gartner, delivered the keynote. Given her specialty in data management and data governance, RIMES brought Knox in to discuss reinventing industry best practices.
Steve Cheng, global head of data management solutions at RIMES, helped moderate the discussion.
Session speaks to data management challenges
As Knox outlined the variables impacting industry best practices, participants chimed in, giving some insight into the challenges they are encountering. Some noted the difficulties they are facing in their efforts to effectively leverage information, while others mentioned problems related to data storage.
Knox asserted that data management best practices are frequently based on outdated assumptions. The market expert listed several of these common beliefs, including:
- More data is always better
- Data is a finite resource, so we need to hold on to every piece of it
- We control/own our data
- There is one right answer to a question, and we know who our users are and what they are trying to do with the information involved
A person responsible for data and vendor management at a New York-based investment manager pointed out the problems with the fourth assumption, elaborating on the specific challenges his company faced when developing its investments platform.
“One part that resonated with me as we were building out the platform was the user diversity,” he said. “We have a risk area with needs that are very different from accounting, versus credit research.”
“My job is to identify areas where I can bring in a standard and create centralization,” the representative added. “After that, we develop one-offs. It’s interesting that you brought this matter up, because it’s one of the areas where I’m facing challenges right now.”
The current situation
After listing these common assumptions, Knox cited examples of why they no longer hold true, including:
- The current environment is highly unstable, and any shifts affect how market participants manage data.
- Due to the rise of big data, the amount of available information has increased greatly. As a result, data is no longer a finite resource.
- The range of users has widened and is no longer finite; indeed, firms may not even know who all of their users are. The number of data alternatives available to serve them has also risen.
- Investment complexity has increased, as financial institutions come up with more and different products.
Knox asserted that all these variables combine to make existing best practices unsustainable. The market expert noted that many buy-side firms currently take a fragmented approach to managing data. More specifically, she said that institutions frequently have:
- One staff member worried about risk
- Another employee focused on compliance
- A separate individual looking at other aspects of big data
Three key factors
The market expert listed three important factors that she wants buy-side firms to focus on.
- User diversity: Knox noted that when companies treat all user types as the same, they end up trying to create a single version of the truth. She emphasized that staff working in the back and front offices, across internal departments, can all be harnessing data, as can staff at regulators.
- Velocity: Business is moving quickly, and companies have to adapt to this rapid pace. Data is also moving in a very fast manner.
- Complex Relationships: This involves understanding the many different financial instruments that exist, the intricate nature of the markets and also all the separate types of legal entities that are around.
Data storage methods
Knox also spoke to the various architectures that companies can use to store their crucial information. Much of the available thought leadership focuses on whether buy-side firms should centralize their information or spread it out.
The RIMES strategic advisory session took a different approach, discussing the costs and benefits of using a hybrid setup. Hybrid architectures spread data out over various resources, whether they are in-house or external.
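As a rough sketch of the hybrid idea (the store names and routing policy below are illustrative assumptions, not anything described at the session), a hybrid setup can be modeled as a simple router that keeps some datasets in house while placing others with an external managed service:

```python
# Illustrative sketch of a hybrid data architecture: each dataset is
# routed to one of several storage back ends, in-house or external.
# All names and routing rules here are hypothetical.

class InHouseStore:
    """Represents a firm's own database or warehouse."""
    def __init__(self):
        self.data = {}

    def put(self, key, value):
        self.data[key] = value

    def get(self, key):
        return self.data[key]

class ManagedServiceStore(InHouseStore):
    """Represents an external managed data service; same interface."""
    pass

class HybridArchitecture:
    """Routes each dataset to a back end based on a simple policy."""
    def __init__(self):
        self.backends = {
            "in_house": InHouseStore(),
            "managed": ManagedServiceStore(),
        }
        # Hypothetical policy: proprietary position data stays in house;
        # vendor benchmark data lives with the managed service.
        self.policy = {"positions": "in_house", "benchmarks": "managed"}

    def store(self, dataset, key, value):
        self.backends[self.policy[dataset]].put((dataset, key), value)

    def fetch(self, dataset, key):
        return self.backends[self.policy[dataset]].get((dataset, key))

hybrid = HybridArchitecture()
hybrid.store("positions", "fund_a", {"AAPL": 100})
hybrid.store("benchmarks", "index_x", [0.01, 0.02])
```

The point of the sketch is that consumers address one interface while the data itself is spread over in-house and external resources, which is what distinguishes the hybrid model from a purely centralized or purely distributed one.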
Participants speak to hybrid architecture
Several round table participants elaborated on their data storage challenges. For example, a representative of a major global investment manager noted the difficulties her company has encountered by drawing upon a single data source. She asked what could be done to keep all the relevant information synchronized.
“When we had a centralized data source, all the different users had their own specific use,” the representative said. “They were pulling the information into their own database and transforming it.”
“When we asked our downstream users the same question, we would get different answers,” she added. “We are now looking at using a centralized model again, but harnessing web services to keep it synchronized … another way to go back to centralized data storage is to maintain very strong data lineage.”
One participant in the strategic advisory session asked the representative of the major global investment manager to elaborate on how she will use web services as an alternative. She described the different data towers she is setting up and noted that her organization currently has no method for tracking data transformation.
“We’re currently building them out, so nothing is working 100 percent. In asset management, we have these core data towers: asset master data, product, party, account, etc. We are building out an index and benchmarks core data tower. Because of the centralized model we used in the past, it was not supported the way it should have been. The data was transformed so much downstream that there are a million different versions of the truth.
“It’s not that one is right or wrong; it’s that we don’t know how they got to that conclusion, because there is no data lineage. What we are looking to do is build a data fabric that would sit on top of the core data towers. Downstream consumers such as risk or performance would ask, ‘What is my risk for XYZ?’ and the services would connect the dots over the core data towers and return the answer in a service format.
“It promotes synchronization. There will probably be some storage downstream, but you would have to go to the golden source. It’s about building the logic of transforming that data downstream into that logical service, so you understand why risk was changing the hierarchy of the account to XYZ.”
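One way to picture the data fabric described above (the tower names, query shape and lineage record are illustrative assumptions, not the firm's actual design) is a thin service layer that answers a consumer's question by connecting the dots across the core data towers, while recording the lineage of every lookup so it is always clear how an answer was derived:

```python
# Illustrative sketch of a data fabric sitting on top of core data towers.
# A downstream consumer asks a question; the fabric joins records across
# towers, returns the answer in service form, and records data lineage.
# All names here are hypothetical.

class DataTower:
    """A core data tower (e.g. asset master, product, party/account)."""
    def __init__(self, name, records):
        self.name = name
        self.records = records  # key -> value

class DataFabric:
    def __init__(self, towers):
        self.towers = {t.name: t for t in towers}
        self.lineage = []  # audit trail of (consumer, tower, key) lookups

    def lookup(self, consumer, tower_name, key):
        value = self.towers[tower_name].records[key]
        self.lineage.append((consumer, tower_name, key))
        return value

    def risk_for(self, consumer, account):
        # Connect the dots: account -> instrument -> risk weight.
        instrument = self.lookup(consumer, "account", account)
        weight = self.lookup(consumer, "asset_master", instrument)
        return weight

fabric = DataFabric([
    DataTower("account", {"ACC-1": "BOND-XYZ"}),
    DataTower("asset_master", {"BOND-XYZ": 0.2}),
])
risk = fabric.risk_for("risk_team", "ACC-1")  # 0.2, with lineage recorded
```

Because every answer passes through the fabric, the lineage log can later explain why two consumers reached different conclusions, which is exactly the gap the representative described.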
A person responsible for data operations at a major global investment manager noted her company is working on a similar data storage solution.
“We are doing the exact same thing. We are creating a data hub – a bridge between the producer community and the consumer community,” she said.
“The example we use is simplified: why go to the fish market, the vegetable market, the meat market and the fruit stand when you can go to the supermarket? We are looking to create that supermarket so people can visit the one-stop shop,” the individual added.
“This way, we can then put the onus on them,” the participant continued. “If governance and the policy say you must consume from the golden source, if they want to transform it or get their information from elsewhere, it is their responsibility to explain why. When we had to do that ourselves, it was very time consuming. We are a very small team.”
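The governance policy she describes can be sketched as a hub that serves only golden-source records and requires any consumer who transforms what they take to log a reason, shifting the onus onto them (a hypothetical sketch under assumed names, not the firm's implementation):

```python
# Illustrative sketch of a data hub ("supermarket") between producers and
# consumers. Consumers must take data from the golden source; if they
# transform it, they must record why. All names are hypothetical.

class DataHub:
    def __init__(self):
        self.golden_source = {}   # producer-published records
        self.transform_log = []   # (consumer, key, reason)

    def publish(self, key, value):
        """Producers publish once to the hub."""
        self.golden_source[key] = value

    def consume(self, consumer, key, transform=None, reason=None):
        """Consumers shop at the one-stop hub. Any transformation must
        be accompanied by a documented reason, which is logged."""
        value = self.golden_source[key]
        if transform is not None:
            if not reason:
                raise ValueError("transforms require a documented reason")
            self.transform_log.append((consumer, key, reason))
            value = transform(value)
        return value

hub = DataHub()
hub.publish("price.AAPL", 190.0)
raw = hub.consume("accounting", "price.AAPL")
adjusted = hub.consume(
    "risk", "price.AAPL",
    transform=lambda p: round(p * 1.02, 2),
    reason="stress scenario +2%",
)
```

The small team's burden is lifted because the explanation for any deviation from the golden source is captured at the point of consumption, rather than reconstructed after the fact.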
Key importance of leveraging staff
The individual who works in data operations for the major global investment manager provided one more insight that buy-side institutions can use. She noted the value of spreading out projects over many different people, and harnessing their contributions.
“My biggest lesson learned would be don’t try to do everything yourself,” she stated. “What I’ve found is that it makes sense to have an architecture team that does lineage. It makes sense to have a standards team that creates standards. It makes sense to have a policy team that writes policy.
“It is important to keep in mind that there is a technology team out there that has business analysts and portfolio managers. There are people who are subject matter experts,” the representative added. “What I’m getting at is: leverage them. Don’t do other people’s jobs, or you won’t be able to focus on your own.”
What institutions can do to address their challenges
The strategic advisory session illustrated the challenges that buy-side firms face when determining how they will store and manage their data. They can hold their information in one central location or alternatively spread it out.
Institutions can also leverage a hybrid infrastructure. This approach involves storing data in multiple places, whether that involves in-house solutions or working with a managed services provider. For buy-side firms, this is frequently the most viable option, as it can help them overcome the challenges that come along with relying on fixed centralized infrastructures.
Organizations have many variables to consider when determining which approach they will use. Companies looking to decide might lean on the three major concerns noted by Knox: user diversity, velocity and complex relationships.
Buy-side institutions that want to conduct thorough due diligence when figuring out their data storage might consider working with a professional services provider like RIMES. By doing so, a company can quickly leverage broad industry expertise and gain access to proven methodologies.