During the recent RIMES Forum held in Boston, representatives of investment managers and custodian banks got a chance to share their views on some of the data challenges they are facing.
Many of them spoke to the difficulties they have been encountering as a result of the varying priorities of business users and those working in IT. Others mentioned the pressures they encounter in their efforts to deliver significant amounts of data at high velocity.
Gartner, the world’s leading technology research and advisory firm, has taken a unique approach to this situation, challenging assumptions fundamental to existing data management best practices.
To shed some light on the background supporting this view, RIMES has held numerous round tables across the world. At the recent event in Boston, Mary Knox, director of research for Gartner with a focus on Banking and Investment Services, helped clarify this outlook by giving an interactive presentation.
Knox started out by offering a glimpse into the current data management landscape, stating that current best practices may be ineffective since they are based on outdated assumptions, including:
- More data is always better, and data is a finite resource
- Organizations control their data
- There is a single version of the truth
- Users understand available data
- We know our users
Knox emphasized that many buy-side firms have more information than they know what to do with, which brings the first assumption into question. In addition, companies are culling more data from external sources, which requires them to surrender some of the control they hold over their information.
Finally, she emphasized that institutions have different user groups, which have varying data needs, and this frequently results in information being formatted to fit their preferences.
Buy-side firms are encountering three core sets of challenges:
- About data: Increases in the volume, velocity and variety of data
- Requirements: Cost, security and regulations
- Opportunities: Complexity and technology
Knox took a few minutes to elaborate on the varying data management difficulties institutions can encounter, and then asked the participants if any of them sounded familiar.
One participant, who works for an investment manager and focuses on Enterprise Data Management initiatives, identified a few challenges that sounded familiar to him.
“A couple of these things really resonate with me,” he said. “The complexity of the data relationships is always a challenge. That requires the business and the technical folks to work really closely together. That’s one thing.”
“The velocity of the data is spot-on,” the attendee added. “Our SLAs to deliver the data are getting shorter and shorter while we are gathering more and more data, so it is a real challenge to deliver that data in a timely way, especially since our sources of data are external entities.”
Another individual, who heads up EDM at the same investment manager, also spoke to the difficulties he has encountered.
“One thing we saw with the diversity is that we are seeing a bigger item that comes up between a portfolio manager and compliance,” he stated. “Sometimes you can see those two at odds, where they want to be able to trade on X, but they can’t because of compliance and regulations. … We are beginning to see more of those in these hierarchies. They are becoming more and more prevalent.”
Consolidating data on a single platform can be difficult, Knox stated, especially when the teams involved have not had to work together in the past. Arriving at a consensus, she said, can certainly be a challenge.
One attendee, who manages the internal consulting practice of an investment manager, mentioned problems in another area, referencing difficulty with regulations, data and availability.
“The timing of the data and the availability of the data is a real challenge for us,” said the individual, who focuses on accounting, performance and data management. “We run several organizations that trade 24 hours a day. We pass the book from region to region, and have multiple end of days, depending on the region, availability, pricing and corporate actions.”
“The other is that from a regulatory standpoint, our organization is somewhat unique in that we have 16 or 17 subsidiaries, and we have to aggregate all that data together every day,” he went on to say. “That is also a major volume challenge for us to get all those data points in and be able to act on it from a regulatory perspective.”
Heterogeneous audiences and their challenges
Many companies have traditionally structured their data using a fixed-data-model approach, and this approach is now being challenged by organizations with increasingly heterogeneous user bases.
For example, many buy-side firms have compliance staff who insist on very high data quality, while employees working in marketing may place far less weight on strict accuracy.
Beyond absolutist approaches to data quality
In contrast to absolutist approaches, a fit-for-purpose view holds that data's quality hinges on how well it serves its intended use. Under this definition, quality may have very little to do with the accuracy or completeness of the information, and one department of an institution may find a given dataset far more useful than another business segment does.
Data quality challenges
After reviewing the aforementioned models, Knox asked participants to weigh in on the data quality challenges they are encountering and what they have been doing to bridge the business-IT gap in terms of what makes data either high quality or fit-for-purpose.
One participant noted the importance of knowing which data actually drives outcomes.
“It’s kind of a challenge to understand what the data has impact to,” he stated. “Let’s say you have a hundred columns of data, and five are key. Those are certainly the ones you should be concerned with.”
“The ones on the right hand side might not be as important,” the individual said. “They may show up somewhere, but they won’t drive the process in the wrong direction, they won’t get a wrong calculation.”
He emphasized that IT staff are not responsible for deciding which information is crucial to a specific use and which data matters less. Determining which data is integral to any given use, he said, is a business conversation.
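As a minimal illustration of that division of responsibility, a business-defined list of key columns could drive automated quality checks that IT implements. This is a sketch only; the column names and validation rules below are hypothetical examples, not anything discussed at the forum.

```python
# Sketch: business stakeholders supply the list of key columns and the
# rules; IT implements the automated checks. All column names and rules
# here are hypothetical.

# Business-defined: only these key columns must pass strict checks.
KEY_COLUMNS = {
    "isin": lambda v: isinstance(v, str) and len(v) == 12,
    "price": lambda v: isinstance(v, (int, float)) and v > 0,
    "currency": lambda v: v in {"USD", "EUR", "GBP", "JPY"},
}

def validate_record(record):
    """Return the key columns that fail their quality rule for one record."""
    failures = []
    for column, rule in KEY_COLUMNS.items():
        if column not in record or not rule(record[column]):
            failures.append(column)
    return failures

record = {
    "isin": "US0378331005",
    "price": 187.5,
    "currency": "USD",
    "analyst_note": None,  # non-key columns are deliberately not checked
}
print(validate_record(record))  # -> []
```

The point of the design is that the "hundred columns" can flow through untouched while the handful the business has flagged as key are checked on every load.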
Data governance and data management
After delving into the challenges that many buy-side firms face in maintaining high standards for data quality, the presentation turned to data governance, data management, and how they intersect. While governance puts specific rules and policies in place for information, management covers stewardship and day-to-day operations.
In addition, Knox took a moment to consider different perceptions of data governance. She stated that many currently view these frameworks as a means of maintaining control, treating governance as something closer to a data vault.
One participant emphasized that many are confused about what place data governance has in organizations.
“Here is everyone’s question,” he started. “‘It seems like you guys are on an island. It seems like you guys are separate from operations. You influence them, you influence business and you influence tech, but you are kind of like your own entity.’ There are always questions like where does that fit.”
Knox suggested a different approach, where buy-side staff think of data governance as a means of openly accessing information.
How buy-side firms can prepare
Institutions currently face diverse challenges, and there are many ways they can use data management and data governance to overcome them. One way buy-side firms can increase their chances of using data to meet their business objectives is by speaking with RIMES Technologies.
By doing so, they can leverage the expertise of a managed services provider to determine the optimal way to reach their objectives.