The RIMES Forum at FIMA – London

At the recent FIMA conference in London, RIMES Technologies held a roundtable, giving participants a chance to weigh in on the data management and data governance difficulties they are encountering. Market data, data management and operational staff from buy-side firms in the area all attended this event.

The moderator provided a presentation on data management and data governance, and also prompted the audience for responses. He drew heavily upon research conducted by Gartner, the world’s leading technology research and advisory firm. 

While many market participants are relatively new to data management best practices, Gartner has opted to challenge some of the ad-hoc guidelines that vary from firm to firm. Gartner and RIMES have worked together at several RIMES forums in Europe and North America. 

Traditional assumptions

Gartner challenges current data management best practices by arguing their underlying assumptions are outdated. These beliefs include:

  • Data is a finite resource
  • There is always one right answer
  • We can understand the data
  • We can control the data
  • More data is always better

Gartner has asserted that these assumptions must be challenged, since data management certainly isn’t becoming any less difficult. In fact, it is getting more complicated, at least in some areas. Gartner outlines three specific reasons why traditional best practices will not continue to work. 

  • Ever-increasing data volumes: Companies are working with an ever-growing number of sources, and the detail of their information seems to be constantly increasing. This proliferation of data is happening at the same time that many institutions are struggling to successfully manage their costs. 
  • Growing regulatory pressures: Buy-side firms are coping with an environment that is fraught with uncertainty. Many regulators seem to be collecting significant amounts of information, which leaves the financial institutions supplying it open to difficulty further down the line. 
  • Rising data use: Not only are institutions receiving more data, but users are placing greater strain on this information. In addition to the more traditional needs of the front office, as well as risk and performance, compliance and client reporting are also demanding this information. 

A few participants weighed in, pointing out the challenges that have cropped up. 

“I don’t think there’s any one of those things we don’t go up against,” stated one attendee. “I mean, we’re very young so in terms of data governance, we don’t really have a problem there, but a lot of stuff we may have a problem without even knowing it.”

“But knowing the scope of data governance, whether it’s data management as a function itself and whether that affects the business or business operations and compliance, that’s where we struggle the most,” he continued. “Do we own that data? Do we have principles? Do we impose things on those areas or do we have a governance structure to manage that? That’s where our pinch points are.” 

A few other people at the roundtable mentioned some problems they were encountering with legacy systems. 

“I wish from our perspective that we didn’t have legacy issues,” stated one participant. “We migrated operations from London to Edinburgh, and in that we’ve inherited a lot of legacy challenges.”

Key assumptions

The presentation next moved to the three key assumptions that Gartner has identified:

  • There is a single version of the truth: Gartner does not support this view, given the diverse range of users harnessing buy-side data. 
  • Singular approaches are the way to go: Gartner asserts they are not, citing the many problems surrounding the volume and velocity of data, as well as time to market. 
  • Data Management and Data Governance are the same: “Understanding the difference between these two can be challenging,” the moderator stated. “There are a lot of complex relationships between instruments, markets and legal entities,” he added. 

Heterogeneity and data quality

The forum next turned to the question of how homogeneous a buy-side firm’s user base really is. While many assume these user bases are uniform, they are frequently quite diverse.

While many would define data quality in terms of accuracy and completeness, Gartner argues that quality centers on how fit-for-purpose the information is. Readying this data for users is a business issue, the moderator emphasized, noting that information quality might matter far more to the middle and back offices than to the front office. 

Participants gave their two cents on these matters, with one speaking to the difficulties his buy-side firm has faced in terms of achieving a unified push toward having one version of the truth. 

“That’s the issue that we find is that the Golden Copy is the way to go forward, defining that and storing it. The problem that we have is dividing it,” he said. “If we want to go back to the Golden Copy, I think that historically, people have siloed and done their own manipulation of data which has given their portfolio or their risk area a preferred view. It’s that consistency.” 

“…We are in the process of building a golden hub, but I know for a fact that once it goes live, there will still be maybe 20 percent of our business across three or four different satellites that will still go and manipulate their own data and come back to the data management team to question this data,” the attendee added. 

The participant emphasized that educating staff was crucial, and that teaching them about the desired outcome was important. He noted that many are reluctant to change. 

“It has not been effective in my positive way of thinking now,” the attendee stated.  “All we can do is keep knocking on the door and explaining where we are now, the benefits … I’ve worked at some companies where we put something through, we switch it across and day one it goes completely wrong, whereas in a company where we use that slow approach, that slow burn, we have goals, we have dates, but things can be changed.”

Data Management vs. Data Governance

The moderator emphasized that many confuse data governance and data management. He explained that the former puts the proper information policies and procedures in place, and the latter involves regular practices to ensure that data is used properly.

He also ran a quick check-in with attendees, asking who is responsible for data at their institution: is it a board-level concern, or is it still viewed as an IT/data issue? 

“I think the open view by a lot of people in the business world is that we as data management own that data,” said one participant. “Over the last 18 months, we have been re-educating them through the Chief Data Office that actually no, certain areas of the business own that data.”

“We aren’t just the stewards, we are the guardians of that data,” he continued. “They think that if someone owns the system, they own the data in that system. We are trying to re-educate them around that process.” 

Gartner recommendations

At the end of the forum, the moderator reviewed four recommendations put forth by Gartner.

  • Replace the “either or” approach to data governance and data architecture with “yes, and” 
  • Create an architecture that supports heterogeneity
  • Consider complex relationships, user diversity and velocity when investing in systems
  • Use data governance to increase access to information, rather than limit it

Buy-side firms that want to develop the optimal data governance framework and combine it with strong data management must navigate a complex environment involving many variables. In addition, they must realize that proper data governance takes time, and requires stakeholders to provide sustained investment. 

If institutions want to make this situation easier, one option they have is obtaining an in-depth analysis of their current data management set-up from RIMES. By doing so, they might save themselves significant time and energy and increase the odds of developing the ideal environment for proper index/benchmark and other reference data use.
