The RIMES forum on Benchmark Data Governance, held in Geneva in October 2013, stimulated lively discussion among a broad spectrum of delegates and was an opportunity to consider the results and recommendations of a recent Deloitte study and how the industry has responded.
The Deloitte study, Benchmark and Index Data Management and Related Costs, anticipates both increasing demand for benchmark data and greater complexity in its breadth and detail. Its aim is to help firms achieve greater accuracy and compliance while managing cost. The research was carried out among 13 companies in Europe, and RIMES has been permitted to use it in its forums. This was also the first time Deloitte had addressed data management for benchmarks and indexes, which, while increasingly important, has been under-researched.
Increasing regulation demands that firms better understand where data sits and who is using it as well as the more obscure costs associated with its use. Deloitte sought to discover who is using this data, what is being done to ensure compliance and what decommissioning processes are in place. The study also addresses the issues of full-time equivalents (FTEs) and the channels through which firms receive data, i.e. their sources.
Deloitte identifies two distinct processes: data processing, comprising collection, validation, transformation, storage and distribution; and data governance, comprising acquisition, monitoring and compliance. Firms want to rationalize both the number of sources and benchmarks used and the number of FTEs involved. At the time of the study, 42 per cent of participants did not monitor or control data usage; and while an average of 1.7 FTEs took care of governance (0.9 of them focused on acquisition), 3 FTEs were devoted to processing (0.9 of them focused on validation).
The experience of delegates varied considerably with respect to the number and level of indexes taken and the sources; the degree to which processes are automated and the number of FTEs, assuming this is known or has been estimated. Similarly, there is a degree of uncertainty as to who is using the data, which necessarily impacts on governance and on cost.
‘… say we have a tool, and there’s a manager who needs to benchmark his portfolio. He will create his index based on some of them [but] we have no clear vision on what is being used and created each day. We don’t have any reporting on it.’
There is a tendency in some firms for FTEs to accrete around individual sources, indexes and benchmarks rather than to work across the organization; the consequence is a disproportionate increase in cost and a progressive disintegration of managed control and governance.
‘… there are several different users in the same unit, so it’s a bit hazy. And then when the people don’t need the data anymore, we are not notified …’
Inadvertently, a culture that depends on communication neglects to champion it internally: costs are duplicated, hidden and non-attributable, while responsibilities are fragmented or obscured.
‘… the number of persons increases with the number of sources, not necessarily with the increase of business.’
Nonetheless, delegates were keenly aware of the need to rationalize processes and costs and there is a trend, if sometimes tentative, towards more integrated management of data; essentially, a greater emphasis on shared information within a controlled architecture.
The Deloitte study reflected the experiences of delegates in terms of the evident differences in approach to data management and the need for greater coherence. The discussion turned to data use and compliance and the difficulties encountered where data is not centralized and monitoring of its use is fragmentary. Deloitte finds that companies in the study’s middle range, with ten to twenty sources, are more focused on compliance. Some firms have integrated databases that track access to and use of data, but this does not always allow timely intervention when compliance issues emerge. Access rights are often restricted; nonetheless, where clients are more circumspect about the use and distribution of their data, greater transparency within firms becomes a priority, demanding not only more robust but more centralized controls. Even if firms have the means to control access to data, this might only apply at particular levels, and contracts can still be compromised. Equally, where there is no notification protocol applied to redundant data, there are cost as well as compliance ramifications.
‘… since this climate of cost control … we [manage] by group or department, then we associate this with our governance group for invoicing, redistribution [and] payment, [agreeing] in advance on what group will pay what.’
Implementing compliance controls over contracts is made more difficult when additional terms apply regulating, for example, use and location. Cost follows complexity while management protocols more often focus on cost efficiency. Even where more centralized governance systems have been implemented, the focus on cost might still allow compliance issues to go unaddressed.
Decommissioning and a client’s awareness of the relative value of the indexes and benchmarks they require also have a significant impact on cost. Without proper controls on both sides, otherwise redundant yet costly data can go unnoticed or clients will demand benchmarks that they either do not need or which fail to deliver any commercial return.
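Decommissioning controls of the kind described above ultimately reduce to scanning access records for benchmarks with no recent use. As a purely illustrative sketch (the log format, names and idle window are assumptions, not any firm's actual system), such a check might look like:

```python
from datetime import date, timedelta

# Hypothetical access log: benchmark -> dates on which any user accessed it.
access_log = {
    "INDEX_A": [date(2013, 9, 2), date(2013, 9, 30)],
    "INDEX_B": [date(2013, 1, 15)],   # stale: no recent access
    "INDEX_C": [],                    # never accessed since purchase
}

def decommission_candidates(log, today, max_idle_days=90):
    """Return benchmarks with no recorded access within the idle window."""
    cutoff = today - timedelta(days=max_idle_days)
    return sorted(
        name for name, dates in log.items()
        if not dates or max(dates) < cutoff
    )

print(decommission_candidates(access_log, today=date(2013, 10, 1)))
# ['INDEX_B', 'INDEX_C']
```

Even a simple report of this kind gives the governance group a basis for querying business units before renewal dates, rather than discovering redundant subscriptions after the fact.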
Turning to processing, the issues emerging from the Geneva forum included the duplication of validations, the time spent on transformation and the management of and responsibility for storage. Those using external providers like RIMES save time and resources. And while others tend to trust their providers, there is not always a coherent response when problems emerge. This again brings the focus to bear on ways of centralizing and streamlining these processes. A common scenario sees business managers and analysts validating the data they use while data management teams verify the files and data coming in. It is evident that consolidating these activities through automation will bring efficiencies, but this demands a change in the attitude of those using the data and a commitment from management to implement change.
Insofar as RIMES validates according to the providers’ methodologies, this not only adds value through monthly balancing, it also yields valuable information, including information about differential activity associated with an index.
Storage presents its own challenges, not least in understanding the contractual terms that determine use and distribution, and the fact that a benchmark is not a purchased commodity but a rented, time-restricted one. Firms are seeking to centralize and rationalize storage to protect against compliance irregularities. They need to ensure that they understand who is using the data, how and why it is accessed, and whether or not it is being accessed or distributed by third parties not covered under a contract.
‘They see the data, they think they can use it, and they download it with desktop tools and store it on their disks. Then they find it’s not so great, so they create an access base and share it with someone else on the network, and so on and so forth. It’s what we’re trying to fight against.’
Looking ahead, it is clear that a number of firms have projects and strategies to address data governance issues more coherently. Even firms increasing their sources and the number of indexes used are more focused on governance in order to contain and control access. Others are introducing quasi-legal disclaimers to make users aware, when accessing or distributing data, of the contractual obligations associated with it. This in turn reminds people of their individual responsibilities with respect to compliance. However, in some cases governance controls simply impose access criteria without indicating why an individual is barred from accessing information. This tends to be the case where the catalyst for action is primarily cost as opposed to compliance or, more generally, governance.
Overall, firms are being challenged by a growing need to reduce costs and at the same time to increase data supply and data quality – to do more with less. Index data and benchmark data are characteristically different from other data and cannot always be processed by IT, where different teams or functions approach the same data from a variety of perspectives, whether commercial or contractual, business or market.
‘… there is a kind of separation between the constraints of data managers and business constraints and their own priorities, both in terms of cost and data validation and data quality.’
The implications arising from an increase in the quantity and complexity of data in an increasingly restrictive environment therefore place greater importance on the way firms manage data: their ability to control and monitor, their readiness to comply and the degree to which governance is institutionalized. And while these trends and activities place a burden of cost on firms, a failure to integrate, rationalize and centralize data governance creates inefficiencies and, consequently, hidden costs.
Firms have to take a corporate and not a departmental approach towards effective control of both governance and processing, addressing at a strategic level all aspects of data management. RIMES is able to articulate clear responsibilities with respect to data acquisition, use and decommissioning and assist in aspects of good governance. Its Performance Team is dedicated to controlling data based on benchmark compositions and re-aggregates all performances, the official index level and changes, conducting more than one thousand rollup checks daily. RIMES helps firms to manage all the problems associated with data, providing targeted notifications and improving daily workflow. The RIMES BDS® monitor checks the status of batches and is a source of information and monitoring; its reports provide daily statistics and tables to facilitate the activities within firms.
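A rollup check of the sort described, re-aggregating constituent performances and comparing the result with the published index level, can be sketched as follows. The function names, sample data and tolerance are illustrative assumptions for the sketch, not RIMES's actual implementation:

```python
# Illustrative rollup check: recompute an index period return from its
# constituents' weights and returns, then compare with the published figure.

def rollup_check(constituents, published_return, tolerance=1e-6):
    """constituents: list of (weight, period_return) pairs.

    Returns (passed, recomputed_return), where passed is True when the
    re-aggregated return matches the published one within tolerance.
    """
    recomputed = sum(w * r for w, r in constituents)
    return abs(recomputed - published_return) <= tolerance, recomputed

# Example: three constituents whose weighted returns should aggregate
# to an official index return of 1.0%.
members = [(0.5, 0.02), (0.3, 0.00), (0.2, 0.00)]
ok, value = rollup_check(members, published_return=0.010)
print(ok, round(value, 6))
```

Run daily across every tracked index, checks of this shape flag discrepancies between constituent-level data and the official level before the data reaches downstream users.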
The forum closed with an appeal to firms to anticipate future market conditions by addressing compliance issues beforehand.