RIMES was recently asked by Buy-Side Technology to submit editorial content examining the challenges involved in supplying quality portfolio data. The Buy-Side Technology Guide covers technology and services for the traditional and alternative asset management industries, and is published by Incisive Media. What follows is the full article published in the Guide:
Validated benchmark data is a key contributor to successful performance attribution
With the technological advances made each year, the performance team now finds itself inundated with innovative solutions from vendors of performance attribution systems. These systems offer a choice of models covering multi-asset-class attribution, enabling the performance team to report on and replicate the investment process. A key component of any performance system is, of course, the data passed through the calculation engines: both the portfolio data and the benchmark data. However, no matter how sophisticated a performance attribution system is at producing reports, those reports are only as good as the data fed into them. Get the data component wrong and the reports become inaccurate, distorting the analysis of the actual investment decisions that have been taken.
With the increasing emphasis on GIPS 2010, the performance team is now required to report both internally and externally more frequently than before, and end users have become better informed and consequently more demanding. Performance reports are entirely dependent on data, and this is the challenge the performance team grapples with daily. As a department, it probably demands the highest level of data quality within an asset management firm.
The performance team relies heavily on the quality of the portfolio data generated by the accounting system, and this will drive the decision on whether to use a buy-and-hold or a transaction-based approach to attribution, assuming the system can handle both. The variety of instruments available to a fund manager, including stocks, bonds, derivatives and alternative investments, results in increasingly multifaceted portfolios, which adds to the complexity of tracking the performance of a fund on a daily basis.
The data required for equity models is widely available, so attention has turned to fixed income attribution. The data requirements for fixed income models are much larger and go beyond the weights-and-returns approach of the equity models. Indeed, fixed income models require a greater number of attributes to enrich the bond and derivative data within the portfolio and benchmark, ensuring they can decompose and explain the sources of the excess return or risk.
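The weights-and-returns approach referred to above can be sketched with the classic Brinson-Fachler decomposition, which splits excess return into allocation, selection and interaction effects per sector. The sectors, weights and returns below are purely illustrative, not drawn from any real portfolio or benchmark:

```python
# Illustrative Brinson-Fachler attribution over three hypothetical sectors.
# Each entry: (portfolio weight, portfolio return, benchmark weight, benchmark return)
sectors = {
    "Financials": (0.30, 0.05, 0.25, 0.04),
    "Technology": (0.50, 0.08, 0.55, 0.07),
    "Utilities":  (0.20, 0.02, 0.20, 0.03),
}

bench_total = sum(wb * rb for _, _, wb, rb in sectors.values())
port_total = sum(wp * rp for wp, rp, _, _ in sectors.values())

effects = {}
for name, (wp, rp, wb, rb) in sectors.items():
    allocation = (wp - wb) * (rb - bench_total)   # effect of over/underweighting the sector
    selection = wb * (rp - rb)                    # effect of stock picking within the sector
    interaction = (wp - wb) * (rp - rb)           # combined weighting/selection effect
    effects[name] = (allocation, selection, interaction)

excess = port_total - bench_total
explained = sum(a + s + i for a, s, i in effects.values())
assert abs(excess - explained) < 1e-12            # the three effects fully explain the excess return
```

Even in this toy form, the decomposition only balances because every weight and return is present and consistent, which is precisely why the data inputs matter so much.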
The portfolio data needs to be loaded into the attribution system, and this may require a little or a lot of data manipulation depending on the technical requirements of the system. Information may be produced by one system for the equity and balanced funds, whilst the accounting information for the fixed income funds may be generated elsewhere. If the performance system is flexible, then the role of the performance team and of the I.T. resource involved in the process becomes much simpler. Barra Performance on RIMES can handle portfolios in almost any format through smart loaders, easing the data management burden on the client.
Loading the portfolio data is only half of the data exercise; sourcing the benchmark data is often overlooked. Does the attribution system come with benchmark data preloaded, or does the performance team need to source the benchmark data itself? Most systems are standalone applications, and the benchmark data needs to be sourced separately. In that case, is it better to go direct to each index data provider, or to outsource to a specialist data aggregator, such as RIMES, capable of matching the exact requirements of any performance attribution system?
For attribution systems that include data, the question becomes: how good is the data quality assurance in the system, and how broad is its coverage across asset classes? Can the system add new data sources and indices quickly and accurately?
Whichever route you choose, it is important to be able to keep pace with fund managers' changing requirements and their desire to meet new mandates. How quickly can the I.T. department or data integrator add new indices to enable the performance team to produce reports when users need them, fast and often?
Demand for official custom benchmarks is increasing, both to meet evolving investment strategies and to facilitate the need for greater transparency. Can the system handle custom, blended or capped indices in the same way as a standard benchmark?
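As a rough sketch of what such handling involves: a blended benchmark return is a weighted combination of its component index returns, and a capped index redistributes any weight above the cap pro rata across the uncapped constituents. The index names, weights and returns below are hypothetical, chosen only to illustrate the mechanics:

```python
# Hypothetical 60/40 blend of two component index returns for one period.
components = [
    ("Equity Index", 0.60, 0.021),   # (label, blend weight, period return)
    ("Bond Index",   0.40, -0.004),
]
blended_return = sum(w * r for _, w, r in components)

def cap_weights(weights, cap):
    """Cap constituent weights and redistribute the excess pro rata.

    Redistribution can push other names over the cap, so iterate until
    every weight respects it (assumes cap * len(weights) >= 1).
    """
    w = dict(weights)
    while max(w.values()) > cap + 1e-12:
        excess = sum(max(v - cap, 0.0) for v in w.values())
        uncapped = {k: v for k, v in w.items() if v < cap}
        total_uncapped = sum(uncapped.values())
        for k, v in w.items():
            if v > cap:
                w[k] = cap
        for k, v in uncapped.items():
            w[k] = v + excess * v / total_uncapped
    return w

capped = cap_weights({"A": 0.50, "B": 0.30, "C": 0.20}, cap=0.35)
```

Real index families each define their own capping and rebalancing rules, which is why a system must treat these products as first-class benchmarks rather than special cases.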
Once the data has been sourced comes the question of transparency. Can we actually replicate the index composition? Can we match the performance with the underlying constituents to an acceptable level of tolerance? In the equity world we need adjustment factors and dividend information to match total return indices, but this is only the beginning. Index rebalancing needs to be captured, whilst different index weighting methodologies and total return calculations need to be accommodated.
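At its simplest, that replication check means recomputing the index return from constituent weights and returns and comparing it with the published figure. The constituents, published return and tolerance below are purely illustrative assumptions:

```python
# Hypothetical constituents: (weight at start of period, total return over period).
constituents = [
    (0.45, 0.012),
    (0.35, -0.006),
    (0.20, 0.030),
]
published_return = 0.0093     # illustrative figure taken from a vendor file

computed = sum(w * r for w, r in constituents)
tolerance = 1e-4              # accept differences of up to one basis point
replicates = abs(computed - published_return) <= tolerance
```

In practice the computed leg must already incorporate dividends, adjustment factors and rebalancing before a comparison like this is meaningful; a persistent breach of the tolerance is the signal to investigate the inputs.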
Delivery of data will also vary: some equity index families calculate and disseminate benchmarks seven days a week (for example, in Gulf Cooperation Council markets), others on a five-day cycle. In the fixed income world, some providers calculate their indices on the weekend when the month-end falls on a weekend; others do not. Knowledge of these different nuances and rules is essential to ensure accurate replication.
Once the data is loaded, quality becomes the focus for the performance team. It is paramount that the performance team identifies data quality issues as early as possible to minimise the time spent on investigations. Where in the process do problems get uncovered? Does the data vendor spot the issue, or is it caught by the data warehouse, the middleware system or the performance system itself? The advantage of a solution such as Barra Performance on RIMES is that it comes preloaded with benchmark data: the validation has already taken place, and you can be confident that the benchmark data is accurate to a high level of precision.
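The kinds of early checks described above can be as simple as asserting that constituent weights sum to one and flagging missing or outsized returns. The record layout, field names and thresholds below are assumptions made for illustration only, not any vendor's actual schema:

```python
def validate_benchmark_snapshot(rows, weight_tol=1e-4, return_spike=0.25):
    """Illustrative sanity checks on one day's benchmark constituent file.

    rows: list of dicts with 'id', 'weight' and 'return' keys (assumed layout).
    Returns a list of human-readable issues; an empty list means the file passed.
    """
    issues = []
    total_weight = sum(r["weight"] for r in rows)
    if abs(total_weight - 1.0) > weight_tol:
        issues.append(f"weights sum to {total_weight:.6f}, expected 1.0")
    for r in rows:
        if r["weight"] < 0:
            issues.append(f"{r['id']}: negative weight {r['weight']}")
        if r["return"] is None:
            issues.append(f"{r['id']}: missing return")
        elif abs(r["return"]) > return_spike:
            issues.append(f"{r['id']}: suspicious one-day return {r['return']:+.1%}")
    return issues
```

Running checks like these at the point of ingestion, rather than after a report looks wrong, is what shifts the discovery of problems upstream.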
Understanding and identifying data quality failures can be extremely time-consuming for the performance team, and this is where outsourcing data and quality management to a data aggregator brings considerable economies of scale and efficiency.
Performance is all about the data. If you streamline the delivery of quality data into the performance system, you will improve efficiency and scalability, shorten the daily and monthly reporting cycles, and ultimately maximise the return on investment, enabling the team to focus exclusively on the value-add tasks of the performance attribution process.