It’s now nearly a year since the SEC voted to adopt Rule 6c-11, otherwise known as the ‘ETF Rule’. The rule has made it easier for firms to issue ETFs and to customize creation and redemption baskets with authorized participants (APs).
Since this landmark event, ETF issuers have initiated a large number of file format changes. This has presented something of a challenge for consumers of ETF data, who have had to spend considerable time and resources adapting to these changes. Often the end benefits of the new formats have remained unclear.
What matters is that firms are not only able to stay in sync with any format changes, but are also able to analyze what each change means for data quality.
John Lanaro, Global Head of ETFs at RIMES, provides some advice: “The ETF market is exploding, and the sheer volume of format changes firms must process can be overwhelming, especially when firms don’t fully appreciate the reasoning behind the changes.
“In our work with ETF issuers, we’ve identified two key tests to run that show the impact on data quality. We recently applied these tests to J.P. Morgan’s overhaul of its Basket Holdings file structure and the results were impressive.
“The first test looks at the percentage of cash represented in the files before and after the format change. With J.P. Morgan, we immediately noticed a significant drop in the percentage of cash in the new file format, which indicates that the new feed is a truer representation of what the portfolio actually holds.
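The cash-percentage test described above can be sketched in a few lines. The field names (`asset_type`, `market_value`) and the sample figures are illustrative assumptions, not J.P. Morgan's actual basket holdings schema:

```python
# Hypothetical sketch of the cash-percentage test: field names and
# sample values are assumptions for illustration only.

def cash_percentage(holdings):
    """Return cash as a percentage of total basket market value."""
    total = sum(h["market_value"] for h in holdings)
    cash = sum(h["market_value"] for h in holdings
               if h["asset_type"] == "CASH")
    return 100.0 * cash / total

# Old format: residual positions lumped together as cash.
old_format = [
    {"asset_type": "EQUITY", "market_value": 900_000.0},
    {"asset_type": "CASH",   "market_value": 100_000.0},
]
# New format: only the true cash position reported as cash.
new_format = [
    {"asset_type": "EQUITY", "market_value": 960_000.0},
    {"asset_type": "CASH",   "market_value": 40_000.0},
]

print(cash_percentage(old_format))  # 10.0
print(cash_percentage(new_format))  # 4.0
```

A sustained drop in this percentage after a format change, as in the sketch above, suggests the new feed is itemizing holdings that the old feed bucketed as cash.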
“The second – and most important – test looks at the accuracy with which the new files arrive at the official end of day NAV. Our analysis of J.P. Morgan indicated that their new format allowed users to arrive at the NAV much more accurately.
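The NAV-accuracy test amounts to re-deriving a per-share NAV from the holdings file and measuring the gap to the official figure. A minimal sketch, assuming illustrative field names and values rather than any issuer's real data:

```python
# Hypothetical sketch of the NAV-accuracy test: holdings, share count,
# and the official NAV below are illustrative assumptions.

def implied_nav(holdings, shares_outstanding):
    """NAV per share implied by summing position market values."""
    total_value = sum(h["market_value"] for h in holdings)
    return total_value / shares_outstanding

holdings = [
    {"ticker": "AAA",  "market_value": 505_000.0},
    {"ticker": "BBB",  "market_value": 455_000.0},
    {"ticker": "CASH", "market_value": 40_000.0},
]
shares = 40_000
official_nav = 25.01  # published end-of-day NAV

nav = implied_nav(holdings, shares)
error_bps = abs(nav - official_nav) / official_nav * 10_000
print(round(nav, 2))        # 25.0
print(round(error_bps, 2))  # ~4.0 basis points
```

Tracking this error in basis points across a format change gives a direct, comparable measure of how faithfully each feed reproduces the official NAV.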
“By working closely with issuers to understand the changes they have made to their formats, firms can gain a good understanding of relative data quality. The challenge is doing this at scale across numerous ETF providers and in an environment where change is constant. Here, working with managed service providers like RIMES, which already work directly with ETF issuers and can manage quality assurance, comes into its own. This approach allows firms to rest assured that their data quality is in hand, while freeing their time and resources to focus on core business activities.”
Contact RIMES to find out how we can help you streamline ETF data capture, eliminate costly internal processes and enable daily exposure insight, aggregation and reporting.
The content provided in these articles is intended solely for general information purposes, and is provided with the understanding that the authors and publishers are not herein engaged in rendering regulatory or other professional advice or services. Consequently, any use of this information should be done only in consultation with qualified legal counsel. The information in these articles was posted with reasonable care and attention. However, it is possible that some information in these articles is incomplete, incorrect, or inapplicable to particular circumstances or conditions. We do not accept liability for direct or indirect losses resulting from using, relying or acting upon information in these articles.