Buy-side financial institutions should be aware of the many challenges to data integrity.
Errors in key information can quickly cause difficulties, yet firms that discover some of their data is low-quality may be reluctant to go back, review the calculations behind the current figures, and re-derive the affected data points.
Market expert stresses due diligence
Financial institutions that want to be proactive might benefit from performing the proper due diligence on their existing information to determine its quality, Stefan Groschupf, the chief executive of a San Mateo, Calif.-based software development company, wrote in a VentureBeat piece.
The author noted that amid all the hype that surrounds big data, many have grown less concerned with the quality of their important information. Groschupf emphasized that in order to maximize the value of this crucial resource, companies should review data integrity at every point in their workflow.
The key role that quality plays in the usefulness of information was also highlighted at an industry webcast that was held on March 6, according to Inside Reference Data. Adam Cottingham, vice president of data management services for a global software and managed services firm, stated at the event that companies should continually strive to bolster the quality of their data.
The market expert noted the interplay between business operations and important data: companies that revise their current processes to fix data challenges can run into trouble in the process, the media outlet reported. He added that firms need to watch for changes that operational staff make to critical company information.
Data errors can create problems in many places
Any such alterations can potentially create problems at many points in an organization, Groschupf wrote. The author stated that in order to avoid this problem, companies should evaluate data integrity at every stage of the analytics workflow.
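Such per-stage integrity reviews can be automated. Below is a minimal sketch in Python, assuming hypothetical record fields ("isin", "price", "currency") and illustrative validation rules — not any particular vendor's checks:

```python
def check_record(record):
    """Return a list of integrity issues found in a single data record.

    The field names and rules here are hypothetical examples of the kind of
    checks a firm might run at each stage of its analytics workflow.
    """
    issues = []
    if not record.get("isin"):
        issues.append("missing identifier")
    price = record.get("price")
    if price is None or price <= 0:
        issues.append("missing or non-positive price")
    if record.get("currency") not in {"USD", "EUR", "GBP"}:
        issues.append("unrecognized currency")
    return issues


def audit(records):
    """Map each problematic record's index to its issues for review.

    Running this after every workflow stage surfaces alterations before
    they propagate downstream.
    """
    result = {}
    for i, record in enumerate(records):
        problems = check_record(record)
        if problems:
            result[i] = problems
    return result
```

In practice such a check would be run as a gate between workflow stages, so that a record altered upstream is flagged before it feeds into downstream analytics.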
The need to constantly assess data quality may be more pressing than many realize: roughly half the participants in a recent industry survey expressed concerns about data integrity, and most of the IT executives polled said their organizations have no system in place to assign responsibility for such matters.