
By Paul Dietrich, Collibra
This is easier said than done, however, because the volumes of data processed by financial organizations today are massive. Ensuring the accuracy and integrity of that data at a fine level of detail is complex. In addition, financial transactions are time-critical, so a single data error multiplies quickly in downstream processes and is hard to correct retrospectively.
How can financial services providers measure data quality?
Data quality is determined by completeness, accuracy, validity, and compliance with the relevant regulations. All of these dimensions must be satisfied for data quality to be high. One challenge is that these dimensions, like the data itself, do not remain stable in practice, so the quality of individual data sets can drop quickly. Customer data, for example, changes over time:
If changes to addresses or phone numbers are not applied immediately, the data loses its integrity.
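To make these dimensions measurable, the minimal sketch below scores completeness and validity for a small customer table. The column names, sample records, and phone-number pattern are illustrative assumptions, not part of any specific product.

```python
# A minimal sketch of measuring two data quality dimensions
# (completeness and validity) on a customer table. The columns
# and the regex are illustrative assumptions.
import re
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "phone": ["+49 30 1234567", None, "12345", "+41 44 9876543"],
    "postal_code": ["10115", "8001", None, "8002"],
})

# Completeness: share of non-null values per column.
completeness = customers.notna().mean()

# Validity: share of phone numbers matching an expected pattern
# (a loose international format; real rules vary by market).
phone_pattern = re.compile(r"^\+\d{1,3}[\d ]{6,}$")
valid_phone = customers["phone"].dropna().apply(
    lambda p: bool(phone_pattern.match(p))
).mean()

print(completeness)                       # per-column completeness
print(f"phone validity: {valid_phone:.0%}")
```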
Unfortunately, when new data sources and applications are connected, harmonization is often costly and therefore postponed. As a result, several data sources turn into minefields at once. Even if the individual glitches seem insignificant, they directly impact customer experiences, interactions, and transactions, resulting in additional costs and lost revenue.
So how can financial service providers improve data quality?
Fundamentally, improving data quality requires a comprehensive program: creating a data culture, building a deeper understanding of one's data, maintaining metadata continuously, knowing the most common quality issues, and using technology in a targeted way.
Compliance is an area where data quality and data governance are closely intertwined. Continuous data quality monitoring supports both aspects here, giving organizations deeper insight and better analytics based on more accurate data.
Predictive and continuous data quality streamlines time-sensitive processes and delivers more reliable results in real time.
Here are some examples of how financial services organizations are deriving business value from predictive data quality measurement:
1. Constant monitoring of exchange rates
Around the world, foreign exchange transactions are conducted across more than 28,000 currency pairs that are constantly changing. Most banks work with a targeted list of quotes that they combine with other financial data to perform analysis.
Quality control of such a large dataset can be tedious, requiring hundreds of manual rules to detect duplicates, anomalies, or correlations.
Predictive data quality can instead warn automatically of incorrect exchange rate data.
With an ML-based approach, quality tests deliver consistent controls across all data sets. These predictive analytics also continuously review histograms and segmentations, as well as emerging patterns.
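As one illustration of such an ML-based control, the sketch below flags anomalous quotes in a synthetic EUR/USD series with scikit-learn's isolation forest. The data, the model choice, and the contamination rate are assumptions for this example, not a description of any vendor's implementation.

```python
# A hedged sketch: flagging anomalous FX quotes with an isolation
# forest. The synthetic EUR/USD series and the contamination rate
# are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulate a day of EUR/USD mid quotes around 1.08, then inject
# two bad ticks of the kind a feed glitch might produce.
quotes = 1.08 + rng.normal(0, 0.0005, size=500)
quotes[120] = 1.80    # fat-finger error
quotes[340] = 0.108   # decimal-shift error

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(quotes.reshape(-1, 1))  # -1 = anomaly

for idx in np.where(labels == -1)[0]:
    print(f"tick {idx}: suspicious quote {quotes[idx]:.4f}")
```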
2. Track intraday positions predictively
Financial organizations process large volumes of intraday position data in near real time. Tracking it is complex, especially when verifying that records correlate correctly and contain no duplicates for each firm. At the same time, if a firm does not trade or adjust its position during the day, the resulting missing records should not trigger a false alarm.
Here, predictive data quality can identify duplicates and outliers in real time and deliver reliably qualified data.
This can ensure that current data is trustworthy, and that representative data is incorporated into analytical models over the long term.
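As a rough sketch of these two checks, the example below finds exact duplicate position snapshots and firm-specific outliers in a toy data set. The column names and the robust z-score threshold are assumptions made for illustration.

```python
# A minimal sketch of two intraday-position checks: exact duplicate
# snapshots per firm/instrument/timestamp, and position sizes far
# outside a firm's own history (robust z-score via the median
# absolute deviation). Column names are illustrative assumptions.
import pandas as pd

positions = pd.DataFrame({
    "firm":       ["A", "A", "A", "B", "B"],
    "instrument": ["XYZ", "XYZ", "XYZ", "XYZ", "XYZ"],
    "ts":         ["09:00", "09:00", "09:05", "09:00", "09:05"],
    "qty":        [100, 100, 1_000_000, 200, 210],
})

# Check 1: duplicate snapshots for the same firm/instrument/time.
dupes = positions[positions.duplicated(["firm", "instrument", "ts"],
                                       keep=False)]

# Check 2: quantities far from each firm's own typical size.
def robust_z(qty: pd.Series) -> pd.Series:
    med = qty.median()
    mad = (qty - med).abs().median() or 1.0  # guard against MAD of 0
    return (qty - med).abs() / mad

outliers = positions[positions.groupby("firm")["qty"]
                     .transform(robust_z) > 5.0]

print(dupes)     # the two identical 09:00 rows for firm A
print(outliers)  # the 1,000,000 fat-finger position
```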
3. Detecting anomalies and hidden patterns in securities reference data

Financial institutions of all types and sizes integrate reference data from providers such as Bloomberg, Thomson Reuters, ICE Data Services, and SIX Financial Information. The accuracy of this data is central to any business decision, and identifying false values early in the data collection process greatly reduces downstream complexity. Likewise, hidden patterns affect the quality of data generated by source and reporting systems, and detecting them early reduces the effort spent correcting them.
In both cases, predictive data quality helps identify securities that violate historical patterns, reduces false positives, expands coverage and quickly models a complex set of controls.
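One simple way to picture a historical-pattern check is a rolling band around each security's own history, as sketched below. The window size and the band width are illustrative assumptions; a production control would tune both per instrument.

```python
# A sketch of a historical-pattern check on vendor reference data:
# flag a security attribute (here, a daily close price) that jumps
# outside a rolling band of its own history. Window and band width
# are illustrative assumptions.
import pandas as pd

closes = pd.Series(
    [100.0, 100.5, 99.8, 100.2, 100.1, 180.0, 100.3, 100.0],
    name="close",
)

window = 5
rolling_med = closes.rolling(window, min_periods=3).median()
rolling_std = closes.rolling(window, min_periods=3).std()

# A value is suspicious if it sits far outside its recent band;
# shift(1) keeps the current point out of its own baseline.
deviation = (closes - rolling_med.shift(1)).abs()
suspicious = deviation > 4 * rolling_std.shift(1)

print(closes[suspicious.fillna(False)])  # -> the 180.0 outlier
```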
4. Control credit risk in bank loans
One of the most important financial activities of banks is lending, which comes with its own risks. As a result, they need to be vigilant throughout the loan origination and approval process, validating data at every stage to limit credit risk.
Predictive analytics tools can be programmed to validate credit ratings, SSN checks, loan-to-value ratios, interest rates, duplicate loan applications, and more in real time.
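A minimal sketch of such rule-based checks might look like the following. The field names, the SSN format, and the thresholds are illustrative assumptions, not regulatory guidance.

```python
# A minimal sketch of rule-based loan application checks of the
# kind described above; field names, formats, and thresholds are
# illustrative assumptions.
import re

def validate_application(app: dict, max_ltv: float = 0.8) -> list[str]:
    issues = []
    # SSN format check (US-style XXX-XX-XXXX; real checks go further).
    if not re.fullmatch(r"\d{3}-\d{2}-\d{4}", app.get("ssn", "")):
        issues.append("malformed SSN")
    # Loan-to-value: loan amount relative to appraised collateral.
    ltv = app["loan_amount"] / app["property_value"]
    if ltv > max_ltv:
        issues.append(f"LTV {ltv:.0%} exceeds {max_ltv:.0%} limit")
    # Interest rate plausibility window.
    if not 0.0 < app["rate"] < 0.25:
        issues.append(f"implausible rate {app['rate']}")
    return issues

app = {"ssn": "123-45-6789", "loan_amount": 450_000,
       "property_value": 500_000, "rate": 0.065}
print(validate_application(app) or "no issues found")
```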
5. Track fraud and cyber anomalies in real time
Financial services organizations handle sensitive data, which makes them attractive targets for cyber threats. Online financial transactions expose them to potential security breaches as well.
As a result, financial organizations are increasingly relying on automation to detect and protect against cyberattacks.
Predictive data quality can continuously load and process various security-related data streams to identify anomalies in networks. Real-time alerts allow cybersecurity professionals to respond quickly to potential threats.
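As a simple illustration, the sketch below watches a stream of per-minute failed-login counts and raises an alert when a bucket breaches a rolling baseline. The event shape and the thresholds are assumptions chosen for the example.

```python
# A hedged sketch of a streaming anomaly check on security events:
# alert when failed-login counts in a time bucket exceed a rolling
# baseline. Thresholds and the event shape are assumptions.
from collections import deque
from statistics import mean, stdev

def stream_monitor(counts_per_minute, history=30, sigma=4.0):
    """Yield (minute, count) pairs that breach the rolling baseline."""
    window = deque(maxlen=history)
    for minute, count in enumerate(counts_per_minute):
        if len(window) >= 5:  # need some history before alerting
            mu, sd = mean(window), stdev(window)
            if count > mu + sigma * max(sd, 1.0):
                yield minute, count
        window.append(count)  # current point joins the baseline after

# 60 minutes of ordinary traffic with one burst injected at t=45.
counts = [3, 4, 2, 5, 3, 4] * 10
counts[45] = 80
for minute, count in stream_monitor(counts):
    print(f"minute {minute}: {count} failed logins (alert)")
```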
Paul Dietrich has been in the analytics space for 18 years and currently leads the Collibra team for the DACH and Nordics regions. He joined Collibra in early 2019 after eight years at Salesforce and earlier client-facing roles at Gartner (CEB) and BBDO Worldwide. He holds a Master of Science in International Business Economics from City, University of London. His greatest passion is helping customers use data as a common language to drive empathy, understanding, and successful business outcomes. In financial services especially, those customers place particularly high demands on quality and, above all, data quality.