If you’ve heard it once, you’ve likely heard it a thousand times: if you snooze, you lose. And for financial services firms, keeping up with growing data volumes and demand is paramount to their success… or their demise.
As the market leader in financial data quality solutions, we help clients simplify complexity and ensure users across the buy- and sell-side make the most of their data assets by providing easy-to-use data integration, data cleansing, distribution, and data discovery solutions.
Catch up on the full series

Can’t see the wood for the GREEN trees!
90% of all the data in the world has been produced in the last three years alone. So one might naively assume that we have all the information we need.

For information to be reliable, we should be able to trace what attribute changed, exactly when it changed, who changed it, and even – this is where it gets really interesting for the auditors – why it was changed.
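As a rough illustration (not tied to any particular product), the sketch below shows a minimal Python change record capturing exactly those four things: what changed, when, who changed it, and why. All names and fields are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AttributeChange:
    """One entry in an audit trail for a single data attribute."""
    entity_id: str        # e.g. an instrument or issuer identifier
    attribute: str        # what changed, e.g. "maturity_date"
    old_value: str
    new_value: str
    changed_at: datetime  # when exactly the change happened
    changed_by: str       # who made the change
    reason: str           # why it was changed (the auditors' favourite)

# Example: recording a correction to a bond's maturity date
change = AttributeChange(
    entity_id="XS0123456789",
    attribute="maturity_date",
    old_value="2031-06-15",
    new_value="2032-06-15",
    changed_at=datetime.now(timezone.utc),
    changed_by="data.steward@example.com",
    reason="Issuer announced maturity extension via corporate action",
)
```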

Buying the same data multiple times
As the demand for data grows and more types of data become available, costs and risks inevitably rise. For most financial institutions, the pressure has been building to reduce costs and improve efficiency by optimizing cycle times.

One area that is rarely the primary focus or driver of improvement initiatives is tracking the metadata surrounding basic financial information such as issuer data, corporate actions, terms and conditions, and, above all, market data.

Achieving and keeping data quality: from one-off effort to a continuous process
We explore the immense value of developing and maintaining a Data Quality framework: one that clearly outlines policies for managing data quality and defines which metadata is key.
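To make "continuous" concrete, here is a minimal, hypothetical sketch of rule-based quality checks meant to run on every data load rather than as a one-off exercise. The rules and field names are illustrative assumptions, not part of any specific framework.

```python
from typing import Callable

# A data quality rule maps a record to a pass/fail result.
Rule = Callable[[dict], bool]

rules: dict[str, Rule] = {
    "price_is_positive": lambda rec: rec.get("price", 0) > 0,
    "currency_present": lambda rec: bool(rec.get("currency")),
    "maturity_after_issue": lambda rec: rec.get("maturity_date", "") > rec.get("issue_date", ""),
}

def run_checks(records: list[dict]) -> list[tuple[str, str]]:
    """Return (record_id, failed_rule) pairs; meant to run on every load, not once."""
    failures = []
    for rec in records:
        for name, rule in rules.items():
            if not rule(rec):
                failures.append((rec.get("id", "<unknown>"), name))
    return failures

# Example run on a small batch
batch = [
    {"id": "BOND-1", "price": 101.2, "currency": "EUR",
     "issue_date": "2020-01-01", "maturity_date": "2030-01-01"},
    {"id": "BOND-2", "price": -5.0, "currency": "",
     "issue_date": "2020-01-01", "maturity_date": "2019-01-01"},
]
print(run_checks(batch))  # -> failures for BOND-2 on all three rules
```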

Data management has its paradoxes. One is that, quite often, firms have multiple “master” databases for their price data, their customer data, and the terms and conditions of the products they invest in, trade or issue.

Sometimes there is no suitable model, or the right data is not readily at hand (yet), which prompts one to resort to proxying. Here, one wants to tread even more carefully to avoid creating additional model risk.
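As a purely illustrative sketch of the idea, the snippet below proxies a missing price from a comparable benchmark and, crucially, flags the result as a proxy so the added model risk stays visible to downstream users and auditors. The instruments, beta, and field names are assumptions for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PricePoint:
    instrument_id: str
    price: Optional[float]   # None when no observable price is available
    is_proxy: bool = False   # flag proxied values so model risk stays visible
    proxy_source: str = ""

def proxy_missing_price(target: PricePoint, benchmark_price: float,
                        beta: float, benchmark_id: str) -> PricePoint:
    """Fill a missing price from a comparable benchmark, scaled by a beta.

    The proxied value is flagged so downstream users and auditors can see
    that it is an estimate, not an observed market price.
    """
    if target.price is not None:
        return target  # nothing to do, an observed price exists
    return PricePoint(
        instrument_id=target.instrument_id,
        price=beta * benchmark_price,
        is_proxy=True,
        proxy_source=f"{benchmark_id} * beta {beta}",
    )

# Example: an illiquid bond proxied from a liquid comparable
illiquid = PricePoint(instrument_id="XS0000000001", price=None)
proxied = proxy_missing_price(illiquid, benchmark_price=98.4,
                              beta=1.02, benchmark_id="XS0123456789")
print(proxied)  # price is approximately 100.37, is_proxy=True
```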
