Volatile Times for Data Quality

Guest Contributor: Brian Sentance, CEO, Xenomorph

Regardless of whether we think the 2007-8 crisis ever really ended, regulators, clients and practitioners have all become much more focused on data and data quality since it began. Making decisions based on inaccurate, incomplete or out-of-date data was always a bad idea, but I would suggest it is one of those issues whose impact only becomes noticeable when markets are not rising, capital is not so readily available and clients are a good deal less patient. The quote "Only when the tide goes out do you discover who's been swimming naked," from a certain Mr. Buffett, applies to so many different aspects of the financial crisis, and data quality is certainly one of them.

Many institutions are putting a lot of effort into improving the quality of the data they use, and for many this effort is paying off with more accurate, timely data at lower cost. However, this is not a small undertaking, and like many people faced with a pressing problem, institutions tend to respond by doing more of what they have always done rather than asking whether the size of the problem requires a change of approach.

One prime example of this came at a recent data management conference I attended, where a speaker from one financial institution described (and praised) the very long hours and much appreciated hard work his data management team had put into resolving the many thousands of price exceptions resulting from the volatile market conditions. A price exception was defined as any price whose movement since the close of the previous trading day was larger than a given percentage. As a result of the higher levels of market volatility during the crisis, the number of price exceptions to be validated had risen from several hundred to several thousand daily, and workloads (and stress levels!) had risen accordingly.
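
For concreteness, here is a minimal sketch of the kind of fixed-threshold check being described, in Python; the 2% threshold, the data structures and the function name are my own illustrative assumptions rather than anything specific to that institution's process:

```python
# Flag any price whose move since the previous close exceeds a fixed
# percentage threshold -- the "price exception" rule described above.
FIXED_THRESHOLD = 0.02  # illustrative 2% daily move tolerance

def flag_exceptions(prices, prev_closes, threshold=FIXED_THRESHOLD):
    """Return (identifier, move) pairs whose daily move breaches the threshold.

    prices and prev_closes are dicts keyed by instrument identifier.
    """
    exceptions = []
    for ident, price in prices.items():
        prev = prev_closes.get(ident)
        if not prev:
            continue  # no previous close to compare against
        move = price / prev - 1.0
        if abs(move) > threshold:
            exceptions.append((ident, move))
    return exceptions
```

With a rule like this, every instrument that simply followed a sharp market-wide move becomes an exception, which is exactly how several hundred daily exceptions turn into several thousand.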

It is obvious that market volatility will produce more errors and exceptions to manage: the more things move, the harder they are for any human being to tie down. That said, there is no reason why market volatility must translate directly into increased workloads and reduced data quality. How? Well, instead of defining your data exceptions relative only to yesterday's price, why not adjust the exception level for the level of volatility experienced in the market? If the index dropped by 4% in a day, do you really want your staff chasing down thousands of "false positives" that have exceeded "normal" movement thresholds set at, say, 2%? To make matters worse, the genuine price errors you actually want to detect are probably more serious than usual, but they get lost in a sea of exceptions to be processed and validated.
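
One simple way to implement that adjustment, purely as an illustration and not a description of any particular institution's or vendor's method, is to judge each price move relative to the move in a relevant benchmark index rather than against a fixed absolute band:

```python
def volatility_adjusted_exceptions(prices, prev_closes, index_move,
                                   base_threshold=0.02):
    """Flag prices whose move is unusual relative to the market.

    index_move is the day's benchmark return (e.g. -0.04 for a 4% fall).
    An instrument is only an exception if its move differs from the
    market move by more than the base threshold, so a broad sell-off
    does not by itself generate thousands of false positives.
    """
    exceptions = []
    for ident, price in prices.items():
        prev = prev_closes.get(ident)
        if not prev:
            continue  # no previous close to compare against
        move = price / prev - 1.0
        if abs(move - index_move) > base_threshold:
            exceptions.append((ident, move))
    return exceptions
```

A more refined version might scale the band by a rolling standard deviation of recent returns, or use a beta-adjusted expected move per instrument, but even this crude market-relative test keeps the exception count roughly stable on a day when the whole index gaps down 4%.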

This suggested change in the way price exceptions are calculated is just a simple example of how some data management processes should adapt to changing and more volatile times, and there are many small changes that could be made to improve efficiency and keep data quality high. Just because something has always been done a particular way doesn't mean that a little time spent catching our breath and rethinking the problem is a wasted endeavor. Volatile times sometimes call for a good deal of patience.
