
Data quality wager

Gordon Hamilton emailed Jim Harris (the author of the original article) the following response to his data quality blog post:

“It always seems crazy to me that few executives base their ‘corporate wagers’ on the statistical research touted by data quality authors such as Tom Redman, Jack Olson and Larry English that shows that 15-45% of the operating expense of virtually all organizations is WASTED due to data quality issues.

So, if every organization is leaving 15-45% on the table each year, why don’t they do something about it? Philip Crosby says that quality is free, so why do the executives allow the waste to go on and on and on? It seems that if the shareholders think about the Data Quality Wager, they might wonder why their executives are wasting their shares’ value. A substantial portion of that 15-45% could all go to the bottom line without a capital investment.

I’m maybe sounding a little vitriolic because I’ve been re-reading Deming’s Out of the Crisis and he has a low regard for North American industry because they won’t move beyond their short-term goals to build a quality organization, let alone implement Deming’s 14 principles or Larry English’s paraphrasing of them in a data quality context.”

The "Data Quality Wager" in Hamilton's email is a reference to Pascal's wager, which Harris then presents in a data quality context:

Investment in data quality initiatives is often conceptualised in non-business language and treated as a short-term cost rather than a long-term investment. The short-term costs of an initiative may include the purchase and maintenance of data quality software, as well as the professional services required for installation, configuration, application development, testing, training and implementation consulting. On top of these, there are often further short-term costs, both internal and external.

Harris points out that he is talking about the costs of proactive investment, incurred before any data quality issues arise that would require a reactive data cleansing project. Although the short-term costs are much the same in both cases, it is far easier to win support for a reactive project than for a proactive one – as with any type of project. An organisation therefore needs to weigh the potential outcomes of proactive investment in data quality initiatives against the likelihood that data quality problems actually exist (i.e., tangible data quality issues that affect business operations):

                         | data quality issues                             | no data quality issues
proactive investments    | #1. lower risks and (eventually) lower costs    | #2. higher costs
no proactive investments | #3. higher risks and (eventually) higher costs  | #4. no higher costs or risks

Data quality experts, vendors and industry analysts all strongly support option #1 – and all strongly criticize option #3. (Furthermore, because they believe that data quality problems do exist, most "orthodox" experts also refuse to acknowledge options #2 and #4.)

Unfortunately, by supporting #1 we often fail to sell the business benefits of data quality effectively, and when we criticize #3 we dwell too much on the negative consequences of not investing in data quality – and no one wants to believe that such an unpleasant situation could really happen to them.

Only option #4 promises no increased costs or risks without investing anything – provided data quality problems never actually arise – and that is exactly what many companies are betting on.
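
Seen this way, the wager is just an expected-cost comparison. The following minimal sketch (in Python, with hypothetical numbers: the probability of issues, the operating expense, the initiative cost and the 15% waste rate are illustrative assumptions, not figures from the article – the 15% merely echoes the low end of the 15-45% range cited above) shows how the four options trade off:

# A minimal sketch of the data quality wager as an expected-cost comparison.
# All figures below are hypothetical assumptions for illustration.

def expected_cost(invest, p_issues, opex, initiative_cost, waste_rate):
    """Expected annual cost of one strategy in the data quality wager.

    invest          -- True for proactive investment, False for doing nothing
    p_issues        -- probability that tangible data quality issues exist
    opex            -- annual operating expense
    initiative_cost -- cost of the proactive data quality initiative
    waste_rate      -- fraction of opex wasted if issues exist unaddressed
    """
    if invest:
        # Options #1 and #2: we always pay for the initiative, and we assume
        # it (eventually) eliminates the waste if issues do exist.
        return initiative_cost
    # Options #3 and #4: nothing up front, but we eat the waste if issues exist.
    return p_issues * waste_rate * opex

# Hypothetical organisation: $100M opex, $2M initiative, 15% waste rate.
opex, cost, waste = 100e6, 2e6, 0.15
for p in (0.1, 0.5, 0.9):
    proactive = expected_cost(True, p, opex, cost, waste)
    do_nothing = expected_cost(False, p, opex, cost, waste)
    print(f"P(issues)={p:.0%}: proactive ${proactive/1e6:.1f}M "
          f"vs. do nothing ${do_nothing/1e6:.1f}M")

With these assumptions, doing nothing wins only when the probability of issues is low – below the break-even point of about 13% ($2M divided by 15% of $100M). The research Hamilton cites puts that probability near 1 for "virtually all organizations", which is the point of the wager.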

Source: OCDQ