Not all data is good data. In one study involving 75 executives, only 3 percent of the participating companies' data quality scores met a minimum acceptable standard. In an article for Harvard Business Review, Tadhg Nagle, Thomas C. Redman, and David Sammon share more results of this study, and they offer a useful way of measuring data quality in your business.
Time to Lose Confidence
The authors use what they call the “Friday afternoon measurement” (FAM) method:
The method is widely applicable and relatively simple: We instruct managers to assemble 10-15 critical data attributes for the last 100 units of work completed by their departments — essentially 100 data records. Managers and their teams work through each record, marking obvious errors. They then count up the total of error-free records. This number, which can range from 0 to 100, represents the percent of data created correctly — their Data Quality (DQ) Score. It can also be interpreted as the fraction of time the work is done properly, the first time.
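The scoring described above is simple enough to sketch in a few lines. This is a minimal illustration, not the authors' own tooling: the record format, attribute names, and the `has_error` check are all hypothetical stand-ins for whatever a manager's team would judge an "obvious error" on a real work record.

```python
def dq_score(records, attributes, has_error):
    """Friday Afternoon Measurement: count records with no marked errors.

    With 100 records, the count of error-free records (0-100) is also
    the percent of work done correctly the first time -- the DQ Score.
    """
    error_free = 0
    for record in records:
        # A record counts only if every critical attribute is clean.
        if not any(has_error(record.get(attr)) for attr in attributes):
            error_free += 1
    return error_free


# Toy data: 97 clean records plus 3 with a blank customer_id.
# (Attribute names and the blank-value error rule are illustrative.)
attributes = ["customer_id", "order_date", "amount"]
records = (
    [{"customer_id": "C1", "order_date": "2017-09-01", "amount": "42.00"}] * 97
    + [{"customer_id": "", "order_date": "2017-09-02", "amount": "10.00"}] * 3
)

score = dq_score(
    records,
    attributes,
    lambda value: value is None or str(value).strip() == "",
)
print(score)  # 97 error-free records out of 100 -> DQ Score of 97
```

In practice the "error check" is human judgment, not a function; the point of the sketch is only that the score is a straight count of fully clean records, with no weighting or partial credit.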
The authors say that many executives are mortified by the results of this process, which suggests we too are likely overconfident in the quality of our data. For instance, in the aforementioned study, 47 percent of newly created data records had "at least one critical (e.g., work-impacting) error." And here is another piece of simultaneously good and bad news: data problems seem to plague all sectors about equally. Everyone is susceptible to keeping awful data.
Keeping in mind that work driven by good data is dramatically cheaper than work driven by bad data, it stands to reason that businesses are leaving an enormous amount of money on the table by not cleaning up their data collection. So if you want to save money and make better-informed decisions, it is time to take a fresh look at your company's data.
You can view the original article here: https://hbr.org/2017/09/sgc-publish-the-week-of-911-new-research-only-3-of-companies-have-acceptable-quality-data