Organisations are now looking to “monetise” their data assets in more explicit and measurable ways in order to prioritise where investment will drive greatest returns, reduce risk and minimise loss. Previously, I outlined a simple method of four measurable tasks which enable businesses to calculate the return on investment and make objective decisions based on those findings.
To recap, the steps were:

1. Establish a monetary value or business priority for data issues.
2. Define the cost of fixing data issues.
3. Estimate how much it will cost to fix “broken” processes.
4. Calculate ROI for each data issue.

To show how this can work in practice, here’s a fictitious illustration based on a common scenario and some real examples I’ve witnessed.
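The final step can be reduced to simple arithmetic. Here is a minimal sketch, assuming steps 1–3 have already produced a monetary value and the two fix costs for each issue; the function name and example figures are illustrative, not from any specific tool.

```python
def roi(annual_value, cost_to_fix_data, cost_to_fix_process):
    """Return on investment for a single data issue.

    annual_value:        monetary value (or loss avoided) per year (step 1)
    cost_to_fix_data:    one-off cost of correcting the data (step 2)
    cost_to_fix_process: one-off cost of fixing the broken process (step 3)
    """
    total_cost = cost_to_fix_data + cost_to_fix_process
    return (annual_value - total_cost) / total_cost

# Illustrative issue: £600,000/year at stake, £40,000 data fix, £30,000 process fix
print(round(roi(600_000, 40_000, 30_000), 2))  # 7.57, i.e. ~£7.57 returned per £1 spent
```

Ranking issues by this ratio gives the objective prioritisation the method is aiming for.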
By attempting to reconcile the figures in two different systems, an organisation discovered that the value of goods and services being provided did not match the value being invoiced to customers. Analysis of the missing invoices revealed that all the issues related to a particular product but were otherwise spread evenly across business teams throughout the year, suggesting a systematic problem, which the IT department duly investigated. To complicate matters, the organisation could not issue invoices more than two months after the delivery of their product, so such amounts had to be written off.
The revenue loss was calculated as approximately £600,000 annually, or £50,000 per month.
The IT Department proposed a fix to the root cause in the computer system to prevent future missing invoices, costing £30,000 and taking one month. One month after starting the work, there would therefore be no new missing invoices.
This £30,000 IT cost to avoid £600,000 in missed billing annually was approved immediately.
The IT department also proposed a project to fix all the data, costing £40,000 and taking two months to implement. During the first month of the project, while IT worked to fix the root cause, another £50,000 of missing invoices would accumulate. Once implemented, the data project would recover this new £50,000 of invoicing, but it would finish too late to recover any of the missing revenue from before that: a net gain of £10,000.
Alternatively, the business proposed to immediately stand up a task force to manually fix the data and issue invoices for products delivered within the previous two months, continuing right up to when the underlying problem in the computer system could be fixed; three months of invoicing in total. The cost of the task force was estimated at £60,000, and it would recover three months of invoicing, £150,000, i.e. a net gain of £90,000.
Getting the business task force to fix some of the data was £80,000 more cost-effective than the IT data correction project, so that approach was chosen.
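The comparison above can be sketched as a few lines of arithmetic. This is a minimal illustration of the figures in the story; the variable and function names are my own.

```python
MONTHLY_LOSS = 50_000  # £50,000 of missing invoices created per month

def net_gain(cost, months_recovered):
    """Net gain of an option: invoicing recovered minus the cost of the fix."""
    return months_recovered * MONTHLY_LOSS - cost

# IT data project: £40,000, recovers only the one new month of invoicing
it_project = net_gain(cost=40_000, months_recovered=1)

# Business task force: £60,000, recovers three months of invoicing
task_force = net_gain(cost=60_000, months_recovered=3)

print(it_project)               # 10000  (£10,000 net gain)
print(task_force)               # 90000  (£90,000 net gain)
print(task_force - it_project)  # 80000  (task force advantage)
```

Framing each option as (revenue recovered − cost) is what makes the £80,000 difference, and hence the decision, objective.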
The “old” data for which invoices had not been issued was never fixed because it was not cost-effective to do so, but the Data Quality team had identified and quantified the issue. The organisation was able to justify the accounting discrepancy, demonstrate perfect knowledge of the associated data quality, and prove to auditors and regulators the presence of an effective Data Governance process. They were also able to add some specific new data quality checks into their Governance processes to provide proactive alerting of any such issues in the future.
Enabling The Experts Is Key
The key to success in the projects I have seen is the enablement of subject matter experts through the self-service use of software capabilities. The subject matter experts use a combination of the following capabilities:

- proactive discovery of issues
- interactive ad-hoc analysis (data quality analytics)
- issue management
- data rules development (transformation, validation, measurement, correction and enrichment)
- reporting

The addition of automation and alerting capabilities brings reliability and consistency to the process, and an appropriate sense of business urgency to each of the measurements.
The monetisation of data assets can improve the financial performance of any organisation, and a structured, self-service approach to the management of data quality, controlled by the subject matter experts themselves, has become a realistic, measurable way of achieving this.