Financial institutions need to take full control of their vast number of spreadsheets and databases if they want to be fit for the new era of post-crisis risk management. If they are not prepared, firms face a growing threat of regulatory censure over weak data management systems.
Both the Switzerland-based Basel Committee on Banking Supervision (BCBS) and the Financial Services Authority (FSA) in the UK have recently made it clear that when relying on manual processes, desktop applications or key internal data flow systems such as spreadsheets, banks and insurers should have effective controls in place that are consistently applied to manage risks around incorrect, false or even fraudulent data.
The citation by the BCBS is the first time that spreadsheet management has been specifically referenced at such a high level, a watershed in the approach to spreadsheet risk.
The failure of firms to fully understand, control and monitor data held in spreadsheets leaves them worryingly exposed to unacceptable risks, and recent signals from regulators suggest that firm action will be taken against those that bury their heads in the sand.
The most high-profile indication of a shift in focus came from the Basel Committee in its consultation paper, ‘Principles for effective risk data aggregation and risk reporting’. The committee made clear that banks must now take the issue of spreadsheet controls very seriously.
The simple fact is that spreadsheets continue to be used extensively across financial institutions and remain at the heart of many firms’ data systems. We know that many institutions have to contend with thousands, if not millions, of spreadsheets.
It is unsurprising that regulators are now openly warning that, without effective data management controls in place to manage their spreadsheet estates, insurers and banks are leaving themselves dangerously exposed to significant but avoidable business risks.
In September, the FSA addressed the issue of spreadsheet risk in its report ‘Solvency II: Internal Model Approval Process Data Review Findings’. The regulator stated that insurance firms will be expected to demonstrate appropriate controls over key internal data flow systems such as spreadsheets. These controls should take the form of, among other things, input validation, change and release management, disaster recovery and documentation.
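To make the idea of input validation concrete, the following is a minimal sketch of the kind of automated check a firm might run over spreadsheet data exported to CSV. The column names (`counterparty`, `exposure`), file layout and thresholds are purely illustrative assumptions, not taken from any regulatory text.

```python
import csv

def validate_rows(path):
    """Return a list of (row_number, message) for rows failing basic input checks.

    Illustrative example only: the required columns and rules here are
    hypothetical, chosen to show the shape of an input-validation control.
    """
    errors = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        for i, row in enumerate(reader, start=2):  # row 1 is the header
            # Check that the exposure figure is present, numeric and non-negative.
            try:
                exposure = float(row.get("exposure", ""))
                if exposure < 0:
                    errors.append((i, "exposure must be non-negative"))
            except ValueError:
                errors.append((i, "exposure is missing or not numeric"))
            # Check that the counterparty name is not blank.
            if row.get("counterparty", "").strip() == "":
                errors.append((i, "counterparty must not be blank"))
    return errors
```

In practice such checks would sit alongside the other controls the FSA lists, with failures logged and fed into the firm's change and release management process rather than silently corrected.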
Research carried out by my company earlier this year found that over three quarters (77%) of senior-level insurance executives would welcome more specific guidance on best-practice use of spreadsheets in the run-up to Solvency II.
We also found that only one in five (20%) actuaries and insurance finance professionals are completely confident in their firm’s data control processes and the supervision of their large estates of business-critical spreadsheets.
It would appear that the defence of misunderstanding, ignorance or denial will no longer be deemed acceptable by regulators. Good-quality data management is fundamental to business success, and in the current environment it is more important than ever. In an era where ‘Big Data’ is increasingly cited as a major risk, demands from regulators and other stakeholders around data management will only intensify.