For finance directors, new regulations have been a major preoccupation for nearly a decade, each one causing a shift in how financial risk should be viewed. These rules have often taken so long to formulate that it seems that implementation might never happen. As a result, there’s been a lot of uncertainty over how much capital and time to put into compliance for fear the measures will be watered down.
At the same time, technological development has spurred the emergence of new types of financial risk, most notably cybercrime, and has allowed other sources of risk to proliferate much more quickly – in turn driving the creation of ever more regulation.
All this leaves corporates, and their finance and risk teams, chasing ever-moving goalposts, a challenge compounded by public scrutiny; for business-to-consumer vendors in particular, the public’s tolerance for events that even appear to be the result of ineptitude or negligence has never been lower.
While the age-old techniques such as regression analysis, value at risk measurements and scenario testing will always be the bedrock of financial risk mitigation, new techniques, technologies and processes are being developed to improve their efficacy.
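As an illustration of the first of these bedrock techniques, here is a minimal sketch of historical-simulation value at risk. The daily profit-and-loss figures and the 95% confidence level are invented for the example, not drawn from any real portfolio:

```python
def historical_var(pnl, confidence=0.95):
    """Return the loss not exceeded with the given confidence,
    estimated from a history of daily profit-and-loss values."""
    losses = sorted(-p for p in pnl)  # losses as positive numbers, ascending
    index = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[index]

# Hypothetical daily P&L for a small portfolio (negative = loss)
daily_pnl = [120, -80, 45, -200, 60, -150, 30, -310, 90, -40]
var_95 = historical_var(daily_pnl)  # worst-tail loss at 95% confidence
```

In practice the input history would be far longer and the quantile estimate interpolated, but the principle – rank historical losses and read off the tail – is the same.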
Perhaps the dominant consideration over the past decade has been liquidity risk, particularly in the form of the Basel regulations. The chief component of the regulations called for banks to hold more capital of better quality and build up capital buffers against possible future losses – both requirements that banks have largely fulfilled.
Other policy prescriptions, however, have yet to fully take root. A major challenge is fulfilling the enhanced disclosure proposal published in March 2016, which requires banks to provide much more granular data about the risks on their books.
Finance directors will know that most asset and liability management and risk systems are not set up to do this. Some systems report weekly or even monthly, but certainly not daily as these regulations require. They are also rigid and pre-programmed, incapable of adjusting quickly to new scenarios or queries.
Improvements in big data analytics are already starting to make a difference. Instead of trying to encapsulate the risk profile of a million loans in a compressed sample of 100,000, as current systems are designed to do, the superior analytical power of big-data engines allows them to examine every asset at a granular level, giving a more accurate risk picture at no cost to speed.
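The sampling-versus-full-scan trade-off described above can be sketched as follows. The synthetic loan book, its size, and the 10% sample are illustrative assumptions, not a real system:

```python
import random

random.seed(7)
# Synthetic loan book: (exposure, probability of default) per loan
portfolio = [(random.uniform(1_000, 50_000), random.uniform(0.001, 0.08))
             for _ in range(200_000)]

def expected_loss(loans):
    """Sum of exposure times default probability over the given loans."""
    return sum(exposure * pd for exposure, pd in loans)

# Full scan: every asset examined at a granular level
full = expected_loss(portfolio)

# Compressed 10% sample, scaled back up - cheaper but only an estimate
sample_estimate = expected_loss(random.sample(portfolio, 20_000)) * 10
```

The sample comes close on an aggregate like this, but only the full scan can surface which individual assets drive the risk – the granularity the new disclosure rules demand.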
“Big data allows us to bring as much data as you can think of, tens of millions of records into a risk system,” says Luis Matias, ALM and liquidity risk lead at IBM Watson Financial Services. “We’ve proved we can produce those metrics about 15-20 times faster for risk managers that demand them,” Matias adds.
The same techniques are also proving effective in measuring credit risk. Until now, models have placed borrowers into broad categories based on payment history, demographics and the amount of money in their bank account. Being able to analyse larger amounts of data gives banks the ability to ascribe more specific risk characteristics to a borrower and make fewer assumptions.
Combined with machine-learning techniques, which mean the risk analysis process becomes more refined with each interrogation of a data set, you have a very powerful tool. “[Machine-learning] can determine if someone is of high or low risk by scouring through variables and values of relationships, co-factors, interactions, dependencies, associations, and more,” says Mike Blalock, general manager of Intel’s Financial Services Industry division. “[It] even has the potential to pick winning stocks and investments by studying earnings statements, news reports, and regulatory filings in search of clues.”
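As a toy illustration of the kind of model Blalock describes, here is a logistic-regression credit scorer in plain Python. The borrower features, labels and training settings are all invented for the example; a production system would use far richer data and an established library:

```python
import math

def predict(weights, bias, features):
    """Logistic model: estimated probability that the borrower defaults."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1 / (1 + math.exp(-z))

def train(rows, labels, epochs=2000, lr=0.1):
    """Fit the weights by stochastic gradient descent on the log-loss."""
    weights, bias = [0.0] * len(rows[0]), 0.0
    for _ in range(epochs):
        for features, y in zip(rows, labels):
            err = predict(weights, bias, features) - y  # gradient wrt z
            bias -= lr * err
            weights = [w - lr * err * x for w, x in zip(weights, features)]
    return weights, bias

# Invented features: [missed-payment score, credit utilisation]; label 1 = defaulted
rows = [[0.0, 0.2], [0.1, 0.3], [0.8, 0.9], [0.9, 0.7], [0.2, 0.4], [0.7, 0.8]]
labels = [0, 0, 1, 1, 0, 1]
w, b = train(rows, labels)
high_risk = predict(w, b, [0.85, 0.9])  # new borrower with poor history
low_risk = predict(w, b, [0.05, 0.2])   # new borrower with clean history
```

Each pass over the data nudges the weights towards the observed outcomes, which is the sense in which the model "becomes refined with each interrogation of a data set".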
In recent years, accounting risk has almost trumped liquidity risk as a source of concern to the bank finance function, in part because it represents a rare example of policy being introduced on time and unchanged.
As of 1 January 2018, the IFRS 9 rules came into force across the 120 jurisdictions that abide by International Financial Reporting Standards, replacing IAS 39. At the heart of the new standard is a revised credit loss model that will change the way banks report losses, forcing them to account for potential losses when a loan is issued, not just when the borrower gets into trouble.
“On the first day you lend money, you don’t have to book the entire expected loss, but a portion,” explains Sue Lloyd, vice-chair of the International Accounting Standards Board, which is responsible for setting IFRS standards. “If I lend money to someone and it turns out the risk was worse than I thought, once I decide that a loan is underperforming relative to my initial expectations I book the lifetime value.”
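Lloyd’s description corresponds to the staged expected-credit-loss calculation, which can be sketched as follows. The exposure, default probabilities and loss-given-default figure are illustrative assumptions, and real IFRS 9 models also discount and probability-weight multiple scenarios:

```python
def expected_credit_loss(exposure, pd_12m, pd_lifetime, lgd, deteriorated):
    """ECL = probability of default x loss given default x exposure.
    Stage 1 (performing) uses the 12-month PD; stage 2, once credit
    risk has deteriorated significantly, uses the lifetime PD."""
    pd = pd_lifetime if deteriorated else pd_12m
    return exposure * pd * lgd

# Invented loan: 100,000 exposure, 45% loss given default
loan = dict(exposure=100_000, pd_12m=0.02, pd_lifetime=0.10, lgd=0.45)

day_one = expected_credit_loss(**loan, deteriorated=False)  # portion booked at issue
lifetime = expected_credit_loss(**loan, deteriorated=True)  # booked once underperforming
```

The jump from the day-one figure to the lifetime figure is exactly the step change in provisioning that the quote describes.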
According to Nitin Iyer, head of risk business solutions and pre-sales, Europe, at technology company Finastra (a result of the merger of Misys and D+H), a small number of banks have gone from just complying with the regulations to a level in which IFRS 9 compliance is baked into daily decision making.
First, they automated the process of booking loans, giving real-time visibility of where charges are being booked and allowing them to rebalance their portfolios accordingly. Now some banks are starting to calculate the likely cost of accounting-related charges before a loan has even been sold, allowing front-office staff to dynamically adjust how they price their products.
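That pre-deal pricing idea can be sketched as below, assuming a deliberately simple model in which the day-one expected-loss charge is spread evenly over the loan’s term; all rates and figures are invented for illustration:

```python
def priced_rate(funding_rate, margin, exposure, pd_12m, lgd, term_years):
    """Quote a loan rate with the annualised day-one expected-loss
    charge folded into the spread before the deal is committed."""
    ecl = exposure * pd_12m * lgd                # likely day-one accounting charge
    ecl_spread = ecl / (exposure * term_years)   # spread recovering it over the term
    return funding_rate + margin + ecl_spread

rate = priced_rate(funding_rate=0.015, margin=0.02,
                   exposure=250_000, pd_12m=0.03, lgd=0.4, term_years=5)
```

A riskier borrower raises the expected-loss spread, and the front office sees that cost reflected in the quoted rate before committing the transaction.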
At the heart of this has been increased collaboration between the finance and risk departments at banks: sharing information and harmonising the way they monitor and report risk.
“You might pass the extra cost onto the customer, which could make you uncompetitive, so you may have to rebalance your portfolio,” Iyer tells Financial Director. “But it gives you a view of what your cost would be before you commit the transaction. It’s gone from being a day 1, tactical view to ‘how do I use the spirit of compliance and try to make the business more competitive?’”.
The cyber threat
Cyber threat is a risk that consistently ranks at the top of the list of things keeping finance directors awake at night, and it shows no sign of abating. According to research by Cybint, a firm that provides customised cyber security educational tools for businesses, the average cost of a data breach in 2020 is likely to exceed $150m, as more business infrastructure connects to the internet.
The need for top-notch data protection is heightened by the EU’s GDPR, introduced last month. The penalty for non-compliance is a fine of up to 4% of worldwide turnover or €20m – whichever is higher – so banks must do all they can to show what data they have on file, what it’s being used for and that it’s as safe as possible.
The hackers are undeniably on top here. The aim of the defender is to wear the hacker down with multifactor authentication, firewalls, malware detection and anomaly detection software. In truth, however, anyone with the technology, skill and inclination can hack a computer system.
“It’s not clear that sophisticated cyber defences can keep up on systems that are exposed to the internet,” a cybersecurity specialist with a US research institute told this reporter last year.
But even here, the ability of banks to respond to threats has been bolstered by new ideas and processes. Scenario testing has long been used to mitigate credit, market and other types of financial risk. But growing numbers of companies are seeing war games as the best way to prepare for the fall-out of a cyber attack.
At its most demanding, a war game can last a whole working day and involve a “live” cyber attacker reacting to the actions of the team being tested. As well as helping firms understand the robustness of their cyber defence processes, such tests invariably lead to the emergence of questions and problems that had not even been considered.
“Like scenario planning and assumption testing, war-gaming provides a means to think outside of the conventional mental models to discover threats and opportunities of strategic choices, and allows leaders to see the potential second- and third-order effects of their decisions,” wrote consultancy firm Deloitte in a 2017 report.
While the range and severity of risk continues to grow, the financial risk mitigation techniques available and the technology that powers them mean there’s no need to despair just yet.