Mark Gunning

Mark Gunning is global business solutions director at Temenos

Let’s be clear: banks do a very difficult job – they store the value of society, expressed as money. We trust them and they can’t get it wrong, yet at heart they are nothing but people and IT. Everything they own is on computer, and they don’t like to take risks with it.

Consequently, IT change for banks has been slow and safe. It has been incremental: bit by bit, byte by byte. Banks’ systems, at the cutting edge in the 1960s and 1970s, have been upgraded, patched and reworked to their very limit, leaving their owners with IT architectures, kit and procedures that are complex, expensive, inefficient and vulnerable, writes Mark Gunning.

In everything that banks do, they seek to minimise risk, and IT change has always involved risk. Basically, banks hate change because the risk/reward calculation is hardwired into their code.

The problem is that the systems themselves now pose a risk to service, reputation, profitability and even survival. They need to be replaced to make banks fit for banking in the 21st century. In a recent Temenos poll, a startling 80 per cent of respondents agreed that “Ageing IT is the biggest threat to banks today”.

Here’s a great example – I have heard that one UK bank is still using a system based on pounds, shillings and pence, with the front end doing lots of conversion. Even if this is not exactly true, most of the UK’s major high street banks certainly run core banking systems dating back to the 1970s.

This can mean that when a customer looks at her bank balance on her phone, she is going through layer upon layer of IT to get to that number – a complexity that leaves banks open to outages and inefficiencies, and makes it much harder and more expensive to comply with regulation.

When I first started at Temenos in the 1990s our systems had to interface with an average of 20 other systems; now they have to interface on average with more than 60. And the situation is becoming more complex every day. Indeed, some 15 years ago, when JP Morgan was doing its Y2K audit, it found it had more than 3,000 applications. This would not have been unusual, so it is easy to imagine the massive increase in the number of systems and interfaces banks are dealing with today.

Strategically, this is interfering with progress. Maintaining existing legacy systems, which most developed-market banks still run, consumes three-quarters of IT budgets, crowding out investment in IT enhancements.

Sometimes these legacy systems are managed by staff about to retire. After an IT audit, one Asian bank found that a significant portion of its IT estate was entirely maintained by people over 60. It had a choice – try to find a younger team who were willing to train in outdated technology or replace everything with a system fit for purpose. It went for the replacement.

There is also the issue of outages – both in terms of reputation and cost. A City analyst once said that if a bank suffered a major outage it would take a year for its share price and investor confidence to recover. If it suffered a second, confidence would degrade to the point where the bank would need to be sold. If RBS were not owned by the government, I wonder what could have happened to it. (Update: the UK government began to sell its stake in RBS in July 2015 – earlier than most forecasters predicted.)

Finally, there is the question of new products and profitability. Some years ago, one country changed its tax regulation overnight. Of the five banks in the market, three had archaic software and took three months to get a new, relevant product to market; the other two had modern modular software, including one with a Temenos system. Both got their new products launched within two days of the regulation change, capturing 90 per cent of the market between them.

Banks’ ageing IT environment is ill-suited to today’s banking challenges. They need real-time operations, and they are still insufficiently customer-oriented – their systems are still built around accounts and products, not people.

For example, most major high street banks don’t have a single customer view. They can’t tell which customers are in credit in their savings account and running an overdraft in their current account, or see those customers’ insurance or credit card situation at the same time. They have been blindsided by new entrants such as Apple, Google, PayPal and even Tesco, which know exactly what each customer buys from them across a whole range of areas.

Banks really struggle to see who is high risk and who is profitable. And they have no transparency within their operations. Take buying a light bulb on Amazon: it’s a £2 purchase, but you can track it from the warehouse to your door. If you take out a mortgage, you’ve no idea where the application is in the process unless you ring someone who puts you in touch with someone who might be able to tell you when you’ll get a decision.

Banks are also facing a hugely increased regulatory burden in the wake of the financial crisis. The post-2008 era requires banks to slice and dice data in myriad ways to help analyse risk – and their systems can only do this with a lot of extremely costly manual input from staff.

Banks can see this and they are alive to how these changes are affecting profitability. When all the traditional banks were in the same boat, there was no real pressure to change: everyone had occasional system outages that stopped customers seeing their bank balance from an ATM; the banks were profitable enough to cope with IT eating up 15 per cent of their total costs; and they were able to adapt their systems to deliver some new services.

But with forward-thinking banks already moving to new systems, growing demands from regulators and consumers, and intense competition from new entrants, there is a brewing fight for survival.

Once a critical mass of banks has transformed its IT, the rest will have to follow, or die.
