All along the watchtower
Managing operational risk is an unglamorous part of life at every major financial institution. Yet as a string of recent court cases, fines and regulatory actions testify, it is an area that is increasingly forcing its way into company boardrooms and commanding top-level attention.
In August, Standard Chartered agreed to pay $340 million to the New York State Department of Financial Services, a US regulator, over allegations that the bank had breached US sanctions on Iran; the regulator had threatened to revoke the firm’s licence. Under the settlement, the bank’s risk controls will be monitored for two years by regulatory personnel and the bank has been forced to install permanent staff in New York to make certain that it does not breach anti-money laundering (AML) legislation again.
The Standard Chartered case was notable for revealing the long reach of US regulators to punish actions taken beyond that country’s borders – a factor that has also affected other large financial institutions. Since July, HSBC has also been facing a fine of up to $1 billion for failures in its AML controls, mostly related to its Mexican operation, which resulted in the company acting as a conduit for funds to drugs gangs and terrorists.
These incidents represent only one aspect of the operational risk that financial institutions face on a daily basis. Rogue trading incidents, management scandals, reputational risk and IT risk all have the potential to cause firms major difficulties. To cite just one example, broker Knight Capital suffered a technology glitch in August that left the company facing a $440 million loss. Other technological problems, such as the Facebook initial public offering at Nasdaq OMX, in which a rush of orders caused a collapse in the exchange’s trading systems, are becoming increasingly commonplace as financial technology pushes the boundaries of speed and throughput.
“We have seen some big scares about fraud lately,” says Neil Vernon, development director at Gresham Computing. “People have been unaware of how transactions are moving. Mid and back office have been left behind by the front office and unauthorised movements have taken place. We did an analysis that found 2,500 places where a transaction could fall out of the STP flow.”
Rising levels of technological complexity have gone hand in hand with exponential increases in the quantities of data handled by financial institutions. Company systems and processes have struggled to keep pace, with the result that half of senior risk executives at major banks believe their firms’ ability to capture and process risk information across the business is inadequate, according to a white paper released in September by research firm IDC Financial Insights. To make matters worse, many control checks identify problems months after the original trade was executed in the front office, according to Vernon, while some 80 per cent of controls are held in manually updated spreadsheets, which are notoriously prone to errors and outdated information.
“How do we replace that kind of infrastructure with real-time controls for a T+1 minute kind of world? We need reconciliation technology,” he says. “There shouldn’t be a single point of a transaction where somebody isn’t notified of a change.”
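The kind of control Vernon describes can be illustrated with a short sketch: instead of an end-of-day batch comparison, every change to a transaction is matched against the other side of the flow the moment it lands, and anyone subscribed is notified of a break immediately. All names and structures here are illustrative, not a description of any vendor’s product.

```python
# Minimal sketch of an event-driven reconciliation control: every update to a
# transaction triggers a match against the opposite book, so a break is
# flagged at once rather than months later. Names are illustrative only.

class Reconciler:
    def __init__(self, notify):
        self.notify = notify          # callback invoked on every mismatch
        self.front_office = {}        # trade_id -> amount as booked up front
        self.back_office = {}         # trade_id -> amount as settled

    def record(self, source, trade_id, amount):
        book = self.front_office if source == "front" else self.back_office
        book[trade_id] = amount
        self._check(trade_id)         # check on every change, not in a batch

    def _check(self, trade_id):
        fo = self.front_office.get(trade_id)
        bo = self.back_office.get(trade_id)
        if fo is not None and bo is not None and fo != bo:
            self.notify(trade_id, fo, bo)   # immediate alert on the break

breaks = []
rec = Reconciler(lambda tid, fo, bo: breaks.append((tid, fo, bo)))
rec.record("front", "T1", 1_000_000)
rec.record("back", "T1", 1_000_000)    # matches: no alert
rec.record("back", "T2", 250_000)
rec.record("front", "T2", 2_500_000)   # mismatch flagged the moment it lands
```

In a real deployment the callback would feed an alerting or case-management system; the point of the sketch is simply that notification happens at the point of change, which is what a “T+1 minute” world requires.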
In October, US business intelligence company QlikTech reported that capital markets firms are relying on outdated information that is inadequate for controlling risks.
Based on a study of risk managers within capital markets companies, the report found that only 5 per cent of risk managers receive the information needed to do their jobs in real time. Some 38 per cent said it took more than four hours to get the latest data after the markets had changed, and 31 per cent of those questioned said they were relying on information updated only on a monthly basis to make important decisions around risk management.
The research also found that 42 per cent of the data was still presented in a static spreadsheet format, with 34 per cent of respondents unable to drill down further into the information to get detailed insights. Some 14 per cent admitted that the data held by front office personnel and risk managers was contradictory.
“It’s imperative for risk managers in the financial services industry to have access to up to date, accurate information on which to base trading decisions,” says Mike Saliter, senior director, global market development, financial services at QlikTech. “Traders need the agility to be able to change their minds frequently about what information they need to see as they react to changes in the market. At the same time, risk managers need to take a holistic view of the entire organisation, enabling them to make firm-wide decisions about risk appetite and limits.”
As regulators around the globe struggle to get to grips with the financial crisis and ensure a safer, more transparent market for financial services, more stringent criteria are being enforced in many business areas. Tougher compliance and greater rigour in areas such as examining acquisitions, new applications and systems and data protection have forced company chief executives and senior-level management to look more closely into their operational risk. The IDC Financial Insights report found that seven out of ten financial institutions surveyed cited the importance of risk infrastructure in driving strategy. The study also found that the same proportion of firms were reviewing their risk infrastructure.
“Risk is coming into the board room,” says Andrew Aziz, executive vice-president, buy-side business at risk systems specialist Algorithmics. “Within companies, risk is increasingly being recognised as a resource, just like the capital a financial institution deploys. We’re seeing risk management staff taking a much more active role within firms and risk is increasingly making its presence felt in the front office, as well as the traditional middle office. Its importance is expanding on multiple levels at once.”
Regular reviews of risk infrastructure are necessary, according to Henry Balani, head of risk compliance at BankersAccuity, because in the current regulatory environment even if a firm is compliant today, it cannot be sure that it will still be compliant in the near future. Citing the case of HSBC, in which the firm’s Mexico operation saw $7 billion in cash transferred to the US during the period 2007-2008, Balani points out that changing conditions, coupled with a company structure that allowed the bank’s local operation to assess its own risk, led to disaster.
“Seven billion dollars in cash transfers from Mexico to the US? That’s considerably more than the figures reported by other banks that were much larger in the region,” says Balani. “HSBC should have been aware that the money likely came from drug sales. It’s no excuse to say that 10 to 15 years ago, Mexico was considered safer than it is today. They were too relaxed, and they didn’t update their view of that country to take account of changing realities on the ground.”
Predictably, HSBC’s Mexican branch had classed itself as ‘low risk’. So how can large financial institutions ensure that the operational risk resulting from their regional subsidiaries is kept to a minimum?
“You need to check whether there is an assessment procedure done externally to validate the process,” says Balani. “In the case of HSBC Mexico, the answer was no, and HSBC is now paying the price. It’s also worth remembering that policies transferred from a head office to an emerging markets-based subsidiary, for example, may not be enforced as tightly as the original rules. This too needs to be checked.”
However, part of the difficulty for many firms is that the local regulations they face around the globe are often lacking in cohesion and sometimes conflict with each other. The European Union’s European Market Infrastructure Regulation reforms to OTC derivatives markets are not coordinated with the US Dodd-Frank legislation on the same topic. Hong Kong and Singapore are following their own versions of the G20 requirements on reducing systemic risk in financial markets. That can make it difficult to adopt a holistic approach to operational risk in multiple jurisdictions.
“There isn’t a coherent plan,” says Tony Freeman, executive director, industry relations at post-trade services firm Omgeo. “Regulators have their own timetables. There are overlapping projects, some of them similar, but not the same. It would be helpful if they were more aligned with each other – but at the moment that’s not really the case.”
Fortunately, there is a growing number of technological solutions that attempt to resolve some of the issues relating to lack of information and risk oversight. In September, data management systems provider GoldenSource launched GoldenSource Insight, a graphical dashboard and data visualisation tool designed to provide clear, real time information about processes, positions and risk exposures to help staff make better decisions.
Available on mobile devices, GoldenSource Insight provides real time dashboards and reports to help users make observations and analyse information and to take action with a drill-through to the GoldenSource enterprise data management suite. The service provides users with exposure analysis, exception management to help identify and resolve problems, and data quality, coverage and completeness views to help manage their compliance responsibilities.
“The raft of new regulations is changing the face of data management,” says Stephen Engdahl, senior vice-president of product strategy for GoldenSource. “Operations and executives require self-service capabilities to get answers to their questions about exposure, data quality and governance. GoldenSource Insight provides a clear, easy way for business users to effectively analyse processes, positions and risks, which improves transparency and enables many regulatory requirements to be met efficiently.”
While the credit crunch has arguably acted as a catalyst for greater transparency in financial markets, the costs incurred by attempting to reform have not been popular everywhere. That has led some observers to suggest that a reluctance to properly invest in systems to prevent abuse and ensure compliance may be just as much to blame.
“This industry has a problem,” says Freeman. “Often, firms start complaining that it costs money every time they are asked to clean up their act. But reducing operational risk is like building a safer car. Safety costs money. Objecting to new rules because you don’t want to spend money is not a valid argument.”
Complex rules and a reluctance to spend precious funds can sometimes combine to create other kinds of abuse. Citing a pub in London’s East End that serves alcohol without holding a licence (it sells beer mats and then hands over the drink for free), Geoff Kates, chief executive at management consultancy Lepus, warns that there is almost always a way to get around the rules. Some market observers allege that Standard Chartered ran a branch operation in the Cayman Islands that removed references to Iran from transactions, and that this was done on explicit instructions from the firm’s senior management. For Kates, this simply illustrates that the more complex the rules, the easier they are to break.
“The whole culture of the banking sector is the problem,” he says. “When rogue trading incidents happen, everyone looks, and it moves to the top of the priority list. Then it gradually loses attention and slides into the background, until it happens again. Pay needs to be reduced across the board, and firms should try to use an infrastructure to flag up unusual or suspicious events, such as an individual trader suddenly making uncharacteristically high profits.”
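The kind of flag Kates has in mind can be sketched very simply: compare a trader’s latest profit against their own history and alert when it is far outside the usual range. The z-score threshold below is an illustrative stand-in for whatever model a firm would actually use, and all figures are invented.

```python
# Illustrative sketch of flagging uncharacteristically high trader profits:
# a simple z-score of today's P&L against the trader's own trailing history.
# The threshold and sample figures are assumptions for illustration only.
from statistics import mean, stdev

def is_suspicious(history, todays_pnl, threshold=3.0):
    """Return True if today's P&L sits more than `threshold` sample standard
    deviations above this trader's historical mean."""
    if len(history) < 2:
        return False                  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return todays_pnl != mu       # flat history: any deviation stands out
    return (todays_pnl - mu) / sigma > threshold

typical_days = [12_000, 8_500, 15_000, 11_200, 9_800, 13_400]
print(is_suspicious(typical_days, 14_000))     # → False: in line with history
print(is_suspicious(typical_days, 250_000))    # → True: wildly out of line
```

A real surveillance system would of course look at positions, limits and counterparties as well as P&L, but even a crude per-trader baseline like this would have highlighted the sudden jumps that characterise most rogue trading cases.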
While the technology available to detect money laundering is arguably getting better at finding suspicious payments, operational risk inevitably remains a huge area. Monitoring every aspect of a major financial institution’s activities is a daunting task, according to Balani, and it is not one that is likely to go away anytime soon.
“Banks need to check they are doing enough,” he says. “They probably aren’t.”