
The curious incident of the financial institutions’ computer systems in the meltdown

By Alan Cane
Financial Times 22 October 2008

It’s a puzzle to match Arthur Conan Doyle’s “curious incident of the dog in the night-time”; let us call it “the curious incident of financial institutions’ computer systems in the meltdown”.

The dog did nothing in the night-time and neither, it seems, did the computer systems, as the world crept closer to the abyss. No klaxons blaring, no red lights flashing, no digitised voice warning: “Mr Trader! I cannot allow you to bet the firm on these dodgy mortgage-backed securities.”

Commenting on the troubles of Fannie Mae, the US secondary mortgage company, financial blogger Barry Ritholtz notes: “Like many other firms, Fannie’s computer systems did a poor job of analysing risky loans.”

How could that have been? Financial institutions operate the most sophisticated computer systems outside the armed forces. They spend more on hardware and software than any other business. Computer systems are ideally suited to handling numbers, calculating ratios and managing the other factors involved in assessing risk. Yet it seems they failed to warn about the size of the hole banks and organisations such as Fannie Mae were digging for themselves.

An immediate point to make is that it should not happen again. A wealth of new legislation and regulation, together with the visceral shock the institutions themselves have experienced, should ensure that IT departments are going to be busy for the next few years, building business intelligence systems capable of assessing and responding to a much broader range of market conditions. Today’s systems, some of them more than 10 years old, were designed to operate in the growth segment of the business cycle – and that has changed dramatically.

To return, however, to the main point, it is perhaps unfair to criticise banking systems for a lack of omniscience in predicting catastrophe. As Ian Berriman, financial systems specialist for PA Consulting Group, points out, it was a classic case of “garbage in, garbage out”.

It is perfectly possible to put in place a computer system that monitors, say, the ratio of deposits to loans, and flags an alert if certain multiples are exceeded. But if the wrong ratios are selected, the warning may come too late or not at all.
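Such a monitor can be sketched in a few lines. The example below is hypothetical (the article names no system or vendor), and the limit of 1.0 is purely illustrative; as the article goes on to argue, choosing the right ratio and threshold is exactly where such systems can go wrong.

```python
# Hypothetical sketch of the kind of monitor described: flag an alert
# when the loans-to-deposits ratio exceeds a chosen multiple.
# The limit (1.0 here) is illustrative only -- select the wrong ratio
# or threshold and the warning comes too late, or not at all.

def loans_to_deposits(loans: float, deposits: float) -> float:
    """Ratio of outstanding loans to customer deposits."""
    if deposits <= 0:
        raise ValueError("deposits must be positive")
    return loans / deposits

def check_ratio(loans: float, deposits: float, limit: float = 1.0) -> bool:
    """Return True (i.e. raise an alert) if the ratio exceeds the limit."""
    return loans_to_deposits(loans, deposits) > limit

if __name__ == "__main__":
    print(check_ratio(90.0, 100.0))   # ratio 0.9, within limit: no alert
    print(check_ratio(130.0, 100.0))  # ratio 1.3, over limit: alert
```

The mechanics are trivial; the hard part, as Mr Berriman's point implies, is feeding it the right inputs and picking a limit that fires in time.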

“I suspect that, in this case, the right information was not being fed into the systems helping people make decisions,” he says. “It was a case of systemic failure rather than systems failure.” By which Mr Berriman means there were failures in the processes devised to manage risk, rather than in the systems built to support those processes.

In the past few years, financial instruments have also become increasingly sophisticated. David Sherriff, chief operating officer, and Martin Redington, product director, of Microgen, the computing services group, explain that the complexity of the data in the bundles of mortgage debt marketed by the banks to provide funds for further mortgages was too much for elderly systems to handle.

There was essentially no information about the quality of the individual loans in each bundle, making proper risk assessment almost impossible.

They argue that many of the financial institutions that collapsed fell because of a lack of transparency across their operations and their inability fully to comprehend their commercial liabilities and the risks associated with them.

So the systems themselves can be absolved of a big share of the blame.

New regulations will certainly be imposed. Mark Dunleavy of Informatica, another computer services group familiar with the financial marketplace, points out that, prior to the crunch, Basel II and the SEC’s Regulation AB were the standards for risk and asset-backed securities.

“With the benefit of hindsight,” he says, “it is clear that these did not avert the current crisis. It seems that soon the legislative spotlight will turn on the securities market. If that is the case, providers would be advised to improve standards now.”

This will hardly be welcome news for many financial institutions’ IT departments, already overstretched by the pressure of building systems to handle money laundering regulations and Sarbanes-Oxley accounting standards. On the other hand, it may mean that risk management, traditionally less generously funded than front-office functions, will claim more of the IT budget. The risk management officer should see his or her star wax, as the new systems are put in place. There should also be sound commercial opportunities for specialist vendors of risk management software.

Most IT departments, however, are already under intense pressure to deliver business innovation while cutting or controlling costs. Many companies are, understandably, responding with caution to the crisis, but those most likely to emerge profitably on the other side will be those capable of putting both processes and systems in place to cope with the new world order.
