Algo trading: the dog that bit its master

Ross Tieman
Financial Times
19 March 2008

In the search for someone to blame for the problems gripping the world's financial system, there is one scapegoat that cannot answer back: the machines.
 
For a decade, investment banks, traders and exchanges have been engaged in a technology arms race, in which the fastest computers net the biggest profits. This year alone, Aite Group, a Massachusetts consulting firm, predicts that capital market players will invest $41.8bn in IT, and reckons spending growth will continue at an average 9.8 per cent a year.
 
Automated algorithmic trading programs now buy and sell millions of dollars of shares time-sliced into orders separated by thousandths of a second.
 
Desk-tapping dealers in exotic collateralised instruments count the minutes while a dozen servers, working in unison, calculate millions of permutations to price a complex deal in a race against the computers of rival trading houses to win the client.
 
The data flows have become spates. And amid the torrent, computer-generated value calculations have been unchallenged. Deals are done at the speed of light. Data consolidation often takes a back seat. Surely this is where some blame might lie for the credit crunch.
 
"Innovation in certain cases leads to uncontrolled automation," concedes Andrew White, global head of trade and risk management at technology and information group, Reuters. In volatile markets, the risks can change rapidly and hugely within the day. "Many risk systems weren't able to keep up," he says.
 
It is little wonder that many banks' risk and technology officers are feeling sore. Computerised trading technology is the dog that bit its master.
 
Over the past decade, a global switch to electronic trading has changed the nature of the business. Banks and their traders used to make profits buying and selling very simple instruments - foreign exchange, equities and bonds. Going digital exacerbated competition and drove down margins.
 
As profit margins fell, banks sought to reduce costs and win competitive advantage by automating routine trades.
 
"Ten years ago, a bank would take a big trade on to its books, and would have large numbers of traders trying to execute the deal in small chunks without moving the market," says Jeremy Badman, partner in the strategic IT and operations practice focusing on investment banks at Oliver Wyman.
 
Today, he says, many big trades are fed into computers running "algo" programs, which then execute them automatically in small packets. The banks' mathematical boffins, the quantitative analysts, soon dreamed up other algos, capable of spotting split-second arbitrage opportunities between prices, and trading on them for profit before a human can blink.
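
A minimal sketch of how such time-slicing works, with invented packet sizes and intervals rather than any bank's actual parameters:

```python
# Illustrative sketch of time-sliced ("algo") execution of one parent order.
# All parameters (slice count, interval, symbol) are hypothetical.
import time
from dataclasses import dataclass

@dataclass
class ChildOrder:
    symbol: str
    side: str
    quantity: int

def slice_parent_order(symbol, side, total_qty, slices, interval_ms, send):
    """Split one large order into small child orders sent at fixed intervals,
    so the market never sees the full size at once."""
    base, remainder = divmod(total_qty, slices)
    for i in range(slices):
        qty = base + (1 if i < remainder else 0)
        send(ChildOrder(symbol, side, qty))
        time.sleep(interval_ms / 1000.0)   # child orders thousandths of a second apart

# Usage: 100,000 shares split into 250 child orders, 4 ms apart.
if __name__ == "__main__":
    sent = []
    slice_parent_order("XYZ", "BUY", 100_000, 250, 4.0, sent.append)
    print(len(sent), "child orders,", sum(o.quantity for o in sent), "shares")
```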
 
Algos have also been created to trade upon news, using special programs to scan incoming agency reports for key words - relating, say, to a change in interest rates - and enact deals based on market responses to similar past events.
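
A hypothetical sketch of that keyword-scanning idea; the phrases and the actions they trigger are invented for illustration:

```python
# Scan an incoming headline for key phrases (here, about interest rates) and
# map a match to an action based on how markets reacted to similar past events.
from typing import Optional

PAST_REACTIONS = {
    "raises interest rates": "SELL_BONDS",
    "cuts interest rates": "BUY_EQUITIES",
}

def signal_from_headline(headline: str) -> Optional[str]:
    text = headline.lower()
    for phrase, action in PAST_REACTIONS.items():
        if phrase in text:
            return action
    return None

print(signal_from_headline("Central bank cuts interest rates by 50bp"))
# -> BUY_EQUITIES
```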
 
Growing demands for machine-readable data prompted Reuters and Dow Jones to launch rival services last June that deliver some news in a format more readily digested by machines, enabling algo programs to respond still faster.
 
Algo trading already accounts for an estimated 40 per cent of trades in US markets, and the proportion is growing fast. London algo trading rates are accelerating, though perhaps a year behind those in New York. Meanwhile, algo volumes are rising on Indian and Asian markets too.
 
Algo dealing has put exchanges under pressure to deliver faster trading, and encouraged the creation of so-called electronic communications networks, which provide alternative platforms for finance houses to trade assets.
 
That is because liquidity flows to the fastest market, aided both by automated trading and by regulations - Mifid in Europe, and the Regulation National Market System in the US - that require trades to be conducted in the market offering the best price.
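
In outline, best-price routing amounts to comparing quotes across venues and sending the order to whichever shows the best price. A simplified sketch, with illustrative venue names and quotes:

```python
# Sketch of best-price order routing of the kind Mifid and Reg NMS require.
def best_venue(quotes: dict, side: str) -> str:
    """quotes maps venue name -> quoted price; buyers want the lowest ask,
    sellers the highest bid."""
    if side == "BUY":
        return min(quotes, key=quotes.get)
    return max(quotes, key=quotes.get)

asks = {"Exchange A": 101.32, "ECN B": 101.31, "ECN C": 101.33}
print(best_venue(asks, "BUY"))   # -> ECN B, the venue offering the best price
```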
 
The role of dealers has changed. Some have become market Rambos, selecting algo programs like weapons in response to market events.
 
Mr Badman, the consultant, says: "It's like an Airbus. You have a computer that sits between the pilot and the flaps and the engines. The pilot says 'I want to go in that direction', and the computer decides how to do it."
 
Other dealers trade electronically, but manually. And those in oddball equities, or assembling complex package deals, may still rely on the phone.
 
Brian Traquair, president of capital markets and banking at systems company SunGard, says: "We see situations where an individual trader needs 10 or more processors simply to keep up with what he or she is doing.
 
"Traders need to take information from many sources and push it out to many people at other companies as fast as possible. That becomes an information processing problem."
 
Meanwhile, the ability of computers to do lots of little deals very fast has changed the nature of markets. Kirsti Suutari, global business manager for algorithmic trading at Reuters, says that whereas the average trade size on US equity markets in 1996 was more than 1,600 shares, today it is fewer than 400. So the number of trades has increased four-fold, merely to deliver the same volume.
 
That, too, causes a huge increase in the flow of data, which under Mifid regulations now also has to be stored for years.
 
Mr White, Reuters risk chief, argues that falling trading margins on simpler instruments helped drive financial institutions to replace lost profits with more complex products such as collateralised debt obligations.
 
Because they are little traded, valuing these is difficult. Equities can be "marked to market" - that is, valued at the market price - at the end of the day. But the value of CDOs and other structured investments is derived from computer models drawn up by quantitative analysts. Some of those models failed to take proper account of the "unlikely" events - such as a national fall in US house prices and soaring defaults - that have since occurred.
 
The mountain of data generated by these machines has become a problem in itself. "At the end of every trading day, the risk management system tries to work out the bank's net position, to help inform the next day's trading strategy," says Mr Badman.
 
"Because of the increase in trading volumes, and their growing complexity, it was taking more time than was available overnight to do the calculations." Some banks responded by investing, others by leasing server time elsewhere.
 
Unlike humans, computers do not, in theory, make mistakes. But by processing so many trades and valuations, so quickly, they can turn a tiny human error into a catastrophe before it is spotted, especially when an organisation is only totting up its net position once a day.
 
The only solution lies in more technology. "For traders," says SunGard's Mr Traquair, "the real challenge is to make sense of this fire-hose of information that is squirting at them." They need computers with more memory, and faster technology, that can cope with interruptions in data flow, he says.
 
Ian Berriman, financial systems specialist with PA Consulting Group, says: "The inclusion of a human in the loop would be desirable to guard against rogue algorithms, but this is impractical. In practice, the human is likely to play an active role only in those markets where the exchange actively seeks to constrain the use of algorithms.
 
"Exchanges are seeking to increase execution speeds and reduce network latency to retain their market position, but they have not invested similarly in market monitoring mechanisms that can intervene before algorithms spiral out of control.
 
"This lack of a 'dead man's handle' could eventually prove terminal for some markets. To protect their business interests, trading firms must implement systems and controls to validate, monitor and manage their own algorithms.
 
"Exchanges and marketplaces must play their part, too. They must monitor the behaviour of market participants' algorithms and, if necessary, suspend the entire market when it moves inexplicably."
 
Meanwhile, vendors agree, it is going to be a great year for sales of risk management technology.
 
The drive now is to develop and put in place real-time, dynamic, risk assessment. Only by constantly gathering key data on trading and valuation exposure from all these disparate systems can risk managers be sure neither machines nor humans are putting the institution at risk. "It's like putting sensors across an organisation", says Mr White.
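
A stripped-down sketch of that sensor idea: desks continuously report their exposure to a central monitor, which flags any breach of a limit as it happens rather than at the end of the day. The desk names and limits are illustrative:

```python
# Central hub receiving exposure reports from trading desks in real time
# and alerting whenever a reported exposure breaches its limit.
class RiskSensorHub:
    def __init__(self, limits):
        self.limits = limits          # desk -> maximum allowed exposure
        self.exposures = {}

    def report(self, desk: str, exposure: float):
        self.exposures[desk] = exposure
        if abs(exposure) > self.limits.get(desk, float("inf")):
            print(f"ALERT: {desk} exposure {exposure:,.0f} exceeds limit")

hub = RiskSensorHub({"credit": 50_000_000, "equities": 20_000_000})
hub.report("equities", 12_500_000)    # within limit, no alert
hub.report("credit", 61_000_000)      # breaches limit, alert fires
```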
 
David Sheriff, chief operating officer at business process management experts Microgen, concurs. "The financial services industry needs to find the happy medium between the need for speed and the right level of visibility if they want to prevent major trading gaffes," he says.
 
Even if there is a serious downturn, financial institutions will continue to invest in ever-faster systems because they are locked in a technology "arms race", says Mr Badman. But the overriding new priority is to develop and install systems that can limit the damage those ever-more-powerful trading weapons can cause.
