
Is your data centre up to speed?

Alan Cane 
Financial Times
2 April 2008

Data centres are set to come under unprecedented scrutiny, as governmental and non-governmental bodies seek to contain and reduce the IT industry’s prodigious appetite for electrical power.

A strong catalyst has been last year’s report to US Congress by the US Environmental Protection Agency. This revealed that in 2006, US data centres consumed about 60bn kilowatt-hours, roughly 1.5 per cent of the country’s total consumption of electricity.

The report went on to say that existing technologies and strategies could cut typical server energy use by 25 per cent and that new technologies could reduce the burden even further.
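The figures quoted above can be checked with simple back-of-envelope arithmetic (a sketch using only the numbers in the report as reported here):

```python
# Back-of-envelope check of the EPA figures quoted above.
data_centre_kwh = 60e9  # ~60bn kWh consumed by data centres in 2006
share = 0.015           # ~1.5% of total US electricity consumption

# If 60bn kWh is 1.5% of the total, the implied national total follows.
implied_national_total = data_centre_kwh / share
print(f"Implied US total: {implied_national_total / 1e9:,.0f}bn kWh")  # 4,000bn kWh

# The report suggests existing technologies could cut server energy use by 25%.
savings_kwh = data_centre_kwh * 0.25
print(f"Potential saving: {savings_kwh / 1e9:.0f}bn kWh")  # 15bn kWh
```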

Meanwhile, in Europe, the Joint Research Centre of the European Commission is at work on a voluntary code of practice for data centres aimed at reducing “electrical consumption in a cost-effective manner without hampering the mission-critical function of data centres”.

All of this is turning a harsh spotlight on the data centre, traditionally the less-than-visible powerhouse behind corporate and government computing and, in particular, on those older data centres hampered by design weaknesses and technology failings.

Liam Newcombe of the British Computer Society’s Data Centre Specialist Group noted recently: “As more pressure is applied to data centres to become more efficient, their operators will soon be targeted, measured, grouped or labelled by the efficiency of their facility.”

Osca St Marthe, of the Morse consultancy, points out that while there are no statutory guidelines or codes of best practice for constructing data centres, a number of organisations are building up a body of knowledge to help centre operators make the best of their existing facilities or create new ones.

He advises: “Before spending a significant amount of money on a new data centre, there are housekeeping activities which will ensure you maximise the assets in your existing centre.”

The US-based Uptime Institute, which provides consulting services to more than 100 data centre operators, specialises in tracking centre energy costs to provide industry benchmarks, while the Green Grid, a global consortium of hardware and software vendors, is developing methods of measuring data centre performance.

The heart of the matter, according to Steven Salmon, a principal adviser specialising in data centre management at the consultancy KPMG, is that data centres commissioned in the late 1990s or early 2000s followed different design rules from those which would be applied today, now that power efficiency and resilience are at a premium.

In those days, single-server technology was king. Today’s high density computing environments – racks of blade servers – are hungry for power which the design of the centre cannot sustain. “To change the design would require significant investment,” he says. “I know of data centres with plenty of space but which have run out of power.”

It means that centres only a few years old are already past their prime; indeed, some experts would argue that a centre that is more than four years old represents a threat to the health of the organisation.

A survey in the US carried out by the market intelligence group Aperture Research Institute (ARI) suggests that more than a third of the companies it canvassed are in this position.

It argues that these older data centres are ill-equipped to cope with the intense power and cooling demands of modern hardware: “This problem can only get worse, as the enterprise continues to adopt high-density hardware. For example, a recent ARI survey showed that one in five of new servers are blades. Blade servers, in particular, cause problems for data centre managers, as they are forced to contend with significant power consumption and intense heat generation.”

The ARI found that only one-third of the organisations it surveyed were beginning to plan and build new centres, leaving the remaining two-thirds more than two years away from operating a new facility, even if one were already at the planning stage.

Not everybody would agree with the ARI’s more pessimistic conclusions. According to Dr Albert Esser, vice-president, data centre infrastructure for Dell Global, there are two things operators of ageing data centres should not do: “One is to rip and replace the entire infrastructure and the second is to build new facilities.

“With the current uncertain economic climate, combined with the need to be energy-efficient, the best approach is for customers to consider ways to prolong the life of their data centres – not replace them.

“By making the right decisions with regards to their IT infrastructure, customers can ‘reveal a hidden data centre’ within their existing facility. By taking a comprehensive approach and evaluating everything from the component level to the facility, customers can increase synergy between equipment, power utilisation rates, cooling and software solutions such as virtualisation.”

Virtualisation – running more than one application or operating system on each server – remains the technology of choice for reducing power, space and cooling requirements. Sandor Chandon of Interoute, the networking group, argues that virtualisation is the key to “greening” the data centre.

“Virtual server technology will be the most effective way to run services within the data centre. A data centre using physical machines rather than virtual servers would require 10 times more rack space and significantly more power.”
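The scale of the claimed saving can be illustrated with a rough consolidation calculation (a sketch only: the 10:1 ratio comes from the quote above, but the estate size, rack density and per-server wattage are invented for illustration):

```python
# Illustrative consolidation sketch based on a 10:1 virtualisation ratio.
# All inputs except the ratio are assumed figures, not from the article.
physical_servers = 400    # assumed existing estate
consolidation_ratio = 10  # virtual servers per physical host, per the quote
servers_per_rack = 20     # assumed rack density
watts_per_server = 450    # assumed average draw per machine

hosts_needed = -(-physical_servers // consolidation_ratio)  # ceiling division
racks_before = -(-physical_servers // servers_per_rack)
racks_after = -(-hosts_needed // servers_per_rack)

print(f"Hosts after consolidation: {hosts_needed}")  # 40
print(f"Rack space: {racks_before} racks -> {racks_after} racks")  # 20 -> 2
print(f"Power draw: {physical_servers * watts_per_server / 1000:.0f}kW -> "
      f"{hosts_needed * watts_per_server / 1000:.0f}kW")  # 180kW -> 18kW
```

In practice a heavily loaded virtualisation host draws more power than a lightly loaded standalone server, so the real saving would be smaller than the raw machine count suggests.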

The difficulty of updating existing data centres leads to some ad hoc practices, he says.

“It is hard to move and change equipment in a room that requires a constant temperature and still meet service-level agreements,” he explains. “This can lead to haphazard, short-term solutions: portable air-coolers and back doors left open to aid cooling are not unheard of. Racks are placed under general office desks because of lack of space in the data centre.”

Alastair McAulay, IT systems specialist with PA Consulting, has witnessed similar horror stories, where IT infrastructure has been exploited “until it is squeezed dry”.

“With this type of mindset, data centres – if they can be called that – end up relegated to basements in head office or slotted into a few spare disconnected rooms scattered around an existing building,” he says. “And somehow or other, the IT department makes this work.

“It is not unusual to find facilities where electric fans bought from a nearby shop are positioned around the floor to maintain an adequate airflow. If the thermometer rises to a certain point, servers are switched off.”

He says such arrangements can never meet the demands of modern business flexibility and that alternatives such as outsourcing have to be considered: “Whether outsourced or implemented internally, data centres are big cost-items with long lead times.

“Understanding the options for moving data centres and deciding if it is the right thing to do and how best to do it requires a combination of clear strategy and excellent execution.”

But that presupposes an understanding of the existing situation and the evidence is that too few companies can list their IT assets or assess their efficiency.

An international survey carried out by Quocirca, the consultancy, on behalf of Global Data Centre Management (GDCM), the London-based software group, shows that just under a third of data centre managers do not know how many servers they own or what devices are attached to the network.

Some 55 per cent of centre managers do not know how much power they are using or how much it costs, and fewer than half measure server utilisation: the rest guess their utilisation is about 75 per cent, while the industry average is known to be below 25 per cent.
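The gap between guessed and measured utilisation is easy to quantify once per-server figures are actually collected (a sketch with invented sample data; how the measurements are gathered, whether by agent, SNMP or hypervisor API, is left open):

```python
# Fleet-average utilisation from per-server measurements.
# The server names and figures below are invented sample data.
measured = {"web-01": 0.12, "web-02": 0.08, "db-01": 0.35,
            "app-01": 0.22, "app-02": 0.15, "batch-01": 0.05}

average = sum(measured.values()) / len(measured)
print(f"Measured fleet average: {average:.0%}")  # 16%, far below a 75% guess

# Servers idling below a threshold are candidates for consolidation.
candidates = sorted(name for name, util in measured.items() if util < 0.25)
print(f"Consolidation candidates: {candidates}")
```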

Peter Armstrong, corporate strategist with BMC Software, thinks that skills such as capacity planning and impact analysis, neglected during the years of free spending on IT, will have to be relearned. New technologies such as virtualisation promise huge savings but present great risks.

“The question is: how do I manage this implementation in a controlled fashion, so that somebody cannot simply throw up a virtual image of the system, log on and do irreparable harm to the company?” he says.

His colleague, Leah Anathan, European business manager, believes that finding extra physical space and getting more power into centres has a new and welcome cultural dimension: “The line between facilities and IT is breaking down. At one time, there was no common language, as if the two functions were totally unrelated.”

Now, she contends, both have to be represented at discussions on the future of the data centre.
