From reading the latest vendor announcements telling us that Software as a Service (SaaS) and cloud computing are going to transform how IT is provided to the enterprise, it is easy to forget recent history.
It was only five years ago that Gartner was predicting that by 2008 utility computing (a broadly similar concept) was going to be in the mainstream. At the time, there was a huge wave of publicity from the big vendors about how utility computing was going to transform the way IT services were provided.
Plainly, utility computing has not yet become a mainstream proposition. So what has happened to it in the past five years, and will things be different this time round?
Five years ago, many of us in the industry were working with clients, helping to set up some of these utility computing sourcing deals. It soon became clear from the fine print that on-demand services were available only under certain circumstances, such as with applications hosted on mainframes. That was hardly earth-shattering when you consider that mainframe technology has allowed virtualisation and charging-per-use for decades.
Of course, there were differences. But fundamentally, the deals were never as exciting as they were hyped up to be. Largely this was because the necessary virtualisation technology was not mature enough to allow the foundation IT infrastructure to be run as flexibly as true on-demand services require.
This time round, the vendors are playing a subtly different tune. They seem to be saying: “With virtualisation technology, such as the near-ubiquitous VMware, now fully mature, we really can deliver utility computing. Trust us.”
However, we are finding that while businesses are more than happy to use virtualisation technology within the boundaries of their own organisations for its flexibility and efficiency gains, they are not really paying much attention to utility computing, be it SaaS or cloud computing or whatever it is being called this week. Indeed, according to silicon.com’s recent CIO Agenda 2008 survey, utility computing was languishing at the bottom of the wish list next to Vista and RFID.
But let us set this indifference aside and, for the sake of argument, assume that we trust some of what the vendors are now saying, and that we are disciples of Nicholas Carr, author of “Does IT Matter?”, and believe that IT can be purchased as a utility. So what should we do now?
The first thing to do is to add a word to the title of Mr Carr’s 2003 Harvard Business Review article to make it “IT doesn’t matter . . . sometimes”. The challenge therefore becomes working out what “sometimes” is. There are three steps that need to be taken to define this IT strategy for the real world.
Step one is to recognise that within most organisations of more than 100 people there are unique business requirements where the IT involved does have to be tailored to individual circumstances. This may be because of unique regulatory and security requirements that strictly stipulate where data is stored and processed (eg, the public sector).
Perhaps more significantly, it could be because IT can be used in innovative ways to provide a competitive edge. The big banks, for example, will put a lot of effort into running their proprietary risk models quicker than their rivals – even down to developing their own hardware. In these instances adopting a utility computing model and migrating on to processors and storage somewhere or other in the world is not going to be viable.
The second step is to acknowledge that there may well be cases where legacy information systems are working pretty well. Typically, this is IT that was installed around five years ago (and we have seen 15-year-old systems that continue to work well). All the painful deployment wrinkles have been ironed out and if you take care of the underlying hardware, the systems more or less run themselves. Why bother replacing such a system when the cost and pain of moving it to a new utility computing platform will not pay back within a sensible timescale?
The third and final step is to identify where there really are areas that IT can be sensibly commoditised and handed over to an IT utility service provider.
These are likely to be areas that do not have an unusual business process, that are not sensitive to user response times, where there are no legal implications for data storage, and where availability is not super critical (Amazon’s much publicised “Elastic Compute Cloud” gives 99.9 per cent availability – which translates into more than a minute of downtime a day).
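The downtime arithmetic behind an availability figure is worth making explicit when weighing up a utility service. A minimal sketch (illustrative only; the function name and the SLA figures shown are examples, not taken from any vendor's documentation):

```python
# Illustrative: convert an availability percentage into permitted downtime.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def daily_downtime_seconds(availability_pct: float) -> float:
    """Seconds of downtime per day implied by a given availability percentage."""
    return SECONDS_PER_DAY * (1 - availability_pct / 100)

# 99.9 per cent availability allows roughly 86.4 seconds a day -
# "more than a minute of downtime a day", as noted above.
print(round(daily_downtime_seconds(99.9), 1))
```

The same sum scaled to a month or a year (roughly 43 minutes and 8.8 hours respectively at 99.9 per cent) is often the more sobering figure to put in front of the business.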
We can also say, from PA’s own experience of using the Second Life platform, that businesses should make sure any scheduled downtimes – designed to minimise impact for Californian businesses – do not disrupt them too much.
Given the above, you may well conclude that for now you can use utility computing to do some one-off business data analysis that requires a burst of server and CPU resource, or you may decide that your sales team can easily be supported by the Salesforce.com service without risk to the business.
Being a cautious early adopter is not a bad place to be with utility computing. Despite the rebranding exercises and the continuing hype, this time around there may well be some benefits available, even when the real world constraints are considered.
However, it would be a very foolish CIO who bet the business on a major shift to the utility computing model. That would mean ignoring why IT still does matter to the business, and will still do so in five years, when we are calling utility computing something else again.
Alastair McAulay is a senior IT consultant at PA Consulting Group