
IT Decisions - Utility computing - Pay as you work

Utility computing is being championed by the big names in the industry as IT's next big thing. The revolutionary idea of paying for your company's IT power in the same way that you pay your water or electricity bills could be just around the corner.

A new purchase requisition has landed on your desk. Apparently the office has run out of electricity, and you urgently need to spend a couple of million on a new power plant.

It’s not going to happen, is it?

So why do you have to open the corporate coffers every time you need new IT processing power, data storage capacity or software? If the latest trend in IT achieves its aim, within a few years you may not have to.

Some of the most influential IT suppliers are developing products and services to allow customers to buy computing as a utility, paying only for what you use, as you use it, in the same way as you would for electricity or telephone calls.

The implications of such a move are enormous. No more computer systems hogging expensive floor space, no more software failures stopping you from trading, no more pushy IT salesmen with a deal to close every quarter.

Just a terminal on your desk, a plug socket in the wall, and infinite technology resources at the touch of a button.

It’s an immensely appealing concept with the potential to revolutionise the IT industry. IDC analyst David Tapper says utility computing services “could prove to be a watershed”.

“The provision of computing services as a set of utility offerings could radically alter the cost structure of the services and place tremendous pressure on existing players to alter their service infrastructures,” says Tapper. “The outcome of this shift could be equated to Dell’s rise to leadership in the PC industry, a leadership built on a radically efficient infrastructure that altered the PC world and changed the players.”

IBM goes as far as predicting that the world’s first commercial utility computing service will be launched within the next 12 months, although it is likely to be targeted at a specific niche, probably pharmaceutical research.

Suppliers first started openly discussing the idea last year. At Hewlett-Packard’s (HP) launch of a new top-of-the-range server in September 2000, chief executive Carly Fiorina proclaimed, “We have placed a bet on the future of computing – it is utility computing.”

HP customers backed the initiative. Charlie Bell, vice president of infrastructure at online retailer Amazon.com, called it “the internet dial tone”.

The first product from HP to move towards the utility concept was released in July: a pay-per-use software program for HP servers that charges the customer according to average processor usage each month. It’s a small step, and even HP admits only 10% of its users are likely to want it, but the company’s mega-merger with Compaq – another supplier to put an exploratory toe into utility computing – could speed up development. HP’s move is one of a number of industry initiatives that will contribute towards the greater whole and merge over time.
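To make the pricing model concrete, here is a minimal, hypothetical sketch of how a charge based on average monthly processor usage might be worked out. The sampling frequency, rate and function names are illustrative assumptions, not HP’s actual mechanism.

```python
from statistics import mean

def monthly_bill(cpu_samples, rate_per_point=50.0):
    """Hypothetical pay-per-use charge: a flat fee for each percentage
    point of average processor utilisation over the billing month.

    cpu_samples -- utilisation readings (0-100) taken at regular intervals.
    """
    return mean(cpu_samples) * rate_per_point

# A server that idles at 10% for three weeks, then runs at 90% for a
# week, pays for its average use rather than for peak capacity.
samples = [10] * 21 + [90] * 7          # one reading per day
print(f"Average use {mean(samples):.1f}%, bill ${monthly_bill(samples):,.2f}")
```

The point of the model is the last line: the customer is billed on measured consumption, not on the size of the box they bought.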

The application service provider (ASP) model has the potential to develop into a utility service, but has a long way to go. With many ASPs, the only difference from operating your own systems is that the servers are hosted elsewhere and owned by someone else. In most cases users still have to pay for software licences. It’s a bit like an electricity provider offering specialised power to run your hair dryer or your television.

But in the US, some service providers have started to evolve a more utility-oriented model. In June, Exodus and Ejasent combined to launch a pay-as-you-use service for access to applications running on Sun Microsystems’ Solaris operating system. The previous month, another US outfit, NaviSite, introduced a buy-on-demand service to cater for excess web hosting capacity.

Software giant Oracle is investing heavily in what has become known as “software as a service”, which takes the ASP concept a step further. Oracle offers certain functions from its 11i application software via the Internet as a monthly rental service, and has stated that it intends to offer its complete e-business suite in this way. Specialised start-ups such as salesforce.com are offering access to web-based software applications for as little as $50 per user per month. And Microsoft plans to introduce online services charged on a subscription basis as part of its .Net and Hailstorm initiatives.

None of these are fully fledged utility computing services, but they do show that a gold rush is starting and suppliers are dashing to stake their claims.

“It’s all about winning users’ hearts and minds,” says Tapper. “These companies have an opportunity to establish themselves as first movers, which can many times bolster mindshare. The advantage is leveraging the short-term wins that can result from building customer awareness and then leveraging these wins to build the economies of scale that will be required to support the expected demand for these services,” he adds.

But most of these tactical or limited-scale initiatives are put into context by the plans of arguably the only company with the infrastructure to make computing as easy to use as telephony – IBM.

In August, IBM announced plans to build on its experience in scientific and academic research to offer what it hopes will be the capability to deliver a genuine utility service.

It is using a technology called Grids – networks of massive computers that appear to a user as a single, enormous processing entity. Grids allow organisations to share geographically distributed applications, data and computing power over the Internet. It means one company’s data could be used in another’s applications running on someone else’s hardware regardless of the technology.

“The grid allows you to transparently manage the IT infrastructure and share knowledge and resources. You could then form virtual organisations to work together with other organisations,” explains Daron Green, IBM’s European head of grid technology.
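To illustrate the idea (and only the idea), the toy sketch below presents several machines to the caller as a single processing entity. Real grid middleware such as the Globus software adds resource discovery, security and data movement on top of this; the class, node names and job here are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

class ToyGrid:
    """Toy illustration of the grid concept: callers submit work to one
    entity and never need to know which machine actually runs it."""

    def __init__(self, nodes):
        self.nodes = nodes
        self.pool = ThreadPoolExecutor(max_workers=len(nodes))

    def submit(self, task, *args):
        # A real grid scheduler would pick a node based on load, data
        # locality and access rights; here a thread pool stands in.
        return self.pool.submit(task, *args)

def simulate(sample_id):
    # Placeholder for a compute-heavy job, e.g. pharmaceutical research
    return f"{sample_id}: analysed"

grid = ToyGrid(nodes=["cambridge", "edinburgh", "cern"])
results = [grid.submit(simulate, f"compound-{i}") for i in range(3)]
print([r.result() for r in results])
```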

Grids operate in the academic community today. IBM is working with the UK government to create the national grid for collaborative scientific research, building the network with nine UK universities. The UK grid will link to CERN, the European nuclear research organisation, to allow scientists across the UK to work together on sophisticated high-energy physics research.

“The UK is clearly taking a leadership role in the development of grid computing,” said David Turek, IBM’s vice president of emerging technologies.

In the US, IBM is working with the National Science Foundation to build the world’s most powerful computing grid, known as the Distributed Terascale Facility (DTF). It will connect more than 1,000 computers at four separate locations, hold 600 terabytes of data (the equivalent of 146m full-length novels), and achieve 13.6 trillion calculations per second. IBM claims DTF will be the world’s fastest research network, allowing scientists to search for new breakthroughs in life sciences and climate modelling.
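A quick back-of-the-envelope check of those figures (the per-novel size is simply what the comparison implies, not a published specification):

```python
storage_bytes = 600e12                     # 600 terabytes
novels = 146e6                             # 146 million full-length novels
print(storage_bytes / novels / 1e6)        # ~4.1 MB of text per novel, as implied

ops_per_second = 13.6e12                   # 13.6 trillion calculations per second
print(ops_per_second * 86_400)             # ~1.2e18 calculations in a single day
```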

Work on DTF will not start until the third quarter of 2002, but the experience IBM gains will go a long way towards creating commercial utility computing grids.

The key to making the grid work is software developed by Globus, an open source project – a global, collaborative effort of developers working together on free-to-use applications.

The Globus software will provide the “glue” that seamlessly stitches all the computing resources together.

Globus software is developed for the academic environment, so IBM is converting it into a stable, robust product suitable for the commercial world. It will be combined with IBM’s own Project Eliza, an initiative to create “self-healing” computer networks, capable of identifying potential problems and automatically reconfiguring the systems to avoid crashes.

The idea is very similar to the way the electricity national grid works, re-routing power to avoid broken cables.
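As a rough, hypothetical sketch of that self-healing behaviour (an illustration of the concept, not IBM’s Project Eliza): monitor each node, and when one fails, take it out of service and reassign its work to the survivors.

```python
import random

nodes = {"node-a": [], "node-b": [], "node-c": []}
jobs = [f"job-{i}" for i in range(9)]

# Spread the workload evenly across the available nodes.
for i, job in enumerate(jobs):
    list(nodes.values())[i % len(nodes)].append(job)

def healthy(node):
    """Stand-in for a real probe (heartbeats, error counters, and so on)."""
    return node != "node-b"                 # simulate node-b failing

for node in list(nodes):
    if not healthy(node):
        orphaned = nodes.pop(node)          # withdraw the failed node
        for job in orphaned:                # re-route its work, much as the
            random.choice(list(nodes.values())).append(job)  # power grid routes around a broken cable

print(nodes)                                # node-b's jobs now run elsewhere
```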

IBM believes it is on the verge of turning the concept into reality.

“The technology is here and working and we are grabbing the opportunity by the scruff of the neck,” said Green. But the technology alone will not be enough. The DTF grid will cost $53m to build, for a highly specialised and focused market. The investment needed to build a widely available utility service could run into billions, and even IBM would be cautious about committing to that level of funding.

It took decades to develop global energy and telecoms utilities, and the IT utility will not happen overnight. It will start as localised initiatives, possibly within major corporations, and as it spreads each grid will be connected to the next, building into one worldwide entity. It’s exactly the way the internet developed – many experts are already calling it the future of the internet. With the web as a precedent, and with the appeal and demand likely to be huge, utility computing is the inevitable evolution and maturation point of the IT industry.

Once the momentum starts, its effect on IT suppliers will be revolutionary.

“A shift of the IT industry towards computing utility is expected to result in fewer, yet much larger players,” said Tapper. “The big question is: should IT suppliers enter [the market] quickly or play a wait-and-see game? With the stakes so high, making the wrong move could prove disastrous, but making the right move could prove brilliant.”
