
Grid computing – Grid and bear it

Distributed computing is an idea born in the 1960s. But it has taken astronomers in the 21st century to show that connecting idle PCs together can create supercomputers.

When it came to funding IT projects, finance directors rarely kept their chequebooks in their pockets in the latter half of the last decade.

Spending on emerging technologies exploded, as organisations rushed to web-enable their customer-facing systems and overhaul their creaking back-office IT to fend off the Millennium Bug. But the harsh reality was that most of the technology was not needed.

Well, like a brewery diversifying into hangover cures, the IT industry says it has a tonic for businesses that overindulged in the 1990s – grid computing. The idea is simple yet compelling.

At any given moment in a big company, there will be a few computers struggling to handle a particular workload while many of the others sit idle. By connecting computers so that processing and storage capabilities are pooled, spare capacity can be shared out wherever it is needed.
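
To make the pooling idea concrete, here is a minimal, hypothetical sketch in Python of a coordinator handing work units to whichever machines have nothing else to do. The names and the toy "computation" are purely illustrative, not any vendor's grid product; real grid middleware adds scheduling, security and fault tolerance on top of this basic pattern.

```python
import queue
import threading

# A central queue of work units stands in for a large job split into chunks.
work_units = queue.Queue()
for n in range(20):
    work_units.put(n)

results = []
results_lock = threading.Lock()

def idle_machine(name):
    """Simulates a spare PC pulling work whenever it has nothing to do."""
    while True:
        try:
            unit = work_units.get_nowait()
        except queue.Empty:
            return                      # no spare work left, go back to idling
        answer = unit * unit            # stand-in for a real computation
        with results_lock:
            results.append((name, unit, answer))
        work_units.task_done()

# Five "idle PCs" chew through the queue in parallel.
workers = [threading.Thread(target=idle_machine, args=(f"pc-{i}",)) for i in range(5)]
for w in workers:
    w.start()
for w in workers:
    w.join()

print(f"{len(results)} work units completed by {len(workers)} pooled machines")
```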

In recent years, IT infrastructures have become incredibly complex and inefficient, making their integration and management an enormous and expensive challenge. Scott McNealy, CEO of Sun Microsystems, admits that things have got out of hand. “A systems administrator today can manage somewhere between 15 and 30 systems, but it should be 500,” he says. “It takes weeks to deploy a new network service, whereas it should take days or hours. We’d prefer it if our customers were able to spend that money on computer power, storage capacity and increased bandwidth.” Grid, he says, is one answer.

While his last remark betrays a motive of self-interest, most IT directors confirm that much of what McNealy says rings true. Sun, in fact, claims to have increased utilisation rates within its own IT infrastructure from about 20% to almost 90% since deploying a number of internal grids in the past two years.

The idea behind grid is not especially new. In the 1960s, computer scientists developing ways to transfer files between networked computers also began toying with the notion of IT as a shared utility. It remained a fanciful dream for nearly 30 years, before the release of distributed computing and resource management software toolkits.

These and other advances helped grid gain momentum at the end of the 1990s. Altruism, as well as commercialism, drove some of the early schemes and brought grid some valuable media attention. One typically grandiose project, SETI@home, set out to draw on millions of idle PCs to process data from a radio telescope searching for alien life (see box). Another harnessed idle machines to research possible treatments for Aids.

Elsewhere, many research institutions in the US and Europe began linking up their machines and sought grid computing partnerships with the private sector. In one example, three English universities – Leeds, York and Sheffield – created a grid of hundreds of computers and servers and invited businesses as well as students to tap into the resource. Shell and Rolls-Royce, among others, were happy to oblige.

So-called internal grids began emerging too. Organisations with large computing requirements connected thousands of PCs, workstations and servers within the firewall. They found it was like having access to a supercomputer without actually having to buy one. A number of large financial institutions, including US investment bank JP Morgan, are exploring this approach to computing.

In the last two years, there has been an increase in IT industry hype around the theme of grid computing. In one of the biggest gambles in its long history, IBM says it is investing some $10bn in the concept. Other big guns, including Sun Microsystems and Hewlett-Packard, have invested considerable intellectual capital in it too.

Their combined marketing message is simple: grid computing is ready for prime time. This, they say, is because two important developments have coincided. First, finance directors and CEOs are reluctant to sanction a fresh increase in IT spending. Second, grid’s constituent technologies have matured.

Observers say the industry has got it only half right. Yes, many organisations are reluctant to raise IT expenditure, but no, grid technologies are far from mature. Immature technologies have three things in common: they are unreliable, insecure and expensive. The last is the most damning in grid’s case.

IT budgets may fall as a result of grid computing, but probably not until the medium term. The underlying technologies for grid have to be acquired first, often at considerable cost, and other hardware may need upgrading too. Since there are so few grid deployments, integration and project-consulting fees are likely to be high.

The very nature of grid computing creates other problems. For grid to work, disparate systems must be compatible. That has led some critics to suggest that organisations will feel compelled to deal with only one grid supplier. In the past, when so-called vendor lock-in was common, customers complained about high prices and poor service.

And organisational and political obstacles exist. “In many companies, culture issues are keeping companies from becoming more efficient,” says Thomas Bittman, a Gartner analyst. A recent survey of 50 global Fortune 2000 companies, commissioned by Platform Computing, supports this view. It found that internal political issues stood in the way of grid’s widespread commercial acceptance. One such issue is known as server-hugging – the reluctance of departments such as HR and marketing to give up control of servers they consider their own to a centrally managed grid.

There is a potential conflict with the IT department too. As happened when IT outsourcing began to take hold in the 1980s, many technical staff feel threatened by grid. But there is one clear distinction between IT outsourcing and grid computing that makes the latter’s threat even more potent. In most outsourcing contracts, some IT staff members are transferred to the outsourcer as part of the deal, but there is no staff transfer with grid computing. This has fuelled speculation that suppliers will bypass IT departments and pitch contracts directly to the FD or CEO.

The obstacles to grid’s widespread adoption mean that “companies are missing opportunities to reduce costs, leverage existing IT resources and increase competitive advantage,” says Ian Baird, Platform’s chief business architect.

“We are confident that technological challenges will be overcome. The real challenge is getting people to collaborate and share resources.”

In the private sector, there may be fewer than 10,000 companies in the world that have employed some form of grid computing, and only a handful of those are using grid in the manner in which it was originally intended.

In that context, the failure of grid to progress beyond the early adopter stage seems less surprising, and IBM’s $10bn punt on the technology begins to carry an element of market-making – a revival of the discredited notion of ‘build it and they will come’ that so helped to inflate the dotcom bubble.

IT PHONE HOME
The Search for Extraterrestrial Intelligence (SETI@home), conceived in 1996 by two American scientists, is a project that analyses the millions of parcels of data received by the Arecibo radio telescope in Puerto Rico. The SETI team encourages the public to download a special screensaver that enables common or garden PCs to analyse astronomical data while they are idle and then send the results back to SETI. The aim is to discover evidence of extraterrestrial life.
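
The client-side loop such a screensaver follows can be sketched in a few lines of Python: fetch a parcel of data, crunch it only while the PC is otherwise idle, then post the result back to the project's servers. The function names below (fetch_work_unit, machine_is_idle, upload_result) and the toy "analysis" are illustrative assumptions, not the real SETI@home software or API.

```python
import random
import time

def fetch_work_unit():
    """Pretend download of a chunk of radio-telescope data."""
    return [random.random() for _ in range(1000)]

def machine_is_idle():
    """Stand-in for a real check of keyboard, mouse and CPU activity."""
    return True

def analyse(samples):
    """Stand-in analysis: report the strongest 'signal' in the chunk."""
    return max(samples)

def upload_result(result):
    print(f"reporting peak signal strength {result:.4f} back to the server")

# Crunch one work unit per pass, but only while the machine is idle.
while machine_is_idle():
    data = fetch_work_unit()
    upload_result(analyse(data))
    time.sleep(1)   # a real client would wait for the next parcel of data
    break           # single pass is enough for this sketch
```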

The project was launched in May 1999 and was the first example of global-scale distributed or grid computing – linking idle computers via the Internet to tackle large mathematical problems.

SETI has captured the public’s imagination. Sponsored by The Planetary Society and Sun Microsystems, among others, the project has over four million PCs running computations, has received 833 million results and has clocked up 1.4 million years of computing time. “This is the culmination of more than three years of computing – the largest computation ever done,” says David Anderson, director of SETI@home.

But despite the huge scale of the project, SETI’s scientists do not believe that signs of extraterrestrial life will be found for another 20 years.

The SETI@home project can be found at https://setiathome.ssl.berkeley.edu.
