
IT strategy: Trouble in store

Albert Einstein famously observed that technological change is like an axe in
the hands of a pathological criminal. And many finance directors could be
forgiven for thinking they are on the wrong end of this axe, wielded as it is by
IT departments demanding firms pay through the (bloodied) nose for the latest
and greatest technology.

The blistering churn of IT innovation is more than enough to create very
serious headaches for unwary companies. An excellent and sobering example of the
way the march of progress can lead to sudden and unexpected technological
obsolescence came into focus recently, after the launch of rival next-generation
DVD formats.

In the blue corner was Sony with its Blu-ray technology; in the red corner
was Toshiba with its rival HD DVD format. Despite winning the backing of
industry giants including Microsoft, HD DVD was dropped entirely less than two
years after launch, leaving all those who had purchased players or discs with,
respectively, expensive paperweights and coffee coasters.

Just ask anyone who has all their favourite films on Betamax tapes what they
think about technological change. While the demise of the Betamax and HD DVD
formats most directly affects consumers, businesses are even more at risk of
falling foul of technology’s inexorable march, as the data locked into their
systems is so much more valuable.

So a recent low-key announcement that some boffins in the US were developing
technology to allow long-term storage of data caught the eye. The researchers at
the Baskin School of Engineering at the University of California, Santa Cruz are developing a
new approach to replace ageing data storage technologies, such as tape
libraries. Called Pergamum, the technology uses hard disk drives to provide
energy-efficient, cost-effective storage. The idea is to create a distributed
network of intelligent, disk-based storage devices.

Though not trendy or fashionable, the work of these boffin types is of
critical importance for any company that uses computers and stores digital data.
The scientists are trying to solve the problem of how to build a large-scale
data storage system to last, not five years into the future, but 50 to 100
years. In IT terms, this is several lifetimes: if you can imagine how
information technology looked in 1908, then you can get some sense of the
monumental scale of developments that are likely to emerge in such a
mind-bogglingly long time frame.

But the hardware side of things is, unfortunately, the relatively easy part
of the data storage problem for corporates. Far more challenging in terms of
keeping data accessible for the future is selecting the right software format.

Storage of something as seemingly innocuous as electronic office files
including spreadsheets, charts, presentations and word processing documents
suddenly turns into a potential minefield. It is vitally important to ensure
that this business-critical data remains accessible in the future, long after
the applications used to create it have been superseded.

To achieve this there are a number of choices: should the files be stored in
a format conforming to Microsoft’s recently ratified Office Open XML (OOXML)
standard? Or, instead, opt for the OpenDocument format (ODF), developed by an
OASIS technical committee and backed by, among others, Sun Microsystems and
IBM?
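One reason both camps can claim longevity credentials is that ODF and OOXML files are, under the bonnet, simply ZIP containers full of XML parts, so even if the original application disappears, ordinary tools can still crack the files open. As a hedged illustration (the file and its contents here are invented for the example, and are not a complete, valid ODF package), Python's standard zipfile module is enough to build and inspect such a container:

```python
import zipfile

# Both ODF (.odt) and OOXML (.docx) documents are ZIP archives of XML
# parts. Build a minimal, ODF-style package by hand -- illustrative
# only, not a fully valid .odt -- then inspect it with standard tooling.
with zipfile.ZipFile("example.odt", "w") as pkg:
    # In a real ODF file the mimetype entry identifies the format.
    pkg.writestr("mimetype", "application/vnd.oasis.opendocument.text")
    pkg.writestr("content.xml",
                 "<?xml version='1.0'?><office:document-content/>")

with zipfile.ZipFile("example.odt") as pkg:
    print(pkg.namelist())                 # ['mimetype', 'content.xml']
    print(pkg.read("mimetype").decode())  # the format identifier
```

The point for long-term storage is that nothing in this round trip depends on the application that wrote the document, only on openly documented container and markup standards.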

In addition to HTML, ODF and OOXML, there is a veritable alphabet soup of
other formats in which data can be stored. The list, which has the capacity to
bewilder even the most technically minded, includes Darwin
Information Typing Architecture, Encoded Archival Description, DocBook,
Extensible HyperText Markup Language, Maker Interchange Format, Open
Mathematical Documents, Text Encoding Initiative, troff (typesetter runoff) and
groff (GNU runoff). And this is before we have even looked at storage of core
financial and business modelling data in formats such as XBRL.

There are many technologies that seem so much a part of our everyday
corporate IT infrastructures, it is all but inconceivable that they would ever
disappear. However, the transitory nature of many of these seemingly core IT
components was brought home to me recently when I interviewed Bob Metcalfe, the
inventor of the ubiquitous Ethernet networking protocol.

He kicked off by admitting that the technology he had so famously invented
was doomed. He did quickly add the important caveat that the technology which
would inevitably take over from Ethernet and replace it would be called
Ethernet. However, this does not alter the fact that a technology seemingly so
fundamental to corporate IT infrastructures as Ethernet is in a state of
permanent flux.

In this highly dynamic environment it is clear that savvy firms must take
steps to implement data storage strategies that are not only capable of
satisfying their business needs today, but which are flexible enough to
accommodate future developments. Failure to achieve this will mean that
business-critical data assets could quite literally be lost forever.

Plus ça change, plus c’est la même chose.
