Truth 1.0

The time was when hard drive storage was very expensive. But that was then, and this is now – when storage is cheap as chips. And those chips are getting even cheaper all the time, as described by the rather obscure Kryder’s Law, which holds that hard drive capacity per unit of fixed cost doubles roughly every two years.
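
If that two-year doubling held exactly, the arithmetic is simple enough to sketch. The snippet below is a minimal illustration only; the starting capacity and fixed spend are hypothetical figures chosen to show the shape of the curve, not any real price list.

```python
# Back-of-the-envelope illustration of the two-year doubling described above.
# The starting point (500 GB for a fixed spend in year 0) is purely hypothetical.

def capacity_per_cost(years_elapsed: float,
                      base_gb: float = 500.0,
                      doubling_period_years: float = 2.0) -> float:
    """GB obtainable for the same fixed spend after `years_elapsed` years."""
    return base_gb * 2 ** (years_elapsed / doubling_period_years)

for year in range(0, 11, 2):
    print(f"year {year:2d}: ~{capacity_per_cost(year):,.0f} GB for the same money")
    # year 0: 500 GB, year 2: 1,000 GB, ... year 10: 16,000 GB
```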

And don’t we just know it. Estimates vary, but research group IDC said a couple of years ago that the human race had clocked up about 487 billion gigabytes of data. If this were to be printed as paperback books, it would create a pile stretching from Earth to Pluto 10 times.

Unfortunately, this explosion of data has not been matched by a commensurate realignment of the systems on which all of this business-critical information is stored. Typically, different departments have specified and deployed many of their own IT systems organically, based solely on local technical and operational procurement criteria.

As a result, data is locked into silos, and internal departments cannot communicate with each other and share vital information. This fragmentation also leads to data duplication, often with errors introduced along the way, so records become corrupted and senior managers cannot be confident they have the accurate information they need to make critical decisions. If two systems offer different answers to the same question, which should be believed? What if there are three, four or more answers to the same question? The problem also opens companies up to potentially serious compliance and regulatory risk.
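
To make the “which answer should be believed?” problem concrete, here is a toy sketch (the identifiers, fields and values are all invented) of the same customer held in two departmental silos that have quietly drifted apart:

```python
# Toy illustration of the "two systems, two answers" problem: the same customer
# exists in two departmental silos with independently maintained, conflicting values.

finance_db = {"CUST-1042": {"name": "Acme Ltd",  "credit_limit": 50_000, "status": "active"}}
sales_db   = {"CUST-1042": {"name": "ACME Ltd.", "credit_limit": 75_000, "status": "on hold"}}

record_a = finance_db["CUST-1042"]
record_b = sales_db["CUST-1042"]

conflicts = {field: (record_a[field], record_b[field])
             for field in record_a
             if record_a[field] != record_b[field]}

print(conflicts)
# {'name': ('Acme Ltd', 'ACME Ltd.'), 'credit_limit': (50000, 75000), 'status': ('active', 'on hold')}
# Which credit limit should a manager believe? Neither system can say.
```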

A symptom of this data dislocation can be seen in the way companies wrestle to control what Wayne Eckerson at The Data Warehousing Institute described as “spreadmarts” – where staff in different departments create multiple spreadsheets at different times using varying data sources and rules. We know many FDs rely heavily on multiple spreadsheets that are not linked.
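
A stripped-down example of how spreadmarts diverge (the orders, dates and cut-off rules below are invented for illustration): two departments answer the same question – what was first-quarter revenue? – from the same underlying orders, but with different rules.

```python
# Two "spreadsheets" computing the same headline figure with different rules.
orders = [
    {"amount": 1200, "refunded": False, "booked": "2024-03-28", "shipped": "2024-04-02"},
    {"amount":  800, "refunded": True,  "booked": "2024-03-30", "shipped": "2024-03-31"},
    {"amount":  500, "refunded": False, "booked": "2024-03-15", "shipped": "2024-03-20"},
]

# Sales' spreadmart: everything booked in Q1 counts, refunds included.
sales_q1 = sum(o["amount"] for o in orders if o["booked"] < "2024-04-01")

# Finance's spreadmart: only orders shipped in Q1 and not refunded count.
finance_q1 = sum(o["amount"] for o in orders
                 if o["shipped"] < "2024-04-01" and not o["refunded"])

print(sales_q1, finance_q1)   # 2500 vs 500 -- same question, two answers
```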

The vendors of shiny IT “solutions” should own up to their share of the blame for creating the mess. Research group Quocirca recently noted that many technology companies’ sales teams have been guilty of targeting departmental-level managers in order to get fast sign-off. In many cases vendors actively encouraged the practice, and it became self-perpetuating. The irony is that suppliers could then ramp up revenue by pushing integration software and services to sort out the very mess they had created by selling incompatible systems in the first place.

This lack of centralised data control is surprisingly widespread across both the public and private sectors. The origins of the problem go back to the run-up to the dotcom crash: many companies were blindly throwing money at IT systems with scant regard to their actual business value. This short-sightedness, stemming from a lack of communication between IT and decision makers, had – and continues to have – a devastating effect on the operational effectiveness of some organisations.

There can be no silver bullet, but forward-looking companies know that standardisation is key. XML must be used as the data encoding standard. Consolidating on service-oriented architectures and web services means that all applications can be designed to access the same data. This approach also prevents vendor lock-in, as data can be ported from one system to another. Implementing master data management across such standard, interoperable platforms lets organisations take reference data and create a single database that communicates with all the other data sets.
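
As a rough sketch of that last step (the source systems, field names and precedence rule here are illustrative, not any particular MDM product’s behaviour), master data management boils down to merging departmental reference data into a single “golden” record and publishing it in a standard encoding such as XML so any standards-based system can consume it:

```python
# Hedged sketch: merge reference data from two hypothetical departmental sources
# into one golden record, then serialise it as XML.

import xml.etree.ElementTree as ET

sources = {
    "finance": {"customer_id": "CUST-1042", "name": "Acme Ltd", "credit_limit": "50000"},
    "sales":   {"customer_id": "CUST-1042", "name": "ACME Ltd.", "vat_number": "GB123456789"},
}

# Simple precedence rule for this sketch: finance wins on conflicts,
# other sources only fill in fields finance does not hold.
master = {}
for source in ("finance", "sales"):
    for field, value in sources[source].items():
        master.setdefault(field, value)

# Publish the golden record in XML, the common encoding argued for above.
customer = ET.Element("customer", id=master.pop("customer_id"))
for field, value in master.items():
    ET.SubElement(customer, field).text = value

print(ET.tostring(customer, encoding="unicode"))
# -> <customer id="CUST-1042"><name>Acme Ltd</name>...<vat_number>GB123456789</vat_number></customer>
```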

The answer is not more IT solutions, but creating common data standards and consolidating systems across standardised architectures. This is the only way in which companies can realise the full power of all this information, which should be their most precious asset.
