Big Data

COMPANIES AND GOVERNMENTS everywhere are witnessing a growing data deluge compounded by a rising tide of complexity. No matter how large or small an organisation, there is no escape. Big data is here and it is not going away anytime soon.

An explosion of data volumes, formats and types, prompted by global markets, technology, competition and customer demand, has managers struggling to cope. Organisations mistakenly recruit more bodies, but old thinking and old solutions won’t work. Bureaucratic problems are never solved by adding more bureaucracy.

It is probably the first time in history that our limited human abilities have been outpaced by technology and change: our rate of comprehension and adaptation is not up to the job. Coping with increasing speed and complexity demands more agility and sophistication than we can muster.

Automation, communication and computing have realised a chaotic world of surprises. How come? Old business models and management systems are unsuited to this modern environment. Human decision loops are too slow and comprehension is limited, and until we adopt new systems and new modes, bad decisions and unintended consequences will be the norm and not the exception.

We know that as complexity increases, our ability to respond slows, and so does our ability to adapt. This all gets rapidly worse as the number of dimensions increases, and yet we have to cope with more transactions, data feeds, product variants, supply chains, customer demands and so on.

Perhaps worse, this is happening in a changing legal framework augmented by dynamic regulation, government controls and competition. The only way we can buy more time is through automated analysis. Combining this with business modelling and decision support is the only way we can cope.

Annual or monthly outcomes and forecasts are being replaced by weekly and daily updates more in keeping with today’s pace of business. In some cases, this need has already become by-the-hour or by-the-minute updates.

How can any of this be achieved? It is unlikely that IT departments possess the skills necessary and outsourcing seems to be the only option. But how big is this problem? For the large retailers, we are talking Mbytes/minute, while banks see GBytes/minute, and it can be TBytes/minute for the oil and gas industry.
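To put those rates in perspective, here is a back-of-envelope sketch (the arithmetic and function name are illustrative assumptions, not from the article) of what a sustained ingest rate means for storage over time.

```python
# Illustrative arithmetic: total data accumulated at a constant ingest rate.
# The figures below are assumptions for the sake of example.

def storage_per_day(rate_bytes_per_min: float) -> float:
    """Bytes accumulated in one day at a constant per-minute ingest rate."""
    return rate_bytes_per_min * 60 * 24  # minutes/hour * hours/day

TB = 1e12  # one terabyte, decimal convention

# One TByte/minute, the order cited for oil and gas, sustained for a day:
daily = storage_per_day(1 * TB)
print(f"{daily / 1e15:.2f} PB per day")  # -> 1.44 PB per day
```

Even a bank at GBytes/minute accumulates terabytes daily, which is why the "store everything" advice below quickly becomes an infrastructure question in its own right.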

Business data now spans everything from the simple bar code to the seismic survey. Are the analysis products available to purchase? No. Are the basic tools available? Sometimes yes and sometimes no. However, it is fair to say that manufacturing, logistics and retail look set to be well served, and this new service sector is developing fast.

The growing CIO challenge is the collection and storage of data and the construction of accurate models. Creating data and analysing it in great detail is a waste of time and money if the results are not fed back into decision-support processes. Probably the biggest danger is discarding data deemed irrelevant that may turn out to be vital. The good news is that storage is inexpensive. So my advice is: store everything and think hard before you hit delete.

For many, this advice will be easy to follow, but there are exceptions. Oil and gas struggle with the sheer magnitude of their data capture, and that delete key might prove tempting, but there is most likely ‘oil in that data’. Other sectors face more subtle challenges.

We don’t understand the implications of social networking. What we do understand is the sociology of people, but the sociology of ‘smarts’ will no doubt elude us until we have gathered sufficient data. Certainly new complexities, but new opportunities too.

Peter Cochrane is an IT consultant and former chief technologist at BT