
Why 'big data' is here to stay

The demand for so-called big data among business, government, and scientific leaders has been building for years. They can now turn to IT and say "make it so."

John Webster, Special to CNET News
John, a senior partner at Evaluator Group, has 30 years of experience in enterprise IT storage, spanning mainframe and open systems environments. He has served as principal IT adviser at Illuminata and has held analyst positions at IDC and Yankee Group Research. He also co-authored the book "Inescapable Data: Harnessing the Power of Convergence."

Eight years ago, a friend and I were researching a book we would later call "Inescapable Data: Harnessing the Power of Convergence." We were after an understanding of what kinds of new information one could produce by blending data of different types and from different sources. GPS data, for example, combined with RFID scans and shipping-manifest records, could be used to track shipments in real time.
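To make that idea concrete, here is a minimal sketch of what blending those three sources might look like. The Python code, field names, and shipment IDs are hypothetical, invented purely for illustration; a real system would pull these records from live feeds rather than in-memory dictionaries.

```python
from datetime import datetime, timezone

# Hypothetical records from three sources, keyed by a shared shipment ID.
gps_fixes = {
    "SHIP-1042": {"lat": 42.36, "lon": -71.06,
                  "seen_at": datetime(2012, 6, 1, 14, 5, tzinfo=timezone.utc)},
}
rfid_scans = {
    "SHIP-1042": {"reader": "DOCK-7",
                  "seen_at": datetime(2012, 6, 1, 13, 50, tzinfo=timezone.utc)},
}
manifests = {
    "SHIP-1042": {"contents": "vaccines", "destination": "Boston, MA"},
}

def shipment_status(shipment_id):
    """Blend the three sources into one combined view of a shipment."""
    manifest = manifests.get(shipment_id, {})
    gps = gps_fixes.get(shipment_id)
    rfid = rfid_scans.get(shipment_id)
    return {
        "shipment": shipment_id,
        "contents": manifest.get("contents"),
        "destination": manifest.get("destination"),
        "last_position": (gps["lat"], gps["lon"]) if gps else None,
        "last_checkpoint": rfid["reader"] if rfid else None,
    }

print(shipment_status("SHIP-1042"))
```

None of the three sources alone tells you much; joined on a common identifier, they answer the question a shipper actually asks: where is my cargo right now?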

In the course of our research, we interviewed many CEOs, CIOs, and others in leadership positions to see if they were aware of the new variety of data types from wired and wireless sources. Furthermore, we wanted to know if they had any plans to use them to pursue new business opportunities or create new ways to enhance their working environments.

To our surprise, we found many executives who not only were aware of the new data richness available to them, but also had plans to exploit new data sources.

The CEO of a major metropolitan hospital was working with members of his IT staff to build a system that combined RFID with hospital patient data. Drug carts equipped with RFID sensors would be used to deliver drugs to hospital patients. Each cart would "know" which drugs it was carrying because each pill container would have an RFID tag on it identifying the specific drug. If the cart entered a room where a patient would have an adverse reaction if given the drug by mistake, an alarm would sound alerting a nearby nursing station that a danger existed. This type of monitoring has become standard practice today, but it was just a concept eight years ago.
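The cart's core check is simple to sketch. The logic below follows the concept described above; the Python data structures, IDs, and field names are my own assumptions, not the hospital's actual design.

```python
# Hypothetical data: which RFID-tagged drugs each cart carries, and which
# drugs each room's patient must never receive.
cart_contents = {
    "CART-3": {"penicillin", "warfarin"},
}
patient_contraindications = {
    "ROOM-214": {"penicillin"},  # the patient in room 214 is allergic to penicillin
}

def check_cart_entry(cart_id, room_id):
    """When a cart enters a room, flag any drug on board that the patient must not receive."""
    risky = cart_contents.get(cart_id, set()) & patient_contraindications.get(room_id, set())
    if risky:
        # In the hospital's vision, this would alert the nearby nursing station.
        print(f"ALERT: {cart_id} entered {room_id} carrying {sorted(risky)}")
    return risky

check_cart_entry("CART-3", "ROOM-214")  # triggers the alert
```

The value comes entirely from joining two data sets that already existed separately: the cart's RFID-tagged inventory and the patient's record.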

In 2004, we found other examples in business, government, agriculture, and entertainment. However, we also found that, while there was demand to combine and leverage this new multiplicity of data sources, the systems needed to deliver on those visions were not yet available.

If we were writing the book now, we'd be calling it "Big Data" and we'd be able to describe the many types of systems now available to turn these visions into reality. That's the first reason why I believe big data is here to stay. The demand among business, government, health care, entertainment, and scientific leaders has been building for years, and they can now turn to IT and say "make it so."

However, there is a second and perhaps more compelling reason: The analytics systems now being built to extract meaning from once unimaginable amounts of data -- sometimes delivering new insights in real time -- are moving toward imitating the way the human mind functions. They can sense. They can process multiple inputs simultaneously. They can focus on only the data that's relevant to a given situation.

Furthermore, they can be taught to continually ask new questions. It is a hallmark of human thinking to use the conclusion of one line of inquiry as the starting point for another. We continually ask "Why?" We often search for a deeper understanding. The analytics systems now appearing can be given the same "desire" to explore, to use one end state as the beginning of another line of processing and analysis.

There is a tendency to see the big-data phenomenon as another turn of the hype cycle. Indeed, I've been told that a prominent marketing executive was overheard saying, "Never has a term so vague meant so much to so many." Not long from now, big data will go the way of all technology hype cycles and become another chapter in computing history, following the ones on cloud, client/server, and the mainframe. But it will have spawned new computing systems -- ones that think more the way we do.