
The Internet of Things Drives Big Data

By Yves de Montcheuil | April 15, 2014

Big Data certainly cannot be reduced to a single use case or a single type or source of data. Many of the use cases flourishing in organizations rest on a newly found ability to access and process data that was out of reach before. Often, this data lets the organization explore new dimensions of its business model or enrich existing business processes with new insight.

Such “new data” includes data generated by connected objects: the Industrial Internet or, to use a more contemporary term, the Internet of Things. Traditionally lying at the boundary of information technology, these connected objects belong to a domain called Operational Technology (OT). Some of these objects have existed for decades: manufacturing chains, airplanes, HVAC systems, and access control systems, to name only a few. Others are newer: smartphones, fitness trackers, intelligent scales, smart meters, medical implants, and more.

Data from the Internet of Things is not Big Data only because of its volume, although connected objects can make volumes explode. Consider smart meters: instead of one manual reading per quarter, they generate a reading every 15 minutes, roughly 10,000 times as many data points per subscriber. New-generation airplanes such as Boeing’s 787 Dreamliner create terabytes of sensor data per flight, multiplied across the thousands of airplanes in airlines’ fleets.
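The scale factor above can be sanity-checked with quick arithmetic. A minimal sketch, assuming a 91-day quarter and one reading every 15 minutes:

```python
# Back-of-the-envelope check of the smart-meter volume claim (illustrative only).
MINUTES_PER_DAY = 24 * 60

# One reading every 15 minutes yields 96 readings per day.
readings_per_day = MINUTES_PER_DAY // 15

# A quarter is roughly 91 days.
days_per_quarter = 91
smart_readings = readings_per_day * days_per_quarter

# Versus a single manual reading per quarter: close to the "10,000 times" order of magnitude.
print(f"{smart_readings} smart-meter readings vs. 1 manual reading per quarter")
```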

Data from the Internet of Things is also Big Data because of the new use cases it enables. So-called “intelligent buildings” (which include intelligent elevators, intelligent HVAC, intelligent access control, etc.) optimize energy use based on actual occupancy. Fitness trackers help individuals better control their eating habits and sleep patterns. Medical implants can detect early warnings of life-threatening situations and call for help. 

Implementing such use cases requires advanced research on this new data: exploration, intuition, and trial and error by data scientists. Once models and algorithms have been defined, they need to be operationalized, i.e., implemented in real time.

And reliability is essential. Even if it is safer to ground an airplane out of caution than to let it fly, and better to call an ambulance for a patient who doesn’t need one than not to call one at all, such errors come with a cost. Worse, they can kill the credibility of the concept. Remember “The Boy Who Cried Wolf”?
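The trade-off above can be framed as an expected-cost comparison between alerting policies. A minimal sketch with entirely hypothetical probabilities and costs, chosen only to illustrate the point:

```python
# Hypothetical comparison of two alerting policies (all numbers are made up).
def expected_cost(p_event, p_detect, p_false_alarm,
                  cost_false_alarm, cost_missed_event):
    """Expected cost per monitored case for a given alerting policy."""
    false_alarms = (1 - p_event) * p_false_alarm * cost_false_alarm
    misses = p_event * (1 - p_detect) * cost_missed_event
    return false_alarms + misses

# A noisy detector that "cries wolf" on 20% of normal cases...
noisy = expected_cost(p_event=0.001, p_detect=0.99, p_false_alarm=0.20,
                      cost_false_alarm=500, cost_missed_event=1_000_000)

# ...versus a better-tuned one that trades a little detection for far fewer false alarms.
tuned = expected_cost(p_event=0.001, p_detect=0.95, p_false_alarm=0.01,
                      cost_false_alarm=500, cost_missed_event=1_000_000)

print(f"noisy policy: {noisy:.2f}, tuned policy: {tuned:.2f} per case")
```

Even before credibility erodes, the noisy policy costs more per case in this toy setup; once operators start ignoring alerts, the effective miss rate rises and it gets worse.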

Clearly a new driver for Big Data, the Internet of Things also creates new challenges for Big Data. 

Yves de Montcheuil is vice president of marketing at Talend, the recognized leader in open source integration. Yves has 20 years of experience in software product management, product marketing and corporate marketing. He is also a presenter, author, blogger and social media enthusiast. Follow him on Twitter.
