Internet of Things

Achieving Greater Value from Real-Time Analytics and Correlated Data

Editor’s note: Where are we headed with real-time analytics systems and IoT devices? How will it change business decision making? And what will be the impact on the need for data scientists? Don DeLoach is CEO and president of Infobright, provider of a platform for storing and analyzing machine data. Don is a presenter at the “Real-Time Analytics” session at the upcoming IoT Evolution Expo in Florida (SandHill.com is proud to be a sponsor), and I spoke with him about how the IoT is affecting the evolving use and capabilities of real-time analytics.  

What do you think is the most important aspect in the way real-time analytics systems and IoT devices are evolving to enhance data for greater value?   

Don DeLoach: The thing worth noting is that the real key to value for the IoT lies not so much in closed-loop message-response systems (based on if-this-then-that thinking) as in leveraging the utility value of the underlying data to enhance everything.

Most IoT subsystems are fundamentally configured first and foremost to be real-time systems that gather their data in stream. A sensor emits readings, which stream into a rules engine that interprets them and triggers a response. Most people aren't really thinking much beyond the stream, which is where I believe the opportunity and maturity of IoT will be seen. I think it will evolve to a bridge between real-time systems and the elaborate persistent store and analytics of that data. Ultimately, the collection of sensor data may be used by a variety of constituents in a variety of applications. The key is the abstraction and the ability to leverage the data along the way, starting with the first receiver.
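The closed-loop pattern described here — readings streaming into a rules engine that triggers responses — can be sketched minimally. The sensor fields, conditions, and action names below are illustrative, not from any particular product:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    condition: Callable[[dict], bool]  # predicate over one sensor reading
    action: str                        # response to trigger when it matches

def evaluate(reading: dict, rules: list[Rule]) -> list[str]:
    """Return the actions triggered by a single in-stream reading."""
    return [r.action for r in rules if r.condition(reading)]

# Hypothetical rules for a building subsystem
rules = [
    Rule(lambda r: r["temp_c"] > 30.0, "start_cooling"),
    Rule(lambda r: r["occupancy"] == 0, "dim_lights"),
]

print(evaluate({"temp_c": 32.5, "occupancy": 0}, rules))
# → ['start_cooling', 'dim_lights']
```

The point of the interview is that value grows when these same readings are also persisted for later analysis rather than discarded after the rule fires.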

Please share an example of leveraging the underlying data across multiple constituents to provide greater value.

Don DeLoach: Let’s say I own a restaurant. In it I have 10 different IoT subsystems including a lighting system, an HVAC system, a beacon system, a system for my kitchen equipment and a Point-of-Sale (POS) system. I know how many people are in the restaurant at a particular point in time and what the temperature and lighting are in the store at that time. With these systems I get information in real time about the condition of the restaurant, and I can use the data from the systems to help me manage my operations better and more cost-effectively.  

But if I store that real-time information over time, I can blend it with things like my inventory level, my crew schedule, information I get from the city about people or vehicles on the street and the weather. Now I have a much richer set of information that I can cleanse, enrich and use as the baseline by which I can determine how to optimize the store operations.  

Over time I can dynamically use machine learning to begin to adjust – in real time – the lighting and temperature and indicate when the crew needs to start cooking different things in anticipation of the order flow so I can increase sales. This would be a real-time response mechanism that is facilitated by much greater insight into a broader array of data that is possible because of the cleansing, enriching and combining of various IoT data along with other operational data.  
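As a rough illustration of the kind of blending described above — combining occupancy readings with outside data such as weather to anticipate order flow — here is a deliberately naive sketch. The data, the weather weighting, and the prep threshold are all hypothetical:

```python
# Hypothetical hourly data: beacon occupancy counts blended with weather
occupancy = {9: 12, 10: 18, 11: 40, 12: 75}          # hour -> people in the store
weather   = {9: "rain", 10: "rain", 11: "clear", 12: "clear"}

def expected_orders(hour: int) -> float:
    """Naive model: each guest orders once; clear weather draws ~20% walk-ins."""
    base = occupancy.get(hour, 0)
    return base * (1.2 if weather.get(hour) == "clear" else 1.0)

# Flag hours where the kitchen should start prep ahead of the rush
prep_hours = [h for h in occupancy if expected_orders(h) > 30]
print(prep_hours)
# → [11, 12]
```

A production system would replace the hand-written multiplier with a model learned from the cleansed, enriched history, but the shape — correlate stored streams, predict, then act in real time — is the same.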

Likewise, the corporate offices might want to see some variation of that same data. The capital equipment vendors will want the data pertaining to their equipment in the restaurant, and there may be third parties such as supply chain partners or government agencies like the FDA that may want other variations of the same underlying data, each for their own applications and their own purposes.  

Has there been much adoption of combining stored data this year, or is this a prediction for 2016?

Don DeLoach: It’s beginning this year. We’re now moving to another phase, where people are moving from a focus on the closed-loop message response systems to a point where they are beginning to recognize the value of data in the context of other data. It’s still early days, but there are definitely examples.  

For instance, people now think in terms of Fitbits and wearables. And that’s fine. But if you Google “IoT and neuroscience” or “sophisticated wearables for healthcare,” you can see that the types of sensor technology and the richness of digital signatures from combined data are being aligned with all kinds of other data, up to and including DNA analysis, resulting in highly sophisticated capabilities that can make a massive difference in healthcare.

Do most of these systems talk with each other and integrate well at this point?  

Don DeLoach: In smart homes, we’re already beginning to see more efforts around the combination of data. For instance, you may have smart door locks, a smart garage door opener, smart lighting, a smart thermostat and a Dropcam. One device may be from Apple HomeKit, another from Lowe’s, another from Honeywell, and they all have different standards. And they may or may not work together.  

So if you want a truly smart home, you actually need a smart hub where the various components are aware of each other and don’t operate in isolation of one another. Google bought Nest, maker of the Nest thermostat, and Nest then opened its platform so that products like Big Ass Fans’ IoT-enabled ceiling fans could talk to the thermostat and coordinate to achieve the right climate inside the house based on the owner’s desires. Google subsequently bought Dropcam as well. That’s one example of a much broader move to combine various IoT assets to achieve a greater outcome.

The AllSeen Alliance, spearheaded by Qualcomm, has really been working to normalize and converge the various protocols so that devices from different vendors can actually interoperate.

Are the majority of the activities or most use cases of combining IoT assets to achieve a greater outcome happening more in the consumer area or in the business environment? 

Don DeLoach: Last year Harvard Business Review published a really good article by Jim Heppelmann and Michael Porter that talked about the evolution from products to smart products to smart connected products to product systems to a system of systems. That evolution really speaks to the contextual nature of any given IoT subsystem relative to the other deployed subsystems, which may or may not have any relationship to it.

An example is the OBD2 unit (onboard diagnostics unit), which gathers data in real time from all the sensors in a car. Onboard diagnostics date back to 1968 for emissions monitoring, and OBD2 units are now in every car produced, gathering all kinds of information from the sensors. They were used primarily by the auto manufacturers both to understand how the cars are running and use that information to make better cars, and to do predictive analytics for servicing the cars. But the data is also used by third-party suppliers that want to understand how their specific parts are performing.

But if you take a step back, you recognize the same information from that unit can be repurposed by insurance companies for usage-based insurance. They care about what drivers are doing (speeding, how hard they brake, etc.) for the purpose of writing insurance. That’s an example of leveraging the underlying utility value of the data to gain much more value out of that data.

It’s a near certainty that we’ll begin to see homeowners insurance written the same way, based on a digital signature the underwriter has from smart home data evidencing how you live in your home.

So it’s mostly happening for business processes and business decision making. 

Don DeLoach: This will exist in combination, and at an astounding level, way beyond homes and cars. Certainly it will make its way into health insurance with the digital signature picked up from wearables. The digital signature will be based not only on how you live your life but ultimately on genetic mapping of exactly what your genetic make-up is (this capability exists today), evaluating your lifestyle in the context of your genetic make-up relative to the risk factors and appropriate considerations around health insurance.

It will also make its way into a vast increase in the sophistication of predictions that can be made to people about their lifestyle in general with a much more granular view of those predictions.  

What about the cost factor for deployment of real-time systems and storage of data? For instance, is it affordable for SMBs?  

Don DeLoach: It is affordable for SMBs. The cost of sensor and communications technology is falling, and the number of options in terms of these technologies is growing to the point where it is becoming easier and easier for small and midsize businesses to deploy IoT solutions. On the flip side, I would argue that the ROI associated with deploying these systems is moving aggressively to the point where SMBs can’t afford not to have them, because of the visibility they’ll get into their systems, the insight they gain into how their operations run and, from a larger standpoint, the combination of these insights.

Once a company has this real-time data, doesn’t it require that the company have resources with skill sets to think differently about the information?

Don DeLoach: The type of people and skill sets that work on real-time systems varies by use case. If I’m in a service provider’s operations center and I’m used to having information that comes at me at a slower pace and now all of a sudden I’m working on real-time systems, it probably takes a different type of person to know how to interpret and troubleshoot that information. So I think there are probably emerging demands in terms of skill sets.  

But where that situation exists, there are also market drivers for tool sets to aid that process. For example, visualization tools allow people to interpret and act on real-time information that might not be so obvious to someone who’s not used to understanding that signal.  

But once you start combining a lot of information, you may increase the need for real-time reaction because you might see a correlation between a number of different events that signals a need to take action in a smaller period of time.  

It goes to the whole signal-and-noise issue. For example, if I have a thousand different sensors in a building for temperature readings and I have nine other subsystems, I may have 10,000 different sensors taking a reading every second in the building. That creates a huge amount of data. Some of that data may be anomalous and indicative of a problem, and some may be only marginally anomalous and indicative of nothing more than somebody walking into a room. Figuring out not only which signals are meaningful but which combinations of signals are meaningful is very difficult for the average human being at that scale. That’s where the tool sets come in that are designed to allow people to begin to separate the signals from the noise. I think we’ll see more and more of these kinds of tools.
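One simple way such tools separate signal from noise is to score each reading against its own history and act only when anomalies coincide across subsystems. A toy sketch, where the sensor names, histories, and the 3-sigma threshold are all illustrative:

```python
import statistics

def zscore(value: float, history: list[float]) -> float:
    """How many standard deviations a reading sits from its own history."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return 0.0 if sigma == 0 else (value - mu) / sigma

# A single mildly anomalous sensor is probably noise; correlated
# anomalies across subsystems are a signal worth acting on.
history = {"temp": [21.0, 21.2, 20.9, 21.1], "hvac_amps": [4.0, 4.1, 3.9, 4.0]}
latest  = {"temp": 26.5, "hvac_amps": 6.2}

anomalous = [s for s in latest if abs(zscore(latest[s], history[s])) > 3]
alert = len(anomalous) >= 2   # act only when anomalies coincide
print(anomalous, alert)
# → ['temp', 'hvac_amps'] True
```

Real tools use far more sophisticated models, but the design choice is the same: correlation across streams, not any single reading, drives the decision to act.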

One could argue that the demand for data scientists is growing dramatically with the influx of very large amounts of disparate, farther-reaching data. I would agree with that, but argue that this very scenario is creating market demand for tool sets that seek to make everyone at least in part a data scientist, by making the distinction and understanding of that type of data something the average business analyst is capable of.

SandHill.com is proud to be a sponsor at the upcoming IoT Evolution Expo, January 25-28, in Ft. Lauderdale, Fla. Use the discount code “SANDHILL” to get a 20 percent discount when you register for the event. The conference draws an international audience of IoT software companies, large enterprises, SMBs, network service providers, platform providers and device manufacturers.

Don DeLoach is CEO and president of Infobright, provider of a purpose-built platform for storing and analyzing machine data. He has more than 30 years’ software experience. He served as CEO of Aleri, acquired by Sybase, and as president and CEO of YOUcentric, acquired by JD Edwards. Earlier he served in senior roles at Sybase and was director at Broadbeam and chairman of the board at Apropos. He is co-chairman of the ITA Internet of Things Council.  

Kathleen Goolsby is managing editor of SandHill.com.
