Despite predictions about how many things will eventually connect to the Internet of Things (IoT), generating value from the data will not be contingent upon just connecting as many devices and sensors as possible. The value of the IoT lies instead in how organizations incorporate their data into processes they use to drive business forward. Central to effectively weaving IoT data into the operational fabric is determining, often instantaneously, what to do with certain portions of the data.
Organizations must analyze and act on some data immediately, and they must send other pieces of information to a central repository for storage and eventual analysis. Moving data, especially as the volume grows, introduces latency and higher costs.
2017 (and beyond) will swing the pendulum toward edge analytics. Rather than sending data to various places for analysis, we must instead focus on bringing analytics to the data at the edge of the network – where sensors, things and devices reside. With that future on the horizon, organizations should begin building from the ground up with edge analytics for the IoT in mind.
Most IoT deployments are not at the scale we imagine when we think of the IoT. The GEs and weather companies of the world have thousands upon thousands of sensors in the field, but the majority of IoT projects are still experimental, operating with far fewer devices and sensors as teams gauge the value of the data. This makes 2017 the perfect time to start investigating how to build for large-scale production.
A retailer, for instance, may be testing sensors that monitor foot traffic through its stores: where consumers linger, which aisles get the most traffic, which point-of-sale displays get the most attention, etc. The data yield interesting anecdotal insights and will prove invaluable in managing merchandising and justifying promotional charges to suppliers. The next step is to roll out these sensors to all stores and provide real-time insights locally, regionally, nationally and internationally. However, if the sensors transfer all the data back to a central data repository, the analysis will be slow and may overlook many local insights.
Sending data from distributed locations to a centralized location for analytics and then back to the disparate locations for action takes time – too much time when market leaders like Amazon change their prices and inventory hourly. Instead, the retailer should build an analytics infrastructure in tandem with its added server locations at the edge of its network. Performing analytics at each of its distributed servers allows the company to ingest, analyze and take action on its data in real time, a necessity if it is to compete for market share and optimize store layout for maximum revenue.
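To make the pattern concrete, here is a minimal Python sketch of the edge-aggregation idea described above: each in-store server condenses raw sensor events into compact per-aisle summaries locally, so only the summaries (not every raw reading) need to travel to the central repository. The event format and the `summarize_events` helper are illustrative assumptions, not any specific edge-analytics product.

```python
from collections import Counter

def summarize_events(events, interval_s=60):
    """Aggregate raw foot-traffic events at the edge into per-aisle
    counts per time window. Each event is a (timestamp_s, aisle_id)
    tuple from a hypothetical in-store sensor feed."""
    summary = Counter()
    for ts, aisle in events:
        window = ts - (ts % interval_s)  # start of the time bucket
        summary[(window, aisle)] += 1
    return dict(summary)

# Hypothetical sample readings from one store's sensors.
events = [
    (5, "A1"), (12, "A1"), (30, "A2"),
    (65, "A1"), (70, "A2"), (95, "A2"),
]

# Six raw events collapse into four (window, aisle) summary rows;
# at production volumes the reduction is far more dramatic, which is
# what keeps bandwidth costs and latency down.
print(summarize_events(events))
```

The store can act on the full-resolution data immediately (for example, flagging a crowded aisle) while forwarding only the aggregates for company-wide analysis.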
In 2017, we will finally start to see edge analytics become mainstream. This trend will be driven by organizations’ need to analyze data instantaneously, despite operating in a geographically distributed manner.
To continue to derive value from their data, organizations must begin to build their edge analytics infrastructures now. This means they’ll need to consider tools that scale easily to multiple, disparate locations. It will require a mindset change, in which boardroom execs come to realize that moving data back and forth between remote locations is far too costly a practice. 2017 is the year edge analytics finds itself in the limelight for companies of all sizes in a wide array of industries.
Adam Wray, CEO and president of Basho Technologies, is a cloud enterprise technology entrepreneur and executive with more than 20 years of experience. Most recently, he served as CEO and president of Tier 3. Earlier, he held operational leadership roles at Amazon, Akamai Technologies and Limelight Networks. He currently sits on the board of directors for Basho Technologies and is a non-executive director of Cloudsoft Corporation, 6fusion USA and Observable Networks.