Leaders today know how difficult it can be to determine the return on investment (ROI) of a major big data project. Executives demand hard numbers; after all, that’s their job. But true innovation is often the biggest casualty of this kind of thinking.
Big data is an evolving field, and many of its greatest innovations have yet to be discovered. Enterprises would do well to understand this before beginning the process of implementing a big data infrastructure.
Ultimately, there’s only one way to ensure long-term success (and ROI) for any big data project. Rather than aiming for one “moonshot” win, businesses must develop an experimental mindset. That means focusing their efforts on nurturing smaller projects that lead to important innovations, then scaling up their star projects as needed.
Adopting an experimental mindset
Most engineers who work with big data recognize the value of an experimental mindset and approach, and to be successful, enterprises should, too. Big data is still on the cutting edge, which means that even its uses and applications aren’t yet fully understood. Knowing that, it’s especially difficult to justify a restricted, hyper-focused big data solution that leaves no room for innovation.
When discussing big data solutions, we often talk about the difference between “known-unknowns” and “unknown-unknowns.” With a known-unknown, common in existing technology, we know what the unknown is — we’re seeking an answer to a question, in other words, and the technology is designed to provide that answer.
With big data, we’re still in the realm of unknown-unknowns, where even the questions are evolving. That important distinction speaks to the powerful potential of big data but also to the need for an open, innovation-centered approach.
Aiming for several smaller wins
As exciting and buzzworthy as big data is, consider that the majority of new big data projects are actually doomed to fail, according to industry estimates. Why? There are several reasons, but many boil down to the same reality: Many enterprises have put all their eggs in one basket, banking on the belief that a big data project will bring them one major “win” that will ensure a return on their large investment in infrastructure, resources and team members. But if that big win never comes, the big data project is deemed a failure and shuttered permanently.
This kind of approach is mistaken. Corporations absolutely should expect to make money from their investment in big data, but they must also understand that it’s nearly impossible to measure the long-term ROI of an investment in infrastructure.
It is, however, possible to calculate the ROI of individual projects. That’s important, because in the world of big data solutions, it makes more sense to aim at twenty $10 million innovations, for example, than at a single, nebulous $200 million goal. Nurturing smaller projects goes hand in hand with fostering an experimental approach, and it also reduces the risk of wasting your big data investment on a doomed concept.
Big data solutions are thrilling for enterprises, but they’re still in an early, evolutionary stage. Though it’s only natural for corporations to expect an ROI estimate for a significant investment in big data, it’s also critical that they understand why such estimates are so difficult to make.
By allowing for experimentation and innovation, enterprises can encourage teams to produce smaller wins that will nevertheless propel them toward future success.
Naresh Agarwal is head of data at Brillio. He has proven experience growing an IT services business portfolio and managing client relationships, sales and delivery in complex delivery models. In his current role at Brillio, Naresh is in charge of driving insights from data and building teams across the board. While at Brillio, he has led multiple large, complex concurrent data management programs built on innovative solutions and technologies, delivered on time and on budget. He has excellent team-building and problem-solving capabilities.