
3 Key Elements for Successfully Creating Value in Your Big Data Journey

By Sanjay Shitole | November 5, 2013

Big Data technology gives organizations unprecedented access to a large volume and variety of data that would otherwise be impossible or too costly to pursue. While Big Data's potential for value creation is undoubtedly huge, business and IT leaders in large enterprises struggle with three aspects:

  1. How to size the Big Data opportunity within the context of their organization
  2. How to get Big Data technologies to work alongside cumulative years of investment in traditional business intelligence / data-warehouse technologies
  3. How to execute on the Big Data effort without falling into the trap of approaching it as yet another BI project

Here are three key elements of making the Big Data journey a success in an enterprise context. 

A. Establish a clear link between your enterprise business context and Big Data's greatest potential

Where Big Data will have the greatest impact depends on your business context, including your organization's internal decision-making culture and the areas that are natural first candidates for the journey. The following questions are key to assessing these context factors:

  1. How willing are your managers and leaders to let insights drive changes in decision making that affect established practice?
  2. What volume, touch points and quality of customer interactions dictate the customer experience?
  3. Are the IT support costs of BI / data warehousing solutions preventing you from investing in new projects and innovative solutions?
  4. Can a higher degree of personalization for specific customers and segments open new revenue streams?

As you assess the business context based on the above factors, a clear picture will emerge across one or more of the following four areas. This can serve as the starting point to undertake your enterprise Big Data journey.

1. Improve the precision of internal decision making. Here the focus of your Big Data effort is on targeting an existing set of processes and procedures and providing insights and recommendations that improve the quality of decisions.

An example is improving the escalation tree for decisions that range from claims processing to offering a customer an additional discount. Other examples: extending or truncating a marketing campaign, or shifting sales coverage from one geography to another. You may use Big Data insights to determine whether to send a maintenance engineer to a customer site or to request additional information from the customer to diagnose the problem, or to step up follow-ups with late-paying customers based on a broader set of customer events.

2. Improve the customer experience journey. Here the focus of your Big Data effort is on understanding the different touch points customers have with the enterprise and then improving the quality and personalization of each touch.

Independent of the industry, all customers go through a journey of evaluating/researching products, buying products and deploying/using products in their environment. Current enterprise processes, tools and systems don’t unify these touch points across channels and stages of the customer journey.

Big Data lets you leverage internal and external data sources, both structured and unstructured, to analyze and assemble a customer journey in which each stage builds on the previous one, driving an optimal, personalized and rich experience for the customer.
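
To make this concrete, here is a minimal Python sketch of assembling such a journey, assuming touch-point events from different channels have already been extracted into simple records (the customer IDs, channels and field names are hypothetical illustrations, not part of any specific product):

```python
# A minimal sketch: unify touch-point events from separate systems into one
# ordered journey per customer. All data below is hypothetical sample data.
from collections import defaultdict
from datetime import datetime

# Events pulled from separate systems: web analytics, CRM, support desk
events = [
    {"customer_id": "C1", "ts": "2013-10-01T09:15", "channel": "web",     "action": "viewed product page"},
    {"customer_id": "C1", "ts": "2013-10-03T14:02", "channel": "crm",     "action": "sales call"},
    {"customer_id": "C1", "ts": "2013-10-20T08:40", "channel": "support", "action": "opened install ticket"},
    {"customer_id": "C2", "ts": "2013-10-05T11:30", "channel": "web",     "action": "downloaded trial"},
]

# Group events by customer, then sort each customer's events chronologically
journeys = defaultdict(list)
for e in events:
    journeys[e["customer_id"]].append(e)

for cid, journey in journeys.items():
    journey.sort(key=lambda e: datetime.strptime(e["ts"], "%Y-%m-%dT%H:%M"))
    print(cid, "->", [(e["channel"], e["action"]) for e in journey])
```

In practice the events would come from web analytics, CRM and support systems at far larger scale, but the unification step follows the same idea: one timeline per customer, across channels.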

3. Reduce the cost of IT support for the business intelligence/data warehouse environment. In this case, the focus is on reducing the overall cost of supporting BI and data warehousing (DW) tools, which have become a huge investment area for most IT organizations. The areas in which Big Data technologies immediately offer scale and cost savings are historical data and raw transactional and log-centric data.

Instead of maintaining these data sets in the traditional BI / DW environment and paying for its performance, storage and compute, Big Data technologies can take over this data and provide effectively the same or better access to historical and raw data at a cost roughly 5-10 times lower than a traditional DW environment.
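
As a hedged illustration, here is a minimal PySpark sketch of querying raw transaction history that has been offloaded from the DW to low-cost distributed storage; the paths, column names and the Parquet format are assumptions for the example, not a prescribed design:

```python
# A minimal sketch, assuming historical transaction data has been offloaded from
# the data warehouse to distributed storage; the DW keeps only recent, hot data.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("historical-offload").getOrCreate()

# Raw transaction logs archived as Parquet on HDFS/object storage (hypothetical path)
history = spark.read.parquet("hdfs:///archive/transactions/")

# The same monthly aggregate a DW report would produce, now run against the archive
monthly_totals = (
    history
    .withColumn("month", F.date_trunc("month", F.col("txn_ts")))
    .groupBy("month", "region")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

monthly_totals.show()
```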

4. Look for monetization opportunities. Big Data can be applied within your organization by considering how customers consume your products and services and using those insights to identify opportunities for new revenue streams.

This holds especially true if you can track customers' unique consumption patterns and then refine product offerings, adding a layer of value specific to each customer or segment.

B. Assess the interplay of the newer Big Data technologies

Once you establish a clear link between the enterprise business context and Big Data's potential, the second key element is to navigate existing IT investments in traditional BI / DW technologies and assess how the newer Big Data technologies interplay with them. This can be broken down into the four buckets typically recognized in the BI / DW world: (a) the source data layer, (b) the ETL (Extract, Transform and Load) layer, (c) the analytics layer and (d) the visualization layer.

Big Data technologies can be deployed so that each layer is built separately as its own parallel stack, or integrated into the current BI / DW environment. For example, the source data and ETL layers for Big Data technology can be built out separately, while the analytics and visualization layers (especially for structured analytics) remain based on the existing set of tools and infrastructure in the current BI / DW environment.
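
A minimal sketch of this hybrid pattern follows, assuming the source and ETL layers run on a Big Data stack while the aggregated output is published to the existing DW so current BI dashboards keep working; the paths, table name, JDBC endpoint and credentials are hypothetical placeholders:

```python
# A minimal sketch of a hybrid stack: Big Data source/ETL layers feed a summary
# table in the existing relational DW used by current analytics/visualization tools.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hybrid-etl").getOrCreate()

# Source + ETL layers on the Big Data stack: raw clickstream logs in JSON (hypothetical path)
clicks = spark.read.json("hdfs:///raw/clickstream/")

daily_summary = (
    clicks
    .withColumn("day", F.to_date("event_ts"))
    .groupBy("day", "product_id")
    .agg(F.countDistinct("customer_id").alias("unique_visitors"))
)

# Analytics/visualization layers stay in the existing DW: publish the summary
# to a relational table that current BI dashboards already read.
daily_summary.write.jdbc(
    url="jdbc:postgresql://dw-host:5432/analytics",   # hypothetical DW endpoint
    table="clickstream_daily_summary",
    mode="append",
    properties={"user": "etl_user", "password": "***", "driver": "org.postgresql.Driver"},
)
```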

The key guideline in making technology and architectural decisions is to understand the fundamental difference between the two sets of technologies. With Big Data technologies there is no formal, up-front definition of data sources, ETL rules, schema or visualization layer.

Rather, Big Data technologies are built for extensive scaling and data exploration against a large volume and variety of data, allowing new questions to be asked as users learn more about the data and generate insights. These new questions could very well lead to a different set of attributes, schemas and data sources.
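
For example, here is a minimal PySpark sketch of this schema-on-read exploration, assuming semi-structured support-ticket events land as JSON files; the path and field names are hypothetical:

```python
# A minimal sketch of schema-on-read exploration: no schema is declared up front,
# it is inferred at read time, and new questions need no ETL or DW changes.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("schema-on-read").getOrCreate()

tickets = spark.read.json("hdfs:///raw/support_tickets/")  # hypothetical path
tickets.printSchema()  # discover which attributes are actually present

# First question: which products generate the most tickets?
tickets.groupBy("product").count().orderBy(F.desc("count")).show()

# A follow-up question that only emerged after seeing the data: do tickets whose
# free-text description mentions "outage" escalate more often?
(
    tickets
    .withColumn("mentions_outage", F.col("description").contains("outage"))
    .groupBy("mentions_outage")
    .agg(F.avg(F.col("escalated").cast("int")).alias("escalation_rate"))
    .show()
)
```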

Thus the best of both worlds is to maintain separate source, ETL and analytics layers for the Big Data-driven technology stack. The visualization layer is where the two come together for the user: structured analytics are visualized in traditional form, while new sets of insights appear as inline recommendations in the same workspace.

Big Data technologies are still immature, so another key aspect to remember is not to bind all of these layers in the stack to a specific vendor's technology. Vendor technologies will evolve over the next two to three years, and new vendors will emerge. As new options become viable, your organization may have a strong need to replace technologies with more robust solutions as part of your enterprise Big Data journey.

C. Recognize that the execution model for the Big Data effort is very different

The third element for driving success in your Big Data journey is to recognize that the execution model is very different and requires serious consideration and planning. First of all, not all the questions or needed analytics are known up front; thus the notion of a well-defined set of requirements for the Big Data effort is a fallacy. Rather, the approach is hypothesis-based, with hypotheses proved or disproved through iterative analytical modeling against a larger volume and variety of structured and unstructured data.
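
As a small, hypothetical illustration of that loop, the sketch below tests one candidate hypothesis ("customers with many support tickets churn more often") against sample data; the numbers and the choice of a t-test are assumptions for illustration only:

```python
# A minimal sketch of the hypothesis-driven loop: frame a hypothesis, test it
# against data, and let the result shape the next iteration of questions.
from scipy import stats

# Support-ticket counts drawn from two (hypothetical) customer groups
tickets_churned = [7, 9, 4, 12, 8, 6, 10]
tickets_retained = [2, 3, 5, 1, 4, 2, 3]

t_stat, p_value = stats.ttest_ind(tickets_churned, tickets_retained, equal_var=False)
print(f"t={t_stat:.2f}, p={p_value:.4f}")

# If p_value is small, the hypothesis survives this iteration and is refined
# (e.g., split by product line); if not, it is discarded and a new one is framed.
```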

This calls for an Agile-based execution approach as well as a cohesive project team that includes both business and IT members under a single team structure.

From a skills perspective, the Big Data effort requires a combination of business domain expertise, strong analytical and statistical modeling skills and Big Data technology competency. Most organizations traditionally lack strong statistical modeling and analytical skills; and where they do have them, they sit in centralized planning functions and are not available at the scale Big Data efforts require. Your organization may decide to add these skills across IT or business teams while evolving its Big Data technology competency.

As insights become available from your Big Data effort, they need to be delivered as part of the integrated business process, backed by a strong change management focus. Insights will only make a business impact when they are accepted and acted upon. Having leaders and managers drive change is an essential part of the Big Data journey and of moving effectively to a data- and insights-driven culture.

The final aspect of the execution approach is removing data silos from an organizational data-ownership perspective. The success of a Big Data effort requires maximum access to data across enterprise sources, both structured and unstructured, even if it means not bringing it all under a centralized BI / DW environment. The game-changer for organizations is Big Data's promise of going after the many varieties of data sets that previously seemed too costly to use or were discarded for lack of structure.

Organizations that succeed on the Big Data journey will derive unprecedented benefits, finding themselves shaping the customer experience, launching the next generation of products or services, evolving toward a fact-based decision-making culture and increasing spend on business innovation.

In summary, the key elements of a successful Big Data journey lie in establishing a clear link between Big Data's potential and the enterprise business context; understanding the interplay between traditional BI / DW technology and the newer Big Data technologies; instituting an execution model built on an Agile approach; unifying business domain, analytical and technology skills; and emphasizing change management to challenge the decision-making culture and tribal data-ownership practices.

Sanjay Shitole is a founding member and senior vice president at Trianz, a global management consulting and technology services firm. He leads the high-tech vertical and management consulting practice, with a strong focus on helping high-tech clients deliver business impact in the areas of sales, channels, service and support operations by executing on business and technology initiatives. For more information, contact sanjay.shitole@trianz.com.
