
Master Data Management Takes a Lead Role in Successful M&A Outcomes

By Rob Fox | February 26, 2013 | Article

With an optimistic outlook for merger and acquisition (M&A) activity in 2013, dealmakers are looking for ways to expedite successful transaction outcomes and reduce risk. Data integration and master data management (MDM) should not be underestimated as enablers of success at every juncture in the M&A process, from preserving business continuity to driving post-transaction innovation. A successful MDM strategy can even lower the barrier to future M&A activity.

Analysts at firms such as PricewaterhouseCoopers predict that M&A activity will grow significantly in 2013. After a slowdown in the first half of 2012, strengthening balance sheets and continued access to capital foretell a jump in M&A activity in the year ahead.

Successful M&A outcomes depend on business agility now more than ever. As companies’ IT portfolios continue to grow in complexity, the ability to rapidly reduce data friction and isolation has become paramount to successfully integrating acquired business units.

This can put undue strain on a company’s IT organization if not planned and executed effectively and efficiently. Merging, harmonizing and/or orchestrating disparate IT systems and their data into a single logical, or even physical, operating environment is neither a simple nor a quick task. How quickly and effectively an organization can integrate and manage critical business data has significant bearing on post-M&A innovation and market competitiveness.

Despite what is at stake in successfully integrating the assets of an acquired company, many acquirers make a common mistake: viewing data integration as a one-time event. The mindset of treating an acquisition as a discrete occurrence has quietly weakened business intelligence across the enterprise, especially for companies that rely heavily on growth-by-acquisition strategies.

Newly formed enterprises, as well as those companies in the midst of a merger or acquisition, must have access to more than data. They require access to intelligent business insight that will allow them to be agile as market, internal and IT opportunities and challenges arise.

That’s why successful data integration strategies must be process and integrity driven. Data must be elevated to a first-class citizen across the organization, with careful attention paid to harmonizing, cleansing, consolidating and efficiently sharing accurate data throughout the enterprise. Master Data Management, and data management in general, is not a one-time event but an ongoing process that must be constantly refined and reviewed to ensure successful business outcomes.

As new systems are introduced to the IT ecosystem, whether through suppliers, acquisition or deployment of new IT initiatives, companies must have the tools and processes in place to ensure accurate and reliable data to stitch together the enterprise effectively. Yet, despite the momentum behind increased volume and velocity of data, a significant portion of companies drop the ball precisely at this point.

Transforming risk and competitive stagnation into success through MDM

This was certainly the case for one office supply company that had acquired dozens of companies over a two-decade period. A multi-billion-dollar organization, the company managed more than 150 distribution centers, 10,000+ suppliers and the fulfillment of nearly one million products. The complexity of these operations was exacerbated by over a dozen loosely integrated ERP systems and twice as many pricebook systems. To make matters worse, order fulfillment and supplier onboarding data was subjected to hundreds of inconsistent business rules and requirements from the company’s multiple business units.

Data transfer between systems was labor intensive and error prone. Manual data entry into multiple systems, and over-complicated synchronization between many overlapping as well as siloed systems, was hampering the company’s ability to keep growing. In many cases, storing the “same” data in silos produced inaccurate data, processes and outcomes. The company felt pressure to compete in its market yet could not absorb any more complexity. Without a true data management solution, the sheer “weight” of its data had eroded business agility.

This exposed preventable risks at every point in the supply chain, from customer fulfillment and supplier payment to forecasting. Most importantly, the company lacked a single record of truth (or “golden record”) for product and customer data, and it lacked the data governance required to turn business data into intelligent insight.
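To make the “golden record” idea concrete, here is a minimal sketch of one common survivorship rule: duplicate records arriving from different source systems are merged by keeping the freshest non-null value per field. The record layout, field names and data below are invented for illustration and are not taken from the company’s actual systems.

```python
from datetime import date

# Duplicate supplier records from two hypothetical source systems
# (field names and values are illustrative only).
records = [
    {"supplier_id": "S-100", "name": "Acme Office Supply",
     "phone": None, "updated": date(2012, 3, 1), "source": "ERP-East"},
    {"supplier_id": "S-100", "name": "ACME OFFICE SUPPLY INC",
     "phone": "555-0199", "updated": date(2012, 9, 15), "source": "Pricebook-7"},
]

def golden_record(dupes):
    """Survivorship rule: for each field, keep the non-null value
    from the most recently updated record."""
    merged = {}
    for rec in sorted(dupes, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value is not None:
                merged[field] = value  # later (fresher) records win
    return merged

print(golden_record(records))
```

Real MDM platforms apply far richer match-and-merge rules (fuzzy name matching, per-attribute source trust), but the principle is the same: one governed record survives per real-world entity.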

The company realized that to preserve and grow market share it would have to improve how master data was managed. It needed an accurate view of product selection and categorization, including which suppliers were tied to each product. That meant consistently categorizing similar products and reducing miscellaneous items. The company also needed to eliminate duplicate items across its various pricebooks. Finally, it needed to reduce the number of overlapping and siloed systems to increase agility and insight, and to cut the cost and complexity that were burdening the IT organization.

To do this, the company implemented a Master Data Management strategy in five distinct phases:

  1. Strengthen: The company strengthened its existing data governance program by establishing a data standards review board. The board developed, documented and enforced data management standards. Working from this standardized set of requirements, it modeled, implemented and enforced a reworked data architecture.
  2. Create: The company created a Web-based item entry and maintenance solution that is the single point of entry for all supplier and product information. This solution now enforces data standards by cleansing and enriching data before it’s transferred to ERP and pricebook systems and is fed to their data warehouse through automated nightly feeds. Underneath is a common MDM infrastructure capable of validating, harmonizing and deduplicating the data into a consistent and reliable form. By creating a common point of access and entry, data validation and data quality techniques are more effective.
  3. Reduce and Eliminate: Over time, the organization was able to reduce the number of systems feeding in and consuming from the data management solution as it contained the master records for all product and customer information that was also stored disparately across many catalog, pricebook, and ERP systems. They reduced the number of system-specific user interfaces and back office applications from approximately 40 to only a few.
  4. Utilize: Previously, the company had no way to easily connect product/item data with supplier information. The new data management infrastructure resolved this: the company created and enhanced processes to feed data into, and consume data from, the common layer, and consuming information through those common interfaces yielded insight and access to data that had not been possible before.
  5. Normalize: Data normalization became viable once all the systems were connected to a governed MDM solution. Normalization is key not only to getting accurate, fresh data but also to making business analytics and business intelligence possible across the entire organization. Being able to analyze accurate data that represents a common view of the organization, even when that data resides across many systems, makes the data actionable. It is this last mile that can be the differentiator in a company’s ability to run a cost-effective, agile business.
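The cleansing, harmonizing and deduplicating work described in these phases can be sketched at a very small scale. The fragment below applies illustrative data standards (whitespace and case normalization plus a governed category mapping) and then collapses duplicate SKUs; every field name, mapping and rule here is hypothetical, not the company’s actual logic.

```python
import re

# Hypothetical raw items as they might arrive from different pricebooks.
raw_items = [
    {"sku": "PPR-A4", "desc": "  Copy Paper, A4  ", "category": "Misc"},
    {"sku": "ppr-a4", "desc": "COPY PAPER A4", "category": "Paper Products"},
    {"sku": "STP-01", "desc": "Stapler, desktop", "category": "misc."},
]

# Governed taxonomy: map inconsistent category labels to one standard.
CATEGORY_MAP = {"misc": "Uncategorized", "misc.": "Uncategorized",
                "paper products": "Paper"}

def normalize(item):
    """Apply data standards: trim/collapse whitespace, unify case,
    and map categories onto the governed taxonomy."""
    sku = item["sku"].strip().upper()
    desc = re.sub(r"\s+", " ", item["desc"]).strip().title()
    cat = CATEGORY_MAP.get(item["category"].strip().lower(),
                           item["category"].strip().title())
    return {"sku": sku, "desc": desc, "category": cat}

def deduplicate(items):
    """Keep one master record per normalized SKU (first occurrence wins)."""
    seen = {}
    for item in map(normalize, items):
        seen.setdefault(item["sku"], item)
    return list(seen.values())

master = deduplicate(raw_items)
```

At enterprise scale these same steps run inside the MDM hub, with the review board’s standards expressed as validation and transformation rules rather than hard-coded dictionaries.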

This five-phase approach has transformed the way the company views, accesses and leverages product and supplier data. For starters, the number of ERP systems has been reduced by more than 75 percent. Pricebook systems have been eliminated altogether. With a central point to cleanse and rationalize data as it flows between these systems, the company has improved data quality and reduced errors. Accurate, consistent and real-time information has given it the strategic insight needed to navigate an increasingly competitive marketplace.  

The bigger picture is that once a solid Master Data Management solution is implemented, future system integrations and M&A activities face a much lower barrier to adding new systems into the company’s existing data architecture. This in and of itself increases a company’s agility and reduces the friction and drag that can occur at the onset of the business integration that usually follows M&A.

Master Data Management can’t be an afterthought in today’s M&A environment. It has to be part of an overall integration process that is primed for agility and continuous improvement. Leaders who view data management as a process-driven enabler of success will have a clear advantage at every turn, from initial data integrations to post-transaction innovation and growth.

Rob Fox is senior director of EAI/B2B Software Development for Liaison Technologies and the architect for several of Liaison’s data integration solutions. Liaison Technologies is a global provider of cloud-based integration and data management services and solutions. Rob was an original contributor to the ebXML 1.0 specification, is the former chair of marketing and business development for ASC ANSI X12, and a co-founder and co-chair of the Connectivity Caucus.