You improve what you measure; you deter what you monitor. These aphorisms have long served corporations and public organizations fighting waste and fraud. Yet even as some business processes are optimized and mastered, new information gaps inevitably open as business becomes more real-time and more complex, driven by the innovation required to meet the demands of customers and competition. In this article, we’ll take a broad view across industries of how modern information management can deliver five key capabilities – transparency, consistency, integration, latency and dissemination (or democratized access) – that address root causes and help prevent or minimize fraud and waste in a variety of situations.
Whenever information is scattered across multiple locations and systems, inhibiting a complete understanding or leaving it unavailable to different parts of an organization, it creates an “information gap.” This can lead to inefficiencies (and the resulting higher operational costs) as well as fraud, as the unscrupulous take advantage of the lack of a cohesive view and exploit these gaps to their advantage. Examples include fraud in insurance claims, credit cards, Medicare and Medicaid, revenue leakage in telecommunications, retail rebates, and so on.
A general data management approach forms the basic structure of a modern information management solution; more specialized techniques can then be layered on top for unique situations.
This article examines a few industry examples of how organizations have adopted data virtualization to infuse data agility and new capabilities into their existing infrastructure. But let’s look first at the five key capabilities of a modern information management solution.
- Transparency – This is achieved when information can be seen in a holistic manner from source of origin to point of use. Transparency is often lost in traditional data integration, which is heavily dependent on data replication and intermediary data copies that disrupt continuity of data. Restoring transparency requires linking disparate information from source to consumer as linked data services while maintaining data lineage, governance and change-impact analysis.
- Consistency – This is needed for proper understanding of both data and metadata. The proliferation of software tools at every layer – data management, application processing and virtualization – leads to metadata being labeled differently and different transformations being applied to the data as it traverses an organization or process. This opens up the aforementioned “information gap.” An enterprise data abstraction layer that provides consistent, semantic, canonical views of business and data objects and their meaning – through normalized and documented transformations – creates the basis from which individual processes can source information for their needs.
- Integration – This capability brings together different facets or perspectives on the same business object, which is necessary for a holistic view. In other cases, it provides corroborative information from multiple sources about the same topic.
- Latency – Low-latency (real-time or near-real-time) access to information allows instantaneous action, or can even prevent the fraud or waste from happening in the first place. Often, it is the delay in getting the right information at the right time that enables waste; low latency closes that gap.
- Dissemination – Disseminating information widely to multiple levels in the organization democratizes data access across multiple points of engagement. This empowers front-line employees with the freedom to improve efficiency or detect and counter fraud.
How data virtualization helps
The above capabilities are often missing in traditional data management, which is siloed and focused on batch-oriented replication. However, data virtualization adds data agility to existing infrastructures by performing the following functions:
- Creation of a data abstraction and data access layer independent of underlying data sources
- Semantic integration of disparate structured and unstructured data to create canonical views of key business data entities
- Real-time or near-real-time data access using federation, caching and/or selective batch movement (but without the wholesale data replication of previous-generation data integration)
- Delivering data services in various formats to various consumers with differentiated security, service levels and monitoring and allowing multiple data interaction paradigms such as search, browse, query, etc.
- Allowing discovery and governance of data and metadata, lineage, change impact and other data management capabilities through the virtual data layer.
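To make the first three functions concrete, here is a minimal, vendor-neutral Python sketch of a virtual data layer. It federates two hypothetical sources – a claims database and a CRM extract, with deliberately inconsistent field labels – into one canonical customer view at query time, without replicating either source. All source names, fields and the join key are illustrative assumptions, not any real product's API.

```python
# Minimal sketch of a virtual (federated) data layer.
# Sources, field names and the join key are hypothetical.

# Two "sources" with inconsistent field labels, standing in for
# separate systems that would normally be queried live.
claims_db = [
    {"cust_id": 101, "claim_amt": 2500.0},
    {"cust_id": 102, "claim_amt": 400.0},
]
crm_extract = [
    {"customerId": 101, "name": "Acme Corp"},
    {"customerId": 102, "name": "Globex"},
]

# Per-source field -> canonical name (the "semantic" mapping).
FIELD_MAP = {
    "cust_id": "customer_id", "customerId": "customer_id",
    "claim_amt": "claim_amount", "name": "customer_name",
}

def normalize(record):
    """Relabel a source record into the canonical vocabulary."""
    return {FIELD_MAP.get(k, k): v for k, v in record.items()}

def canonical_customer_view():
    """Join the two sources on customer_id at query time (federation),
    rather than copying them into a warehouse first."""
    crm = {r["customer_id"]: r for r in map(normalize, crm_extract)}
    for claim in map(normalize, claims_db):
        yield {**crm[claim["customer_id"]], **claim}

for row in canonical_customer_view():
    print(row)
```

Because the join happens at access time, consumers see one consistent view while each source remains the system of record – the transparency and lineage benefits described above follow from keeping that mapping explicit and documented.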
Here are three organizations, among many, that have adopted data virtualization to tackle fraud and waste in a variety of industries.
- Government. The Office of Inspector General (OIG) is an independent entity within the Housing and Urban Development (HUD) department of the U.S. government. The OIG’s mission is to detect and deter fraud, waste and abuse and promote the economy, efficiency and effectiveness of HUD operations. It accomplishes this by conducting and supervising audits, investigations and inspections relating to HUD’s programs and operations. The OIG uses data virtualization to unify all its data sources while minimizing manual effort and delivering timely reports that validate HUD’s real estate acquisition costs and operating expenses against statistical norms and market data. Data virtualization provides checks and balances to ensure the efficient operation of HUD and provides insight that can help prevent fraudulent activities.
- Manufacturing/retail. A global manufacturer and retailer of clothing fashions built its unique value proposition on having fast-moving new styles in stores every two weeks, thanks to its highly efficient design innovation. Yet the fast turnover in goods also created excessive unsold inventory that had to be either discarded or sold wholesale. To avoid this waste and keep costs low, the company adopted data virtualization for real-time virtual inventory management and sales reconciliation across hundreds of stores, regional warehouses, factory warehouses, goods in transit and planned production from suppliers. This let the company fine-tune the inventory of each style in stores, minimizing unsold inventory and wastage/returns of clothing during every style rotation.
- Telecommunications. A telecommunications carrier paid rebates to cell phone and electronics retailers for every new device sold on the carrier’s network. Often, these sales are legitimately reversed by the consumer, but there are also errors and intentional misrepresentations in rebate claims. Because the company used to rely on dated reports, it often overpaid rebates. To minimize fraud and revenue leakage, the company built a real-time rebate-tracking system on data virtualization to aggregate information from multiple sources. The system collects information from retailer point-of-sale and transaction records, customer information systems, device provisioning systems and the financial rebate system, and performs real-time checks across these systems to ensure that rebates paid are valid.
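The telecommunications example can be sketched in a few lines of Python. This is not the carrier's actual system; the record shapes, field names and rejection reasons are all hypothetical. The idea is simply that a rebate is paid only when independent systems – point of sale and device provisioning – both corroborate the claim.

```python
# Illustrative cross-system rebate validation; all data shapes are hypothetical.

# (retailer, device) pairs observed in retailer point-of-sale records.
pos_sales = {("RET-1", "IMEI-555"), ("RET-1", "IMEI-556")}
# Devices actually activated on the carrier's network.
provisioned = {"IMEI-555"}

def validate_rebate(claim):
    """Pay a rebate only when both the sale and the activation
    can be corroborated from independent systems."""
    key = (claim["retailer"], claim["device"])
    if key not in pos_sales:
        return "reject: no matching point-of-sale record"
    if claim["device"] not in provisioned:
        return "reject: device never provisioned on network"
    return "pay"

claims = [
    {"retailer": "RET-1", "device": "IMEI-555", "amount": 50.0},
    {"retailer": "RET-1", "device": "IMEI-556", "amount": 50.0},  # sold, never activated
    {"retailer": "RET-2", "device": "IMEI-999", "amount": 50.0},  # no sale record at all
]
for c in claims:
    print(c["device"], "->", validate_rebate(c))
```

In a data virtualization deployment, the two lookup sets would be live federated views over the source systems rather than in-memory copies, which is what makes the check real-time rather than report-driven.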
These examples illustrate how data virtualization can plug the “information gap” around transparency, consistency, integration, latency and dissemination that exists in traditional information management infrastructure and provide an almost immediate ROI.
According to a House committee report, the U.S. federal government alone is estimated to have lost $261 billion, or 7 percent of total spending, to fraud and waste in 2012. The numbers in commercial enterprises are equally staggering. Data virtualization can address the fraud and waste challenge very effectively while providing broader data agility benefits to the rest of the enterprise.
Suresh Chandrasekaran, senior vice president at Denodo Technologies, is responsible for global strategy and growth initiatives in addition to operational leadership in other areas. Before Denodo, he served in executive roles as general manager and VP of product management and marketing at leading Web and enterprise software companies and as a management consultant at Booz Allen & Hamilton.