As the ecosystem of interconnected services and applications is accelerated by the rapid adoption of SaaS APIs and third-party integrators, IT teams are grappling with an expanding pool of challenges associated with data locality and shared services. As organizations work to navigate these new waters, however, there are past lessons that still apply in today’s climate that can help to alleviate pain and avoid potholes down the line.
There’s no doubt that the landscape of integration has changed over the past five years. SaaS and other technology providers have worked to publish APIs in order to ease integration between their offering and outside services and applications. The new landscape of REST-based services and well-defined APIs has enabled all kinds of service mash-ups that are rapidly generating business value. Marketing and sales teams are able to integrate Twitter, Facebook and other social media data streams into CRM systems, enabling more targeted lead-generation efforts. E-commerce companies are able to integrate back-end inventory management systems directly to Amazon marketplace to streamline virtual storefronts.
But how different is it really? In this article, we’ll examine some of the emerging challenges associated with this new API ecosystem as well as how lessons from the past can assist in developing a blueprint for a sustainable, scalable model for the API system.
Making connections, not chaos
As SaaS vendors continue to layer service offerings on top of other services, or attempt to integrate with them, the ecosystem grows in complexity. As the number of services snapped together increases, there is an underlying question about the brittleness of these connections and whether they will prove flexible enough to scale with the expansion of an organization’s SaaS and on-premises applications portfolio.
The rise of RESTful APIs has encouraged many organizations to try to build their own data and application integrations, but as we in the data integration space have learned over the years, this may not prove a long-term viable solution.
Within the data integration space, we know from years of experience that point-to-point integrations can be brittle, especially as the number of connections increases. We also know that in terms of integration, in the “build versus buy” war, buy won out a long time ago. The complexity and CAPEX costs associated with developing and executing a long-term integration strategy simply proved too onerous for the majority of IT teams. While the emergence of RESTful APIs has led many to once again attempt to build out an independent integration strategy, we believe that the rate of SaaS and cloud adoption will force a return to an outsourced integration model.
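The brittleness of point-to-point integration is partly a numbers game: with n applications integrated directly with one another, the number of connections to build and maintain grows quadratically, while a hub-based (or outsourced broker) model grows only linearly. A quick sketch of the arithmetic:

```python
def point_to_point(n: int) -> int:
    """Connections needed when every system integrates directly with every other."""
    return n * (n - 1) // 2

def hub_and_spoke(n: int) -> int:
    """Connections needed when every system integrates only with a central hub."""
    return n

# As the application portfolio grows, the maintenance burden diverges quickly.
for n in (5, 10, 25):
    print(f"{n} systems: {point_to_point(n)} direct links vs. {hub_and_spoke(n)} hub links")
```

At 25 applications, that is 300 direct integrations to keep working versus 25 hub connections, which is why "buy" won the build-versus-buy war.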
Even selecting an integration provider should be approached cautiously, however. Today, there are a plethora of third-party service and API integration vendors trying to deliver integration efficiencies by abstracting the underpinnings of service enablement and lowering the effort required to weld together and maintain services. The problem is that the sheer volume of players entering the space and trying to make a mark by filling a specific niche is unsustainable long term. As the sector becomes increasingly crowded, a market shakeout looms ever closer on the horizon. Knowing that not all of these providers will survive long term leads to an interesting partnership and adoption dilemma for both the customers and technology providers trying to innovate and move forward.
Can’t we all just get along?
An important factor in sustaining a healthy yet competitive ecosystem is the transition from data ownership to data stewardship. Service providers need to understand that data is shared in order for services to work together. They need to actively work with end-user and partner organizations to develop a data stewardship approach by which the corporation’s data assets are managed in order to improve their reusability, accessibility and quality.
One area of concern regarding the ability to adopt a data stewardship approach is that, as large SaaS providers (such as Amazon and Salesforce.com) gain market share and play a central role in the development of API standards, they will shake out competitive third-party solutions, particularly those that succeed in garnering meaningful market share. Through sheer dominance, these large players will continue to maintain ownership of data, curbing IT departments’ ability to effectively manage data across the entire application landscape.
The opposite is also possible — a company’s ability to innovate may start to erode and stagnate out of fear of disrupting a maturing and highly dependent ecosystem. Quite simply, the desire to move forward may be slowed by the fear of revenue disruption as more services (and by extension, customers) become directly and indirectly dependent on the availability and predictability of a particular service.
As organizations within a maturing ecosystem begin to look ahead at sustainability of this model, transparency will need to be improved to prevent brittleness. A strong first step is to treat API consumers and third-party enablers — including API mashers and welders — as strategic partners, publishing and sharing road maps, enabling them to pivot or enhance their offerings to keep pace with the solution.
The next step would be eliciting feedback from other entities in the ecosystem and leveraging that input to influence the product road map. The goal of this information exchange is not to give away a vendor’s “secret sauce” or their future strategies, but to keep the full ecosystem growing and healthy. Done the right way, it can create tremendous efficiencies for players along the entire channel.
In the end, those companies that do not play well with others will likely create pain for their direct customers and lose market share as others jump on board with more open solutions. Although it may take time for the market to find the right balance, customer demand will force an equilibrium and greater interoperability over time. For vendors that plan to stay in the game for the long haul, now is the time to begin deciding how and where you want to fit in, as well as your game plan for working with others.
Whose data is it anyway?
Another thing to consider as we move more data into the cloud behind these services is the need for improvements in information governance in order to prevent or mitigate the risk of a new form of vendor lock-in that I term “data lock-in.” Until recently, vendor lock-in was generally an issue of the value the application provider added to the data. For example, even if a technology provider ceased to exist, the customer could keep using the solution, at least until comprehensive upgrades became necessary.
While there are lessons to be learned from the issue of vendor lock-in, the concept gets blurrier with cloud-hosted services and solutions, and the issue of data lock-in. In the on-premises model, data locality is always known, and regardless of what happens to a technology provider, the data remains with the customer. In the new SaaS-oriented model, a vendor can effectively hold a customer’s data hostage, or the data chain can break entirely if the vendor fails or suffers a service disruption. Some reasons for this include:
- Data and service availability may no longer necessarily be in the customer’s control or available at all times.
- SaaS vendors acting as stewards of a company’s data or services could disappear for reasons beyond the end user’s control (bankruptcy, acquisition, technology changes, a few missed payments, outages).
Developing a comprehensive information governance model is critical to preventing these issues. SaaS vendors need to shoulder some of this responsibility, but customers must also look for and demand detailed service level agreements (SLAs) that offer some protection. Organizations need to have a data plan to ensure that, should something happen with a service provider, the impact can be minimized. SaaS vendors that depend on other service providers need to do the same.
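One concrete element of such a data plan is keeping timestamped local copies of data held by a SaaS provider, so that a service disruption or vendor failure does not sever access to the records. The sketch below shows only the local-persistence half; the endpoint and record shape are illustrative assumptions, since in practice the records would come from the vendor's own bulk-export or paginated list API, which varies by provider.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def save_snapshot(records: list, backup_dir: Path = Path("saas_backups")) -> Path:
    """Persist an exported record set as a timestamped local JSON file,
    so a copy of the data survives any provider outage or shutdown."""
    backup_dir.mkdir(exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out = backup_dir / f"snapshot-{stamp}.json"
    out.write_text(json.dumps(records, indent=2))
    return out

# Hypothetical records, as they might come back from a vendor's export API.
path = save_snapshot([{"id": 1, "name": "Acme Corp"}])
```

Run on a schedule, this kind of export turns an SLA promise into something the customer can verify, and bounds the data lost if a provider disappears.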
Be a good citizen of the SaaS ecosystem: document, version and road map
While organizations and their IT departments need to take a strategic approach to adopting and integrating SaaS and cloud-based solutions, SaaS providers need to actively strive to be good citizens of the technology ecosystem. Clear documentation of APIs and standards, sound versioning methodologies (such as maintaining some backwards compatibility and deprecating slowly) and a public road map detailing the future of their services and APIs are crucial, and give dependent users of those services and APIs the greatest chance at sustained success.
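What backwards compatibility with slow deprecation can look like in practice: keep the old API version routable, translate its requests to the current handler, and signal the retirement timeline in response headers so consumers have time to migrate. The handler names, field names and sunset date below are purely illustrative, not drawn from any real API.

```python
# Hypothetical date after which the old v1 API would be removed.
SUNSET_DATE = "2026-06-30"

def handle_v2(params: dict) -> dict:
    """Current version of the endpoint; expects the new field names."""
    return {"customer_name": params["customer_name"], "status": "ok"}

def handle_v1(params: dict) -> dict:
    """Deprecated version, kept alive by adapting old requests to v2."""
    # v1 callers used "name"; translate to the v2 schema instead of breaking them.
    translated = {"customer_name": params["name"]}
    body = handle_v2(translated)
    # Deprecation/Sunset-style response headers tell consumers the clock is running.
    headers = {"Deprecation": "true", "Sunset": SUNSET_DATE}
    return {"headers": headers, "body": body}
```

The old version costs a thin translation layer to maintain, but every consumer keeps working while the published road map and sunset date give them a predictable window to move.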
Careful planning and transparency will give dependent consumers the ability to plan and pivot as needed, reducing breakages and pain points. The result is a better end-user experience. Willingness on the part of SaaS vendors to responsibly share data, strategy and implementation details will go a long way in streamlining adoption and driving long-term innovation.
At the end of the day, SaaS vendors, as well as their partners and end-user customers, bear responsibility for how efficiently these third-party solutions will be adopted and implemented in the enterprise. Working together to identify the bottlenecks and brittle points, as well as where any individual SaaS vendor fits into the ecosystem, can mitigate significant pain points down the road. On the vendor side, publishing road maps, and monitoring those on which a service depends, can minimize the threat of disruption.
On the end-user side, developing a comprehensive information governance plan for shared data and services, and advocating for a data stewardship model, can help reduce data lock-in and improve integration of multiple on-premises and SaaS applications and data. As we’ve learned from years of experience in the enterprise technology sector, if parties across the SaaS spectrum, from providers to IT channel partners to end users, make working together a top priority, this model will continue to grow and thrive.
Robert Fox is the vice president of application development for Liaison Technologies and the architect for several of Liaison’s data integration solutions. Liaison Technologies is a global provider of cloud-based integration and data management services and solutions. Rob was an original contributor to the ebXML 1.0 specification, is the former chair of marketing and business development for ASC ANSI X12 and a co-founder and co-chair of the Connectivity Caucus.