
Cloud Strategy: To Disrupt – or Be Disrupted

By Charles Babcock | October 6, 2010

It wasn’t so long ago that Tim Berners-Lee enhanced the Internet with a few simple conventions that let documents be posted, linked to, and read remotely. The first phase of the Web was one of read-only information, and while businesses could adopt it as an additional communication medium, that advantage was spread around fairly evenly and disruption of established businesses was minimal.
The second phase of the Web was the distribution of information plus narrowly defined services. It was characterized by simple interactions and limited computing by the end user, usually amounting to a few entries in a form in a browser window. Some industry segments were disrupted, and a few new super-services, such as the search engine, music downloading, travel, and Webmail, emerged, buttressing the fortunes of a few companies.
A share of consumer retailing moved online. The book distribution/selling business was disrupted, and online auctions and trading platforms became a new medium of exchange. Middlemen, information gatekeepers of all sorts, were replaced by the interactive information delivery mechanisms of the Web.
Cloud computing represents a third and more disruptive phase of Internet computing. This phase consists of information plus broad services and products. It is characterized by deeper interactions powered by unlimited peer-to-peer-style computing, where each party may vary or build out the exchange. Because the end user gains programmatic control, the interaction can, in some instances, go as far as both parties want it to rather than only as far as one party allows.
The nature of the interaction can change as it occurs. The data center can present the end user with new options geared to a particular individual that it seems to recognize. The end user can send back to the data center modified software that tells it where she wants to go. In phase three, a narrow service can be followed by another that was specifically requested by the end user, then by one that was co-built with the end user.
The cloud rolls up the changes of the first two phases and combines them with a powerful engine to do much more in this third phase. In this sense, the cloud computing phase is more likely to undermine established businesses than its predecessors. It promises to be a broadly disruptive technology wave, changing the way companies relate to their customers. As it matures over the next few years, unease with the term cloud computing will disappear, and this disruption will become known as the cloud revolution.
To disrupt or be disrupted
What generally happens to businesses when they are faced with disruptive change? A number of fascinating studies have captured the effects of disruption. In summary, they say that a new, low-cost way of doing things appears, based on new technology underpinnings. Initially, established businesses reject the change because it isn’t as well developed as the products they offer. It also lowers prices and margins and is unsuitable for their core customer base. The early adopters are not their customers anyway, and they represent low-profit prospects.
The change, however, is revised, improved upon, and built out by those who see value in it, and it is adopted by more and more customers. Established businesses see their customers starting to make a shift, so they rush into the new technology. But leadership in the segment has already been assumed by those who pioneered its development. Established businesses decline or fail in the face of this new competition.
A book that graphically captures the sequence and draws measured conclusions from it is “The Innovator’s Dilemma,” by Clayton M. Christensen (HarperBusiness, 2000). Established businesses have a hard time coping with disruptive change because their culture has set up processes and cultivated patterns of thought that serve their existing customers. To serve customers in an emerging market, it’s necessary to take a few people out of that culture, allow them to assess the change, and reward them based on their ability to exploit it.
In my simplistic synopsis, this sub-unit of the company will grow with the use of the disruptive technology and teach the parent organization, when the time comes, how to cope. Companies that do this, or some variation of it, such as acquiring a technology leader that’s at the heart of the change, may easily adapt to disruptive change. Then again, they may not. Think of Digital Equipment Corp. inventing the brilliant AltaVista search engine as a showcase for its Alpha servers, followed by the company’s demise not long after, and it’s hard to be optimistic.
How is cloud computing a disruptive change? The cloud certainly matches up with Christensen’s criterion of a low-cost alternative that is seized upon by emerging markets, but maintained at arm’s length by established companies. It’s often not immediately clear what is supposed to be done with disruptive technologies, since they don’t seem to serve powerful existing buyers, another apt descriptor of the cloud.
The cloud is disruptive in other ways as well. While a mainframe or a large Unix cluster was previously a difficult resource to access, the cloud makes great bursts of power — say, grabbing the services of 12 servers for two hours — relatively cheap. In the past, credentials as a researcher or a specialized business user were needed to access either enterprise or research center high-performance computing. The cloud makes it available to any taker who is willing to use a credit card. A researcher used to spend months or years learning a computer language and building a program that could execute against the data that his project possessed. With aids in the cloud to build programs that will run in the cloud, this process can be simplified, extended to more people, and sped up.
A researcher will be able to count on the strength of the platform to provide some of the most complicated parts of the program, such as linking to a powerful database or moving data from storage to server memory caches at the right instant. The researcher won’t have to produce all the code to do this himself.
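To put the burst-of-power point above in rough numbers, here is a minimal back-of-the-envelope sketch in Python. The $0.10 per server-hour rate and the burst_cost helper are assumptions made for illustration, not quoted cloud prices or any provider's API.

# Back-of-the-envelope cost of a short burst of rented cloud capacity.
# The hourly rate is an assumed, illustrative figure, not a quoted price.

def burst_cost(servers, hours, rate_per_server_hour):
    """Return the total cost of renting `servers` machines for `hours` each."""
    return servers * hours * rate_per_server_hour

if __name__ == "__main__":
    # The example from the text: a burst of 12 servers for two hours.
    total = burst_cost(servers=12, hours=2, rate_per_server_hour=0.10)
    print("12 servers x 2 hours at $0.10/server-hour = $%.2f" % total)

Under those assumed figures the whole burst costs a few dollars, which is the contrast the text draws with owning or queuing for a mainframe or Unix cluster.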
Microsoft and IBM are about to supply cloud frameworks based on their development tools; they will illustrate how the environment in which you develop aids deployment. Salesforce.com has been and remains a pioneer in this space, constantly expanding the tools that a customer can use to enlarge an existing Salesforce application or build a new one for the Force.com platform. Fewer and fewer specialized skills are needed to construct a cloud application. Software that exists in the cloud helps build software to be run there.
In addition, cloud development will extend programming skills to many new participants because the platform itself can invoke many automated steps in the process. The tools used will be simplified, and in some cases users will be given a checklist of choices, unfolding in a carefully planned sequence, that will allow nearly anyone to build simple software applications, then run them.
How will such a resource be used? Don’t assume that you know. Find a way to experiment with the customer. The new markets are discovered, not by focus groups of existing customers who may not even be familiar with a disruptive technology, but by those who use it directly. In many cases, there will even be a generation split as younger people take to the disruptive technology and implement it, while their elders hold back, satisfied that what they’ve got is good enough.
Smart people are already seizing on the possibilities of cloud computing and putting it to use in ways that many established businesses can’t foresee. Passionate individuals who suddenly realize that the cloud provides them with an avenue to do something that they’ve always wanted to do — research a problem, assemble a team, or produce a service — will find ways to do it in the cloud. Small companies with an instinct for what can be done and a knack for creating a profitable cloud service will find venture capital backing.
Tomorrow’s scenarios
As this is being written, Amazon’s Elastic Compute Cloud (EC2) is three years old and rapidly maturing. In the midst of a severe recession, not everybody is paying attention. Cutting costs has been the mentality that has dominated the landscape for the last two years and may continue to assert itself deep into 2010. Perhaps some companies will start to consider the possibilities of cloud computing as the economy revives. Meanwhile, those people who are left running computers and manning consoles at many companies have little time for experimenting or considering the implications of the cloud. For many, its impact won’t be realized until long after its early adopters have had a long head start.
Read more from the new book, Management Strategies for the Cloud Revolution.
Charles Babcock is editor-at-large for InformationWeek. He joined ComputerWorld in 1984, where he served in the posts of New York correspondent and software editor. He became editor-in-chief of Digital News in 1988, then returned to ComputerWorld in 1992 as technical editor. He joined the Ziff Davis publication, Interactive Week, as technology editor, in 1997. He joined InformationWeek in 2003, where he continues to write as an editor-at-large on its staff. In 2002 he was one of three authors of “McBusted,” a Baseline magazine investigation into a failed $170 million supply chain automation project at McDonald’s. The three authors shared a Jesse H. Neal Award for investigative business reporting in 2004.
