
From Cloud Skeptical to Cloud Curious to Cloud First

By Darren Cunningham | July 12, 2011 | Article

Whether you believe cloud computing has crossed the chasm or jumped the shark, we definitely seem to have reached a tipping point. (How’s that for cultural references?) While salesforce.com’s CEO Marc Benioff is now saying, “The Cloud is Passé,” for most people working in IT organizations today, cloud computing is very much the here and now. Over the past couple of years I’ve observed three stages of cloud enlightenment:

  • Cloud skeptical: Quick to jump on any outage or negative vendor news, they will take either a technology angle (“application X or platform Y will never move to the cloud”) or a business-model angle (“the TCO over three years is actually higher than with on-premises software”).
  • Cloud curious: Whether by force or by “levitational pull” (opposite of gravitational, right?), these are the tire kickers now realizing that the business groups initially drawn to software-as-a-service (SaaS) solutions might be on to something. Geoffrey Moore would likely call them the “late majority” or “laggards.”
  • Cloud first: Led by what some have called Government 2.0, this group now mandates that you justify “why not cloud?” for any new IT investment.

I recently sat down with Andrew Bartels, Director of IT at a US financial services firm, to learn about his path to cloud enlightenment as well as hear his advice for IT leaders seeking to gain a Cloud First perspective:

When did you first get involved with cloud computing?

Andrew Bartels: I guess you could say that I got involved in the cloud in stages. In the late 1990s and early 2000s I was working in an environment with AS/400s. The back-end ERP still relied on green-screen emulation, meaning that users had to have a "dumb terminal" as well as a PC on their desk. My thought then was to replace them both with a thin client that allowed for three emulation sessions and a Microsoft desktop published over RDP. This solution let users keep the AS/400 keyboard while having access to both the emulation sessions and Windows applications. From my side of the house, this massively reduced the desktop administration footprint.

Cool. What happened next?

Andrew Bartels: In my next position I was tasked with choosing and deploying an ERP system into a branch-based organization. The corporate office was in Maryland, with 25 service locations in six states. My assignment was to deploy an AT&T MPLS network while simultaneously building out an in-house data center at the corporate office north of Baltimore. This time I deployed Citrix ICA and embedded Neoware devices onto each desk in each branch, all connecting back to a Citrix farm in the corporate data center. We deployed a standardized rack into each branch location, which contained a PoE switch, a VoIP switch and a demarc router. There was no server.

Therefore, there was no local server to secure and no local administration overhead. There was not even a local hard drive to fail or become corrupted. Local DHCP was handled by the demarc router. I referred to it as "Company Name in a Box," the idea being that you could simply send out a crew to wire a location, deploy the mini rack, plug in the pre-configured thin clients, and be up and running. This configuration enabled centralized communication and reduced on-site service calls by our in-house techs so drastically that the company got rid of the two fleet vehicles that had previously been allocated to the IT support team. Effectively, I had deployed a private cloud without using that terminology.

Solid ROI. Next?

Andrew Bartels: The next phase of the project truly introduced me to the power of the cloud. I had been looking for a solution for scheduled confirmation calling that did not require supporting additional hardware. I had toyed with the idea of deploying a custom VoiceXML solution using either Voxeo or Plumtree when I read a Wall Street Journal article about a company that was working with the cable companies to revolutionize scheduling and optimization. That company was TOA Technologies. I found the company website and, after a phone call or two, found myself talking with their co-founder and CTO. We shared the same vision of how IT solutions should be deployed. Theirs was what was then referred to as a "hosted solution" (this was before the term "cloud" had spread through daily tech discussions).

Long story short, we had a team of developers build a custom bridge from our existing ERP to their hosted IVR to automate the process of making thousands of outbound interactive scheduling and post-service follow-up calls. Ultimately it worked wonderfully, but it took us three to six months of development to get there. The problem was not that their system or ours lacked the functionality or the data to get the job done. What was difficult was connecting the two together, over the Internet, in a reliable and cost-effective manner. It was at that time that I began to think: "If only there were a solution I could easily deploy to connect two disparate systems across the Internet without needing all these developers."
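
What such a bridge looks like in practice may be easier to see with a small example. The sketch below is hypothetical rather than TOA's actual interface: it assumes an ERP table of appointments and a hosted IVR exposing a simple HTTPS endpoint, and it simply pushes tomorrow's unconfirmed appointments across the wire.

```python
# Minimal sketch of a custom ERP-to-hosted-IVR bridge (hypothetical schema and endpoint).
import json
import sqlite3
import urllib.request

ERP_DB = "erp.db"                                   # stand-in for the ERP's database
IVR_ENDPOINT = "https://ivr.example.com/api/calls"  # hypothetical hosted IVR API

def pending_appointments(conn):
    """Read tomorrow's appointments that still need a confirmation call."""
    cur = conn.execute(
        "SELECT id, customer_phone, scheduled_at FROM appointments "
        "WHERE confirmed = 0 AND scheduled_at >= date('now', '+1 day')"
    )
    for appt_id, phone, scheduled_at in cur:
        yield {"appointment_id": appt_id, "phone": phone, "scheduled_at": scheduled_at}

def queue_confirmation_call(appointment):
    """POST one appointment to the hosted IVR so it can place the outbound call."""
    body = json.dumps(appointment).encode("utf-8")
    req = urllib.request.Request(
        IVR_ENDPOINT, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # a 2xx status means the call was accepted for scheduling

if __name__ == "__main__":
    with sqlite3.connect(ERP_DB) as conn:
        for appt in pending_appointments(conn):
            queue_confirmation_call(appt)
```

Even a toy like this hints at where the three to six months went: authentication, retries, error handling, and keeping two data models in sync are the hard part, and that glue is exactly what hosted integration services set out to eliminate.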

Seeing how well this "hosted solution" worked really helped solidify the idea in my head that businesses need to focus on process and on providing solutions to their customers, not on running a data center. In the meantime, in addition to our data center in Baltimore, I had built out a disaster recovery site, at substantial cost, in a data center on the AT&T MPLS network under Broad Street in Philadelphia. The whole time I was thinking, "I know I have to do this, but I am probably never going to need it. If only I could build this out as a hosted, on-demand solution."

What was your initial view of "the cloud"?

Andrew Bartels: My initial view of the cloud was as a place to use applications or toolsets in a hosted environment. My focus was on automated calling, and that's how I initially saw the cloud: taking advantage of the hosted model to outsource non-business-critical functionality, which would allow in-house teams to focus on supporting the infrastructure and applications core to the business. Effectively, I saw myself as a "user" of those applications.

How has this changed?

Andrew Bartels: In the past I wasn't building and managing infrastructure in the cloud; I was using applications to perform end-user tasks. With the evolution of offerings like Amazon Web Services (AWS), salesforce.com and Informatica Cloud, my thinking has evolved to the point where I believe corporations can build out their entire application infrastructure in the public cloud. With AWS's Virtual Private Cloud, along with third-party solutions like CohesiveFT's VPN-Cubed for AWS, you can not only build out a secure, private virtual infrastructure, but also deploy encryption between the nodes within it.
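
As a rough illustration of what provisioning that kind of private virtual infrastructure can look like, here is a minimal sketch using the classic boto library (the Python AWS SDK of that era). The CIDR blocks are arbitrary examples, credentials are assumed to come from the environment, and an encrypted overlay such as VPN-Cubed would be layered on separately.

```python
# Minimal sketch: carve out a private network in Amazon VPC with classic boto.
# The CIDR blocks are arbitrary examples; AWS credentials come from the environment.
import boto

conn = boto.connect_vpc()

vpc = conn.create_vpc("10.0.0.0/16")                # private address space for the estate
subnet = conn.create_subnet(vpc.id, "10.0.1.0/24")  # one subnet for application servers

print("Created VPC %s with subnet %s" % (vpc.id, subnet.id))
```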

Combine these developments with cloud-based development platforms like Force.com and Heroku, linked together by secure cloud integration bridges, and, from my perspective, you have a pretty good bird's-eye view of the new IT infrastructure map that corporations will be deploying over the next decade and beyond.

Is it safe to say you're "Cloud First"? How do you determine when a cloud-based approach is right for you and your business?

Andrew Bartels: To be honest, I would really struggle to build a philosophical argument against a business integrating cloud-based solutions into its infrastructure. From my personal experience, the only obstacles to businesses moving their applications to the cloud right now are:

  • Cost: Currently, when you run the numbers, if you don't need on-demand capacity, the ROI calculation for a service like Amazon Web Services just does not work. From a raw dollars-and-cents point of view, it is a tough sell against deploying hardware in-house. That said, this obstacle is being eliminated by the month. As more large vendors enter the marketplace, I believe the cost parameters will be driven down to a level where even small and medium-sized corporations will be able to justify the migration to the cloud.
  • Connectivity speed: Speed of connectivity to the cloud is critical, and I don't just mean T1s. To be honest, my rule is the same as the rule for RAM: as much as you can afford and the system can support. It affects the speed of applications as much as it affects your ability to move data around. Corporations need to wrap their heads around this. T1s are not going to cut it anymore; at T1 speed (about 1.5 Mbps), moving a single terabyte takes roughly two months of continuous transfer, while at 100 Mbps it takes about a day. If you have fiber anywhere near your locations, make sure you get it into your data center. Think in terms of 20/20 Mbps and up. Soon, I believe, the new T1 will be 100 Mbps.
  • Lack of familiarity on the part of in-house IT teams: Lastly, it is critical that you and your in-house IT team gain a level of comfort with the new toolsets they will be required to master in your organization's transition to the cloud. Without that confidence, you are going to see internal resistance, because the move takes your team out of their comfort zone. Anyone can spin up a server in AWS; that's not the learning curve. Understanding and becoming comfortable working within a cloud-based virtual infrastructure is where your teams need to gain confidence. My advice? Open an AWS account today and get your team playing around (see the sketch below). Shut it down at the end of the day and restart it again in the morning. You will be amazed how little it costs to get started, and the experience and confidence your team gains will be invaluable.
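
In that spirit, here is a minimal sketch of the "shut it down at night, start it up in the morning" exercise, using the classic boto library; the region and instance ID are placeholders, and the AWS console or any other SDK would do the same job.

```python
# Minimal sketch: stop a sandbox EC2 instance at the end of the day and start it
# the next morning. Classic boto library; the region and instance ID are placeholders.
import sys

import boto.ec2

INSTANCE_ID = "i-0123abcd"  # placeholder: your sandbox instance

def main(action):
    conn = boto.ec2.connect_to_region("us-east-1")  # credentials come from env/config
    if action == "stop":
        conn.stop_instances(instance_ids=[INSTANCE_ID])   # end of the workday
    elif action == "start":
        conn.start_instances(instance_ids=[INSTANCE_ID])  # next morning
    else:
        raise SystemExit("usage: python sandbox.py start|stop")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "")
```

A stopped EBS-backed instance stops accruing compute charges and pays only for its storage, which is why a few days of this kind of tinkering tends to cost very little.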

What advice do you have for other IT professionals seeking to develop a “Cloud First” mindset?

Andrew Bartels: Look for low-hanging fruit. You can begin experimenting with cloud-based, on-demand technology at very low cost, so cost should not be an obstacle to gaining confidence and experience. The first and biggest target for any organization should be e-mail. From my perspective, no business should be running an in-house e-mail server unless a compelling security or compliance reason prevents the migration.

One of the biggest challenges companies face today is ever-increasing file storage needs. This is another area where cloud solutions are really beginning to mature. Do you really need to invest in an in-house SAN to store PST and Office files? There are a number of players out there that will let you extend your storage systems to the cloud. Think tiered storage: do I need gigabit connectivity to all my files, or is Internet speed enough?

The next areas any business should focus on are disaster recovery (DR) and backup. Are you still backing up to tape? Start researching cloud-based backup and data-archiving solutions. Run the numbers: when you look at what Amazon S3 charges per gigabyte, you will be amazed how much you could potentially save over the long term. What would it take to build out your infrastructure in AWS and then create images of your servers, which you can store for pennies?
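
As a concrete first step for that research, here is a minimal sketch of pushing a nightly backup archive to Amazon S3 with the classic boto library; the bucket name and archive path are placeholders, and retention or archiving policies would be layered on top.

```python
# Minimal sketch: upload a nightly backup archive to Amazon S3 with classic boto.
# The bucket name and archive path are placeholders; credentials come from env/config.
import datetime

import boto

BUCKET_NAME = "example-corp-backups"      # placeholder bucket
ARCHIVE_PATH = "/backups/nightly.tar.gz"  # produced by your existing backup job

conn = boto.connect_s3()
bucket = conn.lookup(BUCKET_NAME) or conn.create_bucket(BUCKET_NAME)

key = bucket.new_key("nightly/%s.tar.gz" % datetime.date.today().isoformat())
key.set_contents_from_filename(ARCHIVE_PATH)  # streams the file up to S3

print("Uploaded %s to s3://%s/%s" % (ARCHIVE_PATH, BUCKET_NAME, key.name))
```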

Finally, every time you reach a point where you have to replace or upgrade hardware, you should ask yourself whether there is a way to do it in the cloud. You might be surprised.

Darren Cunningham is VP of marketing for Informatica Cloud.
