Why Won’t Enterprises Embrace Public Cloud Immediately?

Which Cloud Do I Use? Public or Private?

Who’s providing Public Cloud services?

When we talk about “public cloud”, we’re talking about third-party services provided to enterprises outside the firewall. These services fall into two categories:

  1. Virtual private cloud: Customers still leverage an off-site third-party provider, but they create their own logically isolated pieces of the cloud – thus, virtually private. These are servers, or partitions within servers, that are isolated from other customers and completely controlled by those customers, who configure them as they see fit over a secure virtual private network.
  2. Public cloud: Dynamically provisioned, self-serviced compute resources delivered over the Internet from an off-site (outside the firewall) third-party provider who shares resources among many enterprise customers. All enterprises SHARE the physical resources (i.e., no partitions).
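The “logically isolated” part of a virtual private cloud can be made concrete with a toy example: each tenant gets a private address block that must not overlap any other tenant’s. A minimal sketch using Python’s standard ipaddress module (the tenant names and address ranges are hypothetical):

```python
import ipaddress

# Hypothetical tenant address blocks in a virtual private cloud.
# Logical isolation starts with non-overlapping private address ranges,
# so one tenant's traffic can be segregated from another's.
tenants = {
    "tenant_a": ipaddress.ip_network("10.0.0.0/16"),
    "tenant_b": ipaddress.ip_network("10.1.0.0/16"),
}

def isolated(net_a, net_b):
    """True if the two address blocks share no addresses."""
    return not net_a.overlaps(net_b)

print(isolated(tenants["tenant_a"], tenants["tenant_b"]))  # True
```

Real providers layer VLANs, VPN tunnels, and access controls on top of this, but non-overlapping addressing is the simplest way to picture what “virtually private” means.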

So who’s playing in the “public cloud” space other than, say, Amazon, GoGrid, IBM/Cloudburst, Rackspace/Mosso, Terremark, and SAVVIS? Here are a few others:

  • Google
  • FlexiScale
  • Skytap
  • StrataScale
  • SunGard
  • Layered
  • SliceHost
  • OpSource
  • AT&T
  • Joyent
  • Cirrhus9
  • Verio
  • Force.com
  • 3Tera
  • Zimory

Oh, but wait a minute. If you want to classify the storage IaaS providers as third parties who provide resources on demand over the Internet via a self-service model, then let’s not forget the following, to name a few:

  • Nirvanix
  • 10Gen
  • Virsto
  • Zetta
  • Axcient
  • Ctera
  • Sonian
  • Egnyte
  • Mezeo
  • ParaScale

Why is Public Cloud an Issue?

In my last discussion with James Hamilton of Amazon Web Services, he indicated that Amazon doesn’t see as much reservation about using public cloud as the media is otherwise reporting. See James’s thoughts on his blog.

Peter DeSantis, general manager of Amazon’s EC2 business, is focused on making the economics so desirable that companies will have to begin considering Amazon – albeit probably excluding mission critical and data sensitive applications at first.

My personal estimate is that of the 28,000 or so businesses currently deployed on AWS, a large portion are web/social networks, with a growing number of “data-intensive applications” that require EC2’s data processing power. This leads to the number-one concern with public cloud: the security of my data.


There’s this tiny little issue of data security. IT leaders do not trust online vendors with company data, including everything from customer information to legal documents to intellectual property and trade secrets. And what about compliance? If the data isn’t handled properly and is in violation of Sarbanes-Oxley or HIPAA regulations, who’s responsible? Here are a few examples:

“We see this as a possible solution that needs more time to mature. In my opinion there remains too many questions regarding security [and] data ownership.” (Tom Gainer, CIO of FirstBank Southwest)

“We will wait and see how the security holds up before we will even think about it.” (Paul Vawter, CIO of Ohio Housing Finance Agency)

“It is much more difficult to ensure the security of our patients’ information when it is kept outside of the organization.” (David Van Geest, Director of IT for The Orsini Group)

“What tends to worry people [about cloud computing] are issues like security and privacy of data — that’s definitely what we often hear from our customers,” said Chris Willey, interim chief technology officer of Washington, D.C.

A discussion of security typically leads to regulatory issues that come into play when enterprise applications, and specifically enterprise application data, are moved into cloud-based services. Legislation and regulation in many industries has led to strict guidelines regarding enterprise data, primarily to protect the privacy of individuals. For example, as mentioned above, the Health Insurance Portability and Accountability Act (HIPAA) contains strict provisions on what can be done with individuals’ health records, and what permissions must be explicitly received from those individuals for any transfer of those records. For an enterprise application that contains health records, this presents an open question that must be answered before that application can integrate cloud services into its architecture: How do the regulations within HIPAA apply to cloud-based services, and what restrictions do they place, if any, upon the use of third-party cloud services within enterprise healthcare applications? The answer to this question, and similar questions for regulations in other industries, is evolving rapidly, but it will ultimately be answered by legislation and regulation rather than by technology.

Security, privacy, and compliance are major hurdles that cloud computing vendors must still overcome. Nevertheless, vendors are well aware of these hurdles and are working with governments, regulators, and standards agencies to develop services that are fully compliant.

For governments, large financial institutions, and other high-security environments, outsourcing the data center to a public cloud provider may never make sense. For virtually everyone else, it’s going to become a very attractive option in the next 3-5 years.

Privacy (The Flip Side of the Security Coin)

Before turning to cloud computing applications to conduct business, enterprise executives are thinking twice about the potential for exposure of corporate secrets or legal liabilities, according to a new World Privacy Forum report.

Obviously, companies need to consider a provider’s terms of service, as well as the location of, and restrictions on, information put in the cloud. In some cases, providers have the right to read — and make public — information that is put in the cloud. Information stored in the cloud is also much more accessible to a private litigant or the government. The location of the cloud provider is an important consideration as well, said Robert Gellman, the report’s author. If, for example, the cloud provider is located in the European Union, the data could be permanently subject to EU laws.

In March 2009, InfoWorld published an interview with Brad Templeton, chairman of the Electronic Frontier Foundation, in which he argued that the Fourth Amendment — unreasonable searches and seizures — might not apply to cloud computing.

“The courts have ruled that if you put information in the hands of third parties, even if only for a very specific purpose, you can lose that expectation. So the [U.S. Department of Justice] regularly acts to seize data in third-party hands without warrants — for example, from webmail providers — and this will surely expand to all sorts of cloud data.”

Availability / Reliability / Performance (SLA)

When we put our lifeline on the cloud, we just cannot live without it. What happens if it goes offline due to hardware or software failures or due to distributed denial of service (DDoS) attacks? Although each cloud provider claims their service is reliable, each one suffers service outages. For example, Google suffered a Gmail service breakdown in Europe. Do you have to build your own service by using multiple cloud providers?
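To make “reliable” concrete, it helps to translate an availability SLA into a downtime budget. A quick back-of-the-envelope calculation, assuming a 30-day month (the percentages here are illustrative, not any provider’s actual terms):

```python
# Rough downtime budget implied by an availability SLA, assuming a
# 30-day month. Figures are illustrative, not any provider's real terms.
def monthly_downtime_minutes(availability_pct, days=30):
    total_minutes = days * 24 * 60
    return total_minutes * (1 - availability_pct / 100)

for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% uptime -> {monthly_downtime_minutes(sla):.1f} min/month down")
```

Even a “three nines” (99.9%) commitment still permits roughly 43 minutes of downtime per month, which is exactly why the backup strategies discussed below matter.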

Stress tests have revealed that the infrastructure-on-demand services offered by Amazon, Google, and Microsoft suffer from regular performance and availability issues. These three platforms deliver wildly variable performance results, not to mention that they add and drop features constantly. Response times also vary by a large factor depending on the time of day the services are accessed.

Unplanned outages (temporary or major) are a reality of any cloud computing solution. The inevitable downtime requires organizations to develop strategies and backup plans for how their business needs will be met while their cloud applications, and the data stored and processed by those applications, are unavailable. Understanding the cloud provider’s disaster recovery and business continuity measures, negotiating strong service level agreements and disaster recovery commitments, and implementing various other stopgap measures, such as off-line software synchronization, will help a company weather outages in its cloud computing solutions.
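One of the simplest stopgap measures is client-side retry with exponential backoff, so that a brief outage doesn’t immediately become an application failure. A minimal sketch (the operation and its ConnectionError failure mode are stand-ins, not any provider’s actual API):

```python
import time
import random

# A minimal retry-with-backoff wrapper for calls to a cloud service --
# one small piece of the "stopgap measures" above. The operation and
# its failure mode are stand-ins, not a real provider API.
def call_with_retries(operation, attempts=5, base_delay=1.0):
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # retry budget exhausted; fall back to the DR plan
            # Exponential backoff with jitter to avoid hammering a
            # recovering service with synchronized retries.
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)
```

Retries paper over transient hiccups; anything longer than the retry budget should hand off to the disaster recovery and business continuity measures negotiated in the SLA.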

When it comes to general availability (do we always have Internet access?), the future of cloud computing is not session-sensitive Web pages that become unavailable whenever there’s a hiccup in Internet access, or that lose a user’s form data when a backhoe accidentally cuts a fiber line.

Following the example of Google Gears, we’re going to have to see the next generation of serious Web applications develop an offline component in addition to the standard online component. This offline functionality stores the application locally and caches user data, so that users can continue to work uninterrupted through any hiccup in a Web session or connectivity outage. When Internet connectivity is restored, any work and changes made offline are simply synced up with the online version of the application.
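The offline pattern described above can be sketched as a local cache plus a queue of pending edits that gets replayed when connectivity returns. Everything here (the class name, the in-memory stand-in for the server) is illustrative, not how Gears itself was implemented:

```python
# A toy offline-first pattern: edits are applied to a local cache and
# queued; when connectivity returns, the queue is replayed against the
# server. The in-memory "server" dict is a stand-in for a remote store.
class OfflineDocStore:
    def __init__(self):
        self.local = {}      # local cache the user keeps working against
        self.pending = []    # edits made while offline, in order
        self.server = {}     # stand-in for the remote copy
        self.online = True

    def save(self, doc_id, text):
        self.local[doc_id] = text
        if self.online:
            self.server[doc_id] = text
        else:
            self.pending.append((doc_id, text))

    def reconnect(self):
        # Connectivity restored: replay queued edits in order, then
        # resume writing through to the server.
        self.online = True
        for doc_id, text in self.pending:
            self.server[doc_id] = text
        self.pending.clear()
```

A real implementation also has to handle conflicts (the server copy changing while you were offline), which is where most of the engineering effort goes.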

Legal Jurisdiction

A major concern with cloud computing is the difficulty of determining where data will be stored, and, thus, what courts have jurisdiction and what law governs the use and treatment of such data (i.e., local, state, federal, foreign, etc.). Information sent or received by an organization or individual using a cloud computing service could be physically located in the United States or any other country in the world. How will a cloud computing customer address situations where one country’s reporting or discovery obligations conflict with the data privacy laws of another country? How will a cloud computing customer protect its intellectual property rights against infringement or other wrongful activity when its cloud-based applications are hosted in a country that does not recognize certain intellectual property protection measures? These and other potential conflicts between the various jurisdictions involved in a cloud computing solution should be resolved from the outset of the arrangement.

Licensing and Contractual Issues

Cloud computing requires agreements that provide for a licensing structure and contract terms that may not fit the enterprise interested in using it.

  • Since pricing in cloud computing is typically based on a pay-as-you-go approach, customers need to ensure adequate means for verifying their fee obligations and controls on fee increases.
  • Service levels are key to ensuring that a customer has the needed level of accessibility to its cloud-based IT environment.
  • A customer should carefully consider how it will transition from one cloud provider to another, or away from a cloud computing environment, and what contractual obligations it would need in place to ensure that such a transition occurs smoothly.
  • Any cloud computing agreement should document a comprehensive understanding of each party’s intellectual property rights in the solution, the information stored, the hosted applications, and all developments that result from the cloud computing arrangement.
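The first bullet, verifying fee obligations under pay-as-you-go pricing, amounts to metering your own usage and recomputing the bill. A sketch with a made-up rate card (none of these rates are any provider’s real prices):

```python
# Hypothetical rate card; real providers publish their own, which change.
RATE_CARD = {
    "compute_hours": 0.10,      # $ per instance-hour
    "storage_gb_months": 0.15,  # $ per GB-month stored
    "data_transfer_gb": 0.17,   # $ per GB transferred out
}

def expected_charge(usage):
    """Recompute the bill from independently metered usage."""
    return sum(RATE_CARD[item] * qty for item, qty in usage.items())

# Compare against the provider's invoice and flag any meaningful gap.
usage = {"compute_hours": 720, "storage_gb_months": 50, "data_transfer_gb": 100}
invoice_total = 96.50
discrepancy = invoice_total - expected_charge(usage)
```

The point is contractual as much as technical: the agreement should entitle you to the metering data needed to run exactly this kind of independent check, and should constrain how the rate card can change.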

I don’t know about you, but I’ve never done a deal with a Fortune 1000 that resulted in a “standard legal template”.


Skills Reuse

“Will Cloud Computing platforms allow me to reuse my organisation’s existing development and architecture skills?”: this is a fair and valid question to ask of any new technology model. The answer is “it depends”. Different platforms have been built to deliver different balances of value. If you decide to work with a platform such as Salesforce’s Force.com, your developers will have to learn Salesforce’s proprietary Apex language and familiarise themselves with the extensive built-in code libraries. If you decide to work with a platform such as Amazon’s AWS, your developers may not have to learn any new techniques. However, your architects will have a tougher time working with AWS, because architecture concerns such as component communication, application integration, and application security are much less well catered for “out of the box”, whereas the Force.com platform takes care of them for you. It’s important to counterbalance any nervousness you may have about development or architectural skills requirements with the benefits you may get related to administration skills and resources. Whatever type of Cloud Computing platform you choose, you’ll be able to significantly reduce, or even eliminate, the need for systems administrators.


Integration

If you decide that a given Cloud Computing provider has the requisite security infrastructure in place to allow you to entrust it with your business information, then the next natural question is: “Can I integrate remote systems with others (either remote or on-premise) in a way that’s reliable and fast enough?”


Vendor Lock-in

The most serious question that some will ask themselves about placing computing workloads on Cloud Computing platforms is whether, by doing so, you limit your freedom to select other providers or technology models in the future. Is Cloud Computing a “roach motel”, like the many proprietary platforms that proliferated in the early days of the mainframe and client-server computing eras? Just as with the other potential risks outlined above, it pays to be specific when thinking about lock-in risk.

The answer here is really to ignore the “Cloudiness” of the platforms you’re considering using, and just think about them as development and runtime platforms like any others you might be using in your organisation today (such as .NET/Windows Server, J2EE, COBOL/CICS, PHP, and so on). Examine the specific applications and workloads that you’re considering deploying to a Cloud Computing platform and ask yourself to what degree any potential lock-in will impact your ability to deliver business value in the short term and in the long term. Don’t forget to weigh any risk of potential lock-in against benefits (such as productivity benefits) that may be features of the platform(s) you’re considering. What’s more important to the business?

Keeping things in-house (Private Cloud)

Instead of a public cloud, a government, for example, can build a private cloud and deliver services to agencies in-house. A custom-built private cloud might be worth considering for those with privacy and data ownership issues, and may be an ideal option for tasks that the public model can’t deliver yet.

So think about this before dabbling in the Public Cloud: once you put your information in the hands of any hosted provider, can you truly ever take it back? Let’s say Google decides to stop delivering Google Docs. What’s the solution? What’s going to happen to all my documents? Or let’s say a cloud provider suddenly goes out of business. What then? The cloud disappears. They’re gone. They’re bankrupt. What happened to my data?

Jim Kaskade

Jim Kaskade is a serial entrepreneur and enterprise software executive with over 36 years of experience. He is the CEO of Conversica, a leader in Augmented Workforce solutions that help clients attract, acquire, and grow end-customers. He most recently successfully exited a PE-backed SaaS company, Janrain, in the digital identity security space. Before that, he led a digital application business of over 7,000 people ($1B); before that, a big data & analytics business of over 1,000 people ($250M). He was the CEO of a Big Data Cloud company ($50M); was an EIR at PARC (the Bell Labs of Silicon Valley), which resulted in the spinout of an AML AI company; led two separate private cloud software startups; founded one of the most advanced digital video SaaS companies, delivering online and wireless solutions to over 10,000 enterprises; and was involved with three semiconductor startups (two of which he founded, one of which he sold). He started his career engineering massively parallel processing datacenter applications. Jim holds an Electrical and Computer Science Engineering degree from the University of California, Santa Barbara, with an emphasis in semiconductor design and computer science, and an MBA from the University of San Diego with an emphasis in entrepreneurship and finance.