When people talk about the evolution of the internet, they might wonder what life was like “B.C.”: before cloud. The consensus? Unequal. In “B.C.” times (circa 1995–2006), startups had to be very thorough (and lucky) to anticipate demand for computing, storage, and server capacity. If demand exceeded what their servers could handle, the servers crashed; if demand fell short, idle capacity tied up valuable capital. Large organizations with enterprise datacenters had the edge, namely the resources to manage server setup and scalability. Amazon, a hypergrowth startup in the early 2000s, solved the problem by creating well-documented APIs that let internal teams access common infrastructure services. In 2006, it leveled the field by offering the same service to third parties as the Amazon Elastic Compute Cloud, ushering in the era of cloud computing.
Businesses of all sizes jumped all over cloud computing, defined by Gartner as “scalable and elastic IT-related capabilities, provided as a service to third parties over the internet.” With cloud, not only did they have to invest far less capital and fewer resources in a new project, they also faced fewer operational problems than with in-house infrastructure. Beyond “Infrastructure as a Service,” or IaaS (renting storage, computing, firewall/security, and networking), cloud providers introduced offerings such as “Platform as a Service,” or PaaS (IaaS coupled with an operating system, database management, development tools, etc.), and “Software as a Service,” or SaaS (applications hosted over the internet). Cloud computing is now growing exponentially: the worldwide public cloud services market is estimated at $246B in 2017, with the top four providers (Amazon 34%, Microsoft 11%, IBM 8%, and Google 5%) holding the majority of market share, and it is expected to reach $383B by 2020.
Even with the cloud revolution, though, enterprise datacenters aren’t going the way of the dinosaurs just yet. Why? Three main reasons: data/IP security, migration complexity, and the emergence of hybrid clouds. In the rest of this blog post, we examine each of these drivers.
The first major barrier to 100% migration to public clouds is data and IP security, even though leading cloud providers have invested heavily in securing both “data in transit” and “data at rest.” AWS offers over 1,800 security controls, including Amazon Macie, a machine-learning service that can “automatically discover, classify, and protect sensitive data on the Amazon S3 cloud.” Google offers a Data Loss Prevention API for “fast, scalable classification and optional redaction for sensitive data elements like credit card numbers, names, social security numbers, passport numbers, U.S. and selected international driver’s license numbers, and phone numbers.” However, nothing is foolproof. The Pentagon recently learned this the hard way when 100GB of U.S. Army and NSA intelligence data was exposed on its public cloud. Organizations are understandably skittish about parking their critical systems, confidential data, and intellectual property with someone else in a remote datacenter. In-house, co-located enterprise datacenters offer comfort and security.
The second hurdle to cloud migration? It’s hard. Moving workloads out of enterprise datacenters is costly and complicated, especially for organizations with legacy systems. Large organizations may also not feel the move is worth the loss of control. Jo Harder warns as much, using the analogy of buying vs. renting a condo: “…Much like living within the confines of a building owned by someone else, it may not be possible to host workloads in the same way as your own environment. While it is likely that the cloud offers new and better options for security, monitoring, and analytics, there may also be some unexpected limitations based on the inherent multi-tenancy associated with cloud. Further, some level of control is relinquished when renting a virtual cloud condo.”
The third reason is the development of new “hybrid cloud” configurations: on-site enterprise datacenters handling key applications and steady computing workloads, complemented by a public cloud (or clouds) for fast-growing or high-variability demand. This “best of both worlds” solution preserves existing datacenters and in-house expertise while also bringing the newest cloud technologies to bear. Hybrid clouds are increasingly popular and particularly well suited to specific company or industry needs. ZDNet illustrates how hybrid clouds are a terrific fit for trading floors: “Pushing trade orders through the private cloud infrastructure and running analytics on trades from the public cloud infrastructure greatly decreases the amount of physical space needed for the latency-sensitive task of making trade orders. This is crucial for data security, as well. Threshold-defined trading algorithms are the entire business of many investment firms. Trusting this data to a public cloud provider is, to most firms, an unnecessary risk that could expose the entire underpinnings of their business.”
Cloud computing has been a tremendous achievement, enabling small companies to innovate and compete with large organizations. But “the new” doesn’t always displace “the old.” Enterprise datacenters still have a lot to offer and aren’t becoming extinct anytime soon. Coexistence with cloud technologies may be the next evolutionary step!