A Brief History of the Cloud

How we use computing infrastructure has changed drastically over the past two decades, moving from buying physical servers to having tools and technologies that make it easy for companies and individual developers to deploy software in the cloud. In his LinuxCon Europe keynote, Dan Kohn, Executive Director of the Cloud Native Computing Foundation (CNCF), provided us with a brief history of the cloud and how CNCF fits with where we are now.

Kohn starts with the year 2000, when you had to buy physical servers before you could deploy a new application, and if you needed more capacity, you bought more servers. In 2001, VMware “had the relatively brilliant idea of virtualizing that server so that each application could have its own virtual environment and you could have multiple different applications sharing the same physical server,” Kohn says. Moving on to 2006, Amazon popularized the idea that you could rent servers by the hour instead of buying them, so you don’t need to pay for more capacity until you actually need it, which can save companies quite a bit of money. In 2009, Heroku made it easy for developers to deploy applications “without having to think of all the details about operating systems and versioning and keeping things up to date, and you didn’t necessarily need to hire the ops staff,” Kohn says.

Next, Kohn shifts from the proprietary technologies that shaped the history of the cloud to open source solutions, starting with OpenStack in 2010, which provides an open source Infrastructure as a Service (IaaS) solution based on VMs that competes with AWS and VMware. Cloud Foundry came along in 2011 to compete with Heroku, providing an open source Platform as a Service (PaaS) using containers. Jumping to 2013, Docker emerged to take technologies that had been around for years and combine them with better user interfaces and marketing, thus bringing containers to the masses.

This brings us up to the present with the 2015 formation of the Cloud Native Computing Foundation (CNCF). Kohn says that “cloud native computing uses an open source software stack to segment applications into microservices, packaging each part into its own container and dynamically orchestrating those containers to optimize resource utilization.” The value propositions from cloud native computing include isolation, no lock-in, unlimited scalability, agility and maintainability, improved efficiency and resource utilization, and resiliency.

To learn more about how you can host your project at CNCF or get more involved in the project, you can visit the CNCF website and watch the video of Kohn’s entire keynote presentation.


Interested in speaking at Open Source Summit North America on September 11–13? Submit your proposal by May 6, 2017. Submit now >>


Not interested in speaking but want to attend? Linux.com readers can register now with the discount code LINUXRD5 for 5% off the all-access attendee registration price. Register now to save over $300!