As companies grow their data centers to accommodate more cloud services and applications, their resource management practices also grow increasingly complex. CoreOS is a new Linux distribution that uses containers to help manage these massive server deployments.
On May 19, CoreOS joined the Linux Foundation as a corporate member, along with Rackspace Hosting and Cumulus Networks. All three companies are playing a crucial role in the data center transformation and see open source as the linchpin for optimal scalability, efficiency, security and data center savings.
“In the server infrastructure space this shift has spawned new terms such as immutable infrastructure, launched new interest in Linux containers and Docker, and seen the rise of new ‘behind the firewall’ platform-as-a-service products,” said Brandon Philips, CTO at CoreOS, via email.
Here, Philips discusses how CoreOS uses Linux, why they joined the Linux Foundation, the role of Linux in current industry trends, and where computing with Linux is headed.
Linux.com: What does CoreOS do?
Brandon Philips: CoreOS enables the build-out and management of massive server infrastructure. It combines a minimal OS built on the stable Linux kernel with tools to run services across a cluster of machines. This is all packaged and ready to run on your physical gear or virtualized on platforms like KVM, EC2, GCE and many more.
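For illustration, a CoreOS machine is typically configured at boot with a cloud-config file. A minimal sketch might look like the following (the discovery token is a placeholder; a real one is generated per cluster), starting etcd and fleet so the machine can join a cluster:

```yaml
#cloud-config

coreos:
  etcd:
    # Placeholder: generate a fresh token at https://discovery.etcd.io/new
    discovery: https://discovery.etcd.io/<token>
    addr: $private_ipv4:4001
    peer-addr: $private_ipv4:7001
  units:
    # Start the cluster services on boot
    - name: etcd.service
      command: start
    - name: fleet.service
      command: start
```

With etcd providing shared configuration and fleet providing cluster-wide scheduling, machines booted from the same discovery token find each other automatically.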
How and why do you use Linux?
Linux dominates the server market, and many companies find themselves building massive Linux server infrastructure to handle their businesses while having to build the software and tooling in-house to manage the complexity. We built CoreOS to address the needs of people running fleets of servers, and have taken inspiration from the largest internet companies like Google and Facebook while building the products and tools.
We use Linux to support standard x86-64 server platforms and build on top of the stable Linux kernel to provide a stable, modern and secure foundation. On this solid foundation we build tooling to manage updates in a safe manner, run services in Linux containers and distribute workloads across a cluster of machines.
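As a sketch of that workflow (the unit name and container image here are hypothetical examples, not part of CoreOS itself), a service packaged as a container can be described as an ordinary systemd unit and handed to fleet for scheduling somewhere in the cluster:

```ini
# myapp.service -- a hypothetical fleet unit; names are illustrative
[Unit]
Description=My App
After=docker.service
Requires=docker.service

[Service]
# Clean up any stale container before starting (the leading "-" ignores errors)
ExecStartPre=-/usr/bin/docker kill myapp
ExecStartPre=-/usr/bin/docker rm myapp
ExecStart=/usr/bin/docker run --name myapp busybox /bin/sh -c "while true; do echo hello; sleep 1; done"
ExecStop=/usr/bin/docker stop myapp
```

The unit could then be launched with `fleetctl start myapp.service` and located afterward with `fleetctl list-units`, leaving the choice of machine to the cluster scheduler rather than an administrator.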
Why did you join the Linux Foundation?
We joined the Linux Foundation because we believe in open collaboration on core technologies and using those technologies to create better, more secure Internet infrastructure. As participants in the development of Linux and regular attendees of Linux Foundation conferences, it was an easy decision.
What interesting or innovative trends in your industry are you witnessing and what role does Linux play in them?
Companies are building internal platforms for running their businesses and shifting their thinking away from how to divide compute resources into smaller units and toward a model where they treat infrastructure as a big pool of resources. Instead of managing individual VMs, resources are made available via platforms and APIs to teams without the need to manage the operating system directly.
In the server infrastructure space this shift has spawned new terms such as immutable infrastructure, launched new interest in Linux containers and Docker, and seen the rise of new “behind the firewall” platform-as-a-service products.
What other future technologies or industries do you think Linux and open source will increasingly become important in and why?
Everywhere that you find software interacting with hardware, Linux will continue to pop up. My only hope is that as our cars, pockets, wrists and faces get more Linux devices, they will run the latest, most secure versions of the kernel and core software.
Read more about becoming a corporate member of the Linux Foundation.