
Why Containers? The Top 10 Benefits

Lee Hylton / March 23, 2020

Why Containers?

In a previous post, we contrasted the two prominent cloud deployment technologies – virtual machines and containers. The ability to virtualize server hardware kicked off the cloud revolution by making computing power just another utility available on demand. But containerization has taken cloud-based deployment to the next level. The trend in software development these days is to build applications to run on cloud platforms from the start – so-called “cloud native” applications – and containers are the means to do so. Consider just one example of their uptake: Google says it runs everything – from Gmail to YouTube to Search – in containers, and spins up billions of containers each week.

Containerization is a logical confluence of several trends – most notably, Agile- and DevOps-based software development, which favors an approach that modularizes applications into small, independent microservices (see our blog post on this). Microservices make it easy to add, modify or remove features without disturbing the whole application. And, as small, self-contained and portable units of code, containers form the natural deployment units for microservices. This allows individual microservices to be scaled up on demand by spinning up more container instances from the same image. Such containerized, microservice-based workflows increase the velocity of application development and deployment.

Where containers excel

To better see the advantages of containers, it is easiest to contrast them with the older cloud deployment technology – virtualization. As we explained in our earlier post, each virtual machine (VM) runs its own operating system (OS), cleverly partitioned by the hypervisor software to give each VM the impression that it has exclusive access to the server hardware on which it runs.

But therein lies a problem. A VM is heavyweight, because each one comes with its own OS. Not only does that OS often top several gigabytes in size, but it can take up all of the RAM assigned to the VM – even if the applications inside that VM don’t need that much. Thus, if there is a surge in demand for an application, new VMs may have to be deployed. This takes time, as each VM has to be created and its OS and application code initialized. Moreover, each new VM consumes additional storage and memory that may go unused. This leads to poor utilization of the server hardware.

By contrast, a container is a self-contained unit of software that packages the application code (for a microservice) together with the libraries and binaries it needs to run. A container is built to run on a given OS – usually Linux – and thus does not need a separate OS instance for each new container. Also, the container runtime, which takes the place of the hypervisor in VMs, abstracts away dependencies on the chosen OS, such as differences between Linux distributions.
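As one hedged illustration (not part of the original comparison), here is roughly what starting such a container can look like using the Docker SDK for Python (pip install docker), assuming a local Docker daemon; the python:3.12-slim image name is just an example. Everything the process needs ships inside the image, so no per-instance guest OS has to boot.

import docker

client = docker.from_env()  # connect to the local Docker daemon

# The image already bundles the interpreter and libraries the code needs,
# so the host only provides a container runtime - no separate guest OS.
output = client.containers.run(
    "python:3.12-slim",                                    # example image
    ["python", "-c", "print('hello from a container')"],   # command to run
    remove=True,                                           # clean up on exit
)
print(output.decode())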

The Top 10 Benefits

With that contrast in mind, it is easy to see why containers have become so popular. Here are ten reasons:

Size: Containers are sized in megabytes, or less. Thus, one can spin up thousands of containers on a server without paying a separate OS’s worth of overhead for each instance. Consequently, the number of containers running on existing servers can grow substantially before new hardware is needed – a large savings in capital expense (CAPEX) and operating expense (OPEX).

Uniformity: Most development environments are built around a given OS – usually Linux – and its associated tools. As containers are written to work with a specific OS, you can build once for that OS environment.

Portability: The phrase “build once” is only half the story; the other half is “run anywhere.” A containerized microservice can be run on another Linux machine with minimal or no changes, because a container carries all of its dependencies with it wherever it goes. Thus, a containerized microservice can be moved from a developer’s laptop running Ubuntu, to an on-premises server running SUSE Linux, to a public cloud – with little or no friction.
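As a rough sketch of that portability – the image name and the remote host address below are placeholders, and we assume the Docker SDK for Python with the remote machine exposing the Docker API over TCP – the very same image can be started unchanged in both places:

import docker

image = "myteam/orders-service:2.0"   # hypothetical image, already pushed to a registry

# Developer laptop (e.g. Ubuntu): use the local Docker daemon.
laptop = docker.from_env()
laptop.containers.run(image, detach=True)

# On-premises or cloud host (e.g. SUSE Linux) reachable over TCP.
server = docker.DockerClient(base_url="tcp://10.0.0.5:2375")
server.containers.run(image, detach=True)   # same image, no changes needed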

Consistency: DevOps teams typically use a particular programming language (or a small set of them) with its associated tools and frameworks. Because a container is a self-contained piece of code, as long as it can run on the chosen OS, the team does not have to worry about different deployment environments and can concentrate on building their microservice with their preferred language and tools.

Choice: A corollary of portability is that a container runs just as well on any cloud platform, so the choice of cloud provider can be made on business factors such as cost, geographical reach, etc. This freedom to choose is very important to many IT managers.

Elasticity: As demand for a microservice grows (or falls), the number of its containerized instances can be automatically grown (or reduced) with minimal overhead – a key advantage of running on a cloud platform. It takes seconds to add or remove a container, in contrast to the minutes needed to spin up a VM.
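Here is a minimal, hypothetical sketch of such scaling using the Docker SDK for Python; the image name, service label and target count are made up for illustration, and in practice an orchestrator would handle this automatically.

import docker

client = docker.from_env()

def scale_service(image: str, service: str, desired: int) -> None:
    """Grow or shrink the number of running containers for one microservice."""
    running = client.containers.list(filters={"label": f"service={service}"})
    # Add containers until the desired count is reached...
    for _ in range(max(0, desired - len(running))):
        client.containers.run(image, detach=True, labels={"service": service})
    # ...or stop and remove the surplus ones.
    for extra in running[desired:]:
        extra.stop()      # takes seconds, versus minutes for a new VM
        extra.remove()

scale_service("myteam/checkout:1.4", "checkout", desired=5)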

Upgradeability: If a microservice needs to be replaced by a newer version, or its container image is found to be faulty (perhaps because a supporting library has a newly discovered security flaw), the affected containers can be gracefully removed and replaced with the new version. And if a container crashes for some reason, it does not affect the other containers on that server. Container orchestration tools such as Kubernetes – which we’ll cover in a separate post – make this changeover easy.

Agility: Current Agile- and DevOps-based software development has greatly reduced the time between coding, testing and deployment – a practice often called “continuous deployment.” Using containers as the unit of deployment from the very start makes these workflows uniform and frictionless, and many of the steps can be automated with a variety of tools.

Standardization: Google, Docker and other early proponents of containers open-sourced their technology under the governance of the Open Container Initiative (OCI). The OCI has standardized the container image format and runtime, allowing a compliant container to be portable across all major operating systems and platforms.

Training: Despite the trepidation with which cost-conscious CIOs confront new technologies, container technology does not impose any unusual burden on finding and training developers with the right skill set. Almost all developers know Linux, and the features that make containers possible are built into the Linux kernel. The underlying technology has been around for a long time, although its usefulness for deploying applications was widely realized only in the mid-2010s.

Takeaway

These ten items hopefully explain the growing trend toward container-based development. On a cautionary note, not every application can be containerized – but many can. For those that can – and the growing trend is to code new applications as containerized microservices from the start – containers provide the agility needed for today’s continuous build, integration and deployment environments. Contact us today with any questions!