Docker’s impact on engineers’ efficiency

If you have attended a tech conference or read industry publications in the last few years, you’ll have noticed the growing buzz around two terms: ‘Docker’ and ‘container’. While the words are often used interchangeably in the software engineering and IT space, they refer to two different things: Docker is an open-source platform for creating and running software packages called containers, and a container is one of those packages. Docker use has surged in recent years. The 2019 Stack Overflow Developer Survey included questions about Docker adoption for the first time, and respondents named it the third most commonly used platform, after Linux and Windows. Yet even as container adoption grows, I find the technology behind Docker is often misunderstood. In this post, I’ll address some common misconceptions about Docker containers, walk through their benefits, and explain how Docker can be used with Kubernetes.


What is Docker used for?


Docker has become especially useful in the DevOps process as a way to automate some of the manual work of a DevOps engineer. The role typically involves complex tasks such as provisioning servers and configuring them to run the software in your company’s tech stack. Without Docker, an engineer might write a unique configuration file for each server; with Docker, they can write one configuration and reuse it across many instances, avoiding cumbersome manual setup. One example of Docker’s use in industry is the music-streaming service Spotify, which runs a microservices architecture with nearly 300 servers for every engineer on staff. That large number of microservices strained the team’s deployment pipeline. By adopting Docker, teams could move a single container through their CI/CD pipeline and guarantee that the container that passed the build and test stages was the same one that ran in production.
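
To make this concrete, here is a minimal sketch of what such a reusable configuration might look like: a Dockerfile for a hypothetical Node.js service (the base image, file names, and port are illustrative assumptions, not details from Spotify’s setup). The same file can be built and run unchanged on any machine with Docker installed.

```dockerfile
# Hypothetical Dockerfile for a small Node.js service.
# Start from a known base image so every instance gets the same OS libraries.
FROM node:18-alpine

# Set the working directory inside the image.
WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY package*.json ./
RUN npm install --production

# Copy the application source.
COPY . .

# Document the port the service listens on and define how to start it.
EXPOSE 8080
CMD ["node", "server.js"]
```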


What are Docker containers?


Containers package everything required to build and ship an application, and that package is defined by something called a container image. The image is the blueprint for the container: it specifies the operating system, language libraries, environment variables, and anything else the application needs. Multiple containers can be created from the same image. The container itself is a live computing environment, while the image is the set of instructions for setting up that environment; you might think of the image as the recipe and the container as the cake. Docker is the sandbox where developers configure the specific dependencies, operating systems, and libraries their applications and projects need. Because those configured packages can be moved between computing devices, from your laptop to your coworker’s laptop to a cloud server, the same image ensures everyone is running the same operating system libraries and programs.
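
To illustrate the recipe-and-cake relationship, the commands below build one image and then start two independent containers from it (the image name, tag, and ports are hypothetical):

```sh
# Build a single image (the "recipe") from the Dockerfile in the current directory.
docker build -t myapp:1.0 .

# Start two separate containers (the "cakes") from that same image.
# Each is a live, isolated computing environment listening on its own host port.
docker run -d --name myapp-1 -p 8080:8080 myapp:1.0
docker run -d --name myapp-2 -p 8081:8080 myapp:1.0

# List the running containers created from the image.
docker ps --filter "ancestor=myapp:1.0"
```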


Benefits of Docker for developers and IT


Across the board, the use of Docker and containers drives efficiencies for both development and IT teams. When developing applications on their local machines, developers need very specific versions of the software and tools their projects require, and Docker helps guarantee that. Specifically, those benefits include:


·         Onboard teams fast: For engineers starting at a new company, it’s not unusual to spend your first few days on the job working with IT to set up development environments on your machine. Docker gets new team members up to speed quickly by making a full environment easy to replicate: teammates can run the project’s Docker configuration on their system and start contributing to company projects the same day.

·         Straightforward, consistent collaboration: In addition to fast onboarding, Docker simplifies how engineers collaborate on projects by removing the need to hand-assemble a common setup. Everything needed to work with a fellow developer or a DevOps engineer in a different office is already defined in the container.

·         Cost-efficient: Before Docker and containers, you’d typically deploy to a virtual machine (VM): one machine running one piece of software. Even software that would rarely or never be executed still claimed a full VM. With Docker, however, many containers fit on a single instance: a team can run multiple software containers on a single VM, including those that will never be executed in the final product, which makes scaling far more budget-friendly for development teams.

·         Secure, fast deployment: Security threats are everywhere, and one way to mitigate the risk of running untrusted third-party code is to run it inside a container. If something in that code turns out to be harmful, I can simply delete the container without having compromised my entire system. Because a container is an isolated environment, what happens inside it is not persisted to the computer outside the container, so it won’t affect the rest of the machine (a minimal sketch of this follows the list).
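
As a rough sketch of that last point, the commands below run an untrusted script inside a disposable container (the image, directory, and script names are made up for illustration). Nothing the script writes inside the container’s filesystem survives on the host once the container is removed.

```sh
# Run the untrusted script in an isolated container.
# --rm deletes the container (and any changes to its filesystem) as soon as it exits;
# the host directory is mounted read-only so the script cannot modify it.
docker run --rm -v "$PWD/untrusted:/code:ro" python:3.12-slim \
    python /code/third_party_script.py

# If you ran a longer-lived container instead, stop and delete it when you're done;
# the rest of the system is left untouched.
docker stop suspicious-app && docker rm suspicious-app
```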


How Kubernetes is used with Docker 


A term often used in conjunction with Docker is Kubernetes, an open-source platform originally developed at Google for managing and orchestrating containers. Among the most widely used distributions are Google Kubernetes Engine, which runs on Google Cloud, and Red Hat’s OpenShift, which is popular for hybrid cloud deployments.


Why use Kubernetes with Docker containers? 


·         Permits multiple Docker containers to work together: With a small application, there’s likely a single container running on a server and nothing else, which is simple to manage. With larger applications, many containers need to run correctly together, and Docker by itself won’t solve the problem of coordinating them. That’s where Kubernetes comes into play: it lets developers run many containers that communicate with each other.

·         Self-healing: Kubernetes has built-in features to help teams administer many servers. If anything goes wrong when running a container in a Kubernetes environment, say a bug in the code crashes the whole server, Kubernetes automatically detects the failure and brings the container back online.

·         Easy horizontal scaling: Kubernetes provides an easy solution for scaling an app. It can monitor the resources a container is using; if a container uses too much RAM or CPU for some time, Kubernetes automatically launches additional containers to handle the load, and when those extra containers are no longer needed, it shuts them down again (a minimal sketch follows this list).
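
To give a sense of what this looks like in practice, here is a minimal sketch using kubectl (the deployment name, image, port, and thresholds are assumptions for illustration): a Deployment keeps several replicas of a container image running, and an autoscaler adds or removes replicas based on CPU usage.

```sh
# Run three replicas of the container image; Kubernetes restarts them if they crash.
kubectl create deployment web --image=myapp:1.0 --replicas=3

# Put the replicas behind one service so other containers in the cluster can reach them by name.
kubectl expose deployment web --port=80 --target-port=8080

# Scale horizontally and automatically: keep between 3 and 10 replicas,
# targeting roughly 80% CPU utilization.
kubectl autoscale deployment web --min=3 --max=10 --cpu-percent=80
```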

