In today’s technology space, “it works on my system” is a common phrase among developers. But there are many cases where an application that runs on a developer’s system does not run on a tester’s machine or on a production machine, which leads to chaos. Docker was introduced to solve such problems. 💯

Why Docker?

Let me tell you about my personal experience of why Docker is so essential. I was working on a full-stack project with my team, where I had to create an application that used Angular on the frontend, Node.js on the backend, and MySQL as the database. We ran into several issues while building this application with multiple technologies.

The first issue was operating system compatibility. Since I was using multiple technologies, I had to make sure that all of them were compatible with the OS I was using. There were cases where the Angular version did not work with my OS version, so I had to change the OS version multiple times. Because the architecture of the application I was building changed over time, I ran into situations where software and library versions and dependencies had to be upgraded. I faced a lot of compatibility issues before arriving at a stable combination of all the technologies used in my application, and upgrading even a single tool was a risky business. Later, more developers joined the project. Since the technology stack was heavy, they had to run commands for hours to set up an environment compatible with the application, and I had to check and evaluate every developer’s environment to make sure there was no mismatch with the technology versions I was using. Finally, if a developer was not comfortable with the OS I was using to build the application, it was challenging for them to work on it.

All of this made life difficult in developing, building, and shipping the application on time. This is where Docker came into the picture. It solves all the issues I just mentioned.

What is Docker?

Docker is a containerization platform that packages an application and its dependencies together inside a container so that the application works seamlessly in any environment, be it Development, Staging, or Production. It is a tool designed to make it easier to create, deploy, and run applications by using containers. Docker containers are lightweight alternatives to virtual machines; they share the host operating system’s kernel instead of running a full guest OS, and you do not have to pre-allocate RAM for a container as you do for a virtual machine. Docker is an open-source platform for developers, sysadmins, and enterprises to build, ship, and run distributed applications on the fly. To learn more about how Docker works internally, check out the Docker architecture.

Docker Images, Containers, Dockerfile

Docker Image

A Docker image is a template used to run an application. It consists of application code, libraries, tools, dependencies, and so on. Docker images are read-only, immutable files, sometimes also called snapshots. You do not start or run a Docker image; you build a container from it. When a container is created from a Docker image, a writable container layer is added on top of the image layers. A base Docker image can be used to create many other Docker images by applying modifications on top of it. Docker Hub is a repository of Docker images for almost every technology stack; you can pull an image from Docker Hub and start building containers.
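As a quick, hedged illustration of these image concepts (assuming Docker is already installed on your machine), you can pull an image from Docker Hub and look at its read-only layers with the standard CLI commands:

    # pull the official nginx image from Docker Hub
    docker pull nginx
    # list the images available locally
    docker images
    # show the layers the image was built from (each line is one read-only layer)
    docker history nginx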

Docker Containers

Docker containers are executable software packages that include all the dependencies required to run an application. With Docker containers, applications can work efficiently across different computing environments. Below are the key features of Docker containers:

- Lightweight
- Minimal overhead (CPU/IO/network)
- Faster deployments
- Easily scalable
- Decreased storage consumption
- Portable, run it everywhere
- Minimal base OS
- Application isolation
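To get a feel for how lightweight and isolated a container is, here is a small sketch (not from the original article) that starts a throwaway shell in a minimal Alpine Linux container and checks resource usage:

    # start an interactive shell in a minimal Alpine container; remove it on exit
    docker run --rm -it alpine sh
    # in another terminal: live CPU, memory, and network usage of running containers
    docker stats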

Dockerfile

A Dockerfile is a text file that lists the instructions needed to build a Docker image. From a Dockerfile, you build a Docker image; from the Docker image, you create a Docker container.
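Here is a minimal, hypothetical Dockerfile for a Node.js backend like the one mentioned earlier; the file name server.js, the image tag, and port 3000 are illustrative assumptions, not taken from the article:

    # hypothetical Dockerfile for a simple Node.js backend
    # use an official Node.js image from Docker Hub as the base image
    FROM node:18-alpine
    # set the working directory inside the image
    WORKDIR /app
    # copy the dependency manifests first so this layer is cached between builds
    COPY package*.json ./
    # install dependencies into the image
    RUN npm install
    # copy the rest of the application source code
    COPY . .
    # document the port the app listens on
    EXPOSE 3000
    # command the container runs when it starts
    CMD ["node", "server.js"]

You would then build the image with docker build -t my-node-app . and run a container from it with docker run -p 3000:3000 my-node-app (my-node-app is an illustrative name).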

Installing Docker

Docker can be easily installed on various Linux distributions, Windows, and macOS. Check out this post on How to Install Docker on Ubuntu, CentOS, Debian, and Windows. Docker Desktop is also available for Windows and Mac machines; it is an easy-to-install application that helps you build and containerize applications in Windows and Mac environments.
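As one hedged example (assuming a Debian or Ubuntu machine with curl available), Docker publishes a convenience script that installs the engine in a couple of commands; for other platforms, follow the installation post linked above:

    # download and run Docker's official convenience install script
    curl -fsSL https://get.docker.com -o get-docker.sh
    sudo sh get-docker.sh
    # verify the installation
    sudo docker --version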

Docker Editions and Pricing

There are two editions of Docker:

- Community Edition (CE)
- Enterprise Edition (EE)

The Docker Community Edition is open source and free to use. Docker CE takes a “do it yourself” approach, where DevOps engineers containerize their applications on their own. The Docker Enterprise Edition comes in three tiers: Basic, Standard, and Advanced. The Basic tier includes the Docker platform, support, and certification, whereas the Standard and Advanced tiers add container management and Docker security scanning on top of the Basic features. Most people use Docker for free, but if you need advanced features, you can use the paid version; pricing depends on features such as repository management, CI/CD features, user management, developer tools, and support options.

Now that you know the fundamentals of Docker, let me show you a very simple Docker example.

Docker Hello World Container

I am going to pull the hello-world Docker image from Docker Hub and use it to create a Docker container that runs the application. This image contains a simple program that prints a hello message from Docker. The steps are: first, pull the hello-world image onto the Ubuntu machine; then check that the image got pulled; then create a container from the image and execute the application; and finally, list all the Docker containers, where you can see that the hello-world container ran a minute ago. Once that works, start exploring Docker by trying out these fundamental docker commands.
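Since the article’s original command listings are not reproduced here, the standard Docker CLI commands for each of these steps would be:

    # step 1: pull the hello-world image from Docker Hub
    docker pull hello-world
    # step 2: check that the image was pulled
    docker images
    # step 3: create a container from the image and run the application
    docker run hello-world
    # step 4: list all containers, including ones that have already exited
    docker ps -a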

Running Nginx Inside a Docker Container

Here, I will show you how to deploy Nginx inside a container. The command below looks for an Nginx image locally; since there is no local image available, it pulls the Nginx image from Docker Hub. It then creates a container named nginx_geekflare from that image and runs Nginx on port 80. When you list the Docker images available on your system, you can see that the Nginx image was pulled. Listing all the running containers shows that nginx_geekflare is running at 0.0.0.0:80. Go to your browser and open 0.0.0.0:80, and you can see Nginx up and running inside a container.
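Again, the original listings are not preserved, but the standard commands for this walkthrough would be:

    # pull the nginx image (if not present locally), create a container named
    # nginx_geekflare, and publish container port 80 on the host
    docker run --name nginx_geekflare -d -p 80:80 nginx
    # confirm the nginx image is now available locally
    docker images
    # list running containers; nginx_geekflare should be up on port 80
    docker ps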

Common Docker Use Cases

Configuration Simplification: Docker can run on any platform with the help of its configuration, without the actual overhead of a virtual machine. It allows you to put the configuration into code and pass environment variables to cater to different environments, so a single Docker image can be used across environments (as shown in the sketch after this list).

Code Management: Code travels through different environments on its journey from development to production, and each environment differs slightly from the others. Docker eliminates these differences by providing a consistent environment, making development and coding much more comfortable. Because Docker images are immutable, the application environment does not change from development to production.

Improved Development Productivity: Two essential objectives in the development ecosystem are to keep the development environment as close as possible to the production environment and to deliver quality code as quickly as possible. Docker allows the code to run in a container that mirrors the production environment, and unlike a VM, Docker has much lower memory overhead, which allows several services to run side by side. The second goal is achieved by using Docker’s shared volumes to make the application code on the host available inside the container, so a developer can edit the source code in their own editor and see the change reflected in the running container.

Isolation of Applications: There are cases where application isolation is needed, for example, API servers that require different Apache versions and different sets of dependencies. Running the API servers in separate containers is a much better way out.

Debugging Capabilities: Docker provides numerous tools that work well with containers, including the ability to set checkpoints within a container and across different containers, which is quite useful while testing applications.

Rapid Deployment: Docker containers can be created very quickly because a container does not boot an OS; it only runs the application. Once set up, they give you peace of mind that code that has worked once will work in every environment.
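As a small sketch of the configuration and productivity use cases above (the image name my-node-app from the earlier Dockerfile example and the variables APP_ENV and DB_HOST are illustrative assumptions, not from the article):

    # same image, different environment: configuration passed as environment variables
    docker run -d -e APP_ENV=staging -e DB_HOST=staging-db my-node-app
    # development productivity: mount the source code from the host into the container
    # so edits in your local editor are reflected inside the running container
    docker run -d -p 3000:3000 -v "$(pwd)":/app my-node-app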

Conclusion

Docker, with its vast benefits, is becoming a valuable addition to IT infrastructure. I hope the above gives you an idea of what it offers. If you are interested in taking online courses, I would recommend the Docker Mastery online course.
