Blog

Inside Docker – How and Why It Became So Popular

July 1, 2019

What is Docker? Why has it become so popular? The best way to get all the answers is to explain how it works.

[Image: cargo ship on the ocean]

If we go back 10 years, application architecture was quite different. Back then, everybody was talking about how heavy monolithic systems were: scaling always meant adding more memory, more disk space, more everything, and there was no solution in sight. The difficulty of adding new features to these systems was a favorite topic of conversation among developers.


Along with new methodologies such as agile, the rocket-paced evolution of technology (mainly the internet), and the rise of a new architecture based on microservices, the way we wanted to build, test, deploy and deliver software changed significantly.


Docker appeared as the best fit to handle all of this. Suddenly, it was being used by developers as a vehicle to distribute their software; by DevOps teams, mainly to manage applications and services, isolating them to achieve better compute density; and by enterprises for their CI/CD (continuous integration/continuous delivery) systems, to build software in an agile way and deliver new features faster and more safely – regardless of the environment. Docker became massively accepted.




Docker performs operating-system-level virtualisation and runs software packages called containers. Containers are instances of images, which in turn are built from an instruction file called a Dockerfile. All these concepts will be explained next.
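
As a minimal sketch of how those three pieces fit together (assuming a Dockerfile already exists in the current directory; the image and container names here are made up for illustration):

    # Build an image from the Dockerfile in the current directory
    docker build -t myapp:1.0 .

    # Start a container, i.e. a running instance of that image
    docker run -d --name myapp-container myapp:1.0

    # List the containers currently running
    docker ps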


Docker containers


If you visit Docker’s official page, you will find this excellent definition: “A container is a standard unit of software that packages up code and all its dependencies, so the application runs quickly and reliably from one computing environment to another.”


Another very good definition:


“Docker containers wrap a piece of software in a complete filesystem that contains everything needed to run: code, runtime, system tools, system libraries – anything that can be installed on a server. This guarantees that the software will always run the same, regardless of its environment.”


If we think about how we run software, probably on a server, and about everything required after we've built our application, from VMs (virtual machines) and the Linux distribution that suits our software to the setup of all the other components needed to run our code... it's a lot to think about, right? And we haven't even mentioned hypervisors! Add to the mix everything that has to match without breaking compatibility between different versions, and it's impossible not to feel overwhelmed. And we need to do this over and over again.


Docker does the trick and simplifies this whole process enormously. Great news, right?


Docker images


We went back to Docker's website to find the perfect definition: “A Docker image is a file, comprised of multiple layers, used to execute code in a Docker container. An image is essentially built from the instructions for a complete and executable version of an application, which relies on the host OS kernel. When the Docker user runs an image, it becomes one or multiple instances of that container.”
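
Those layers are easy to see for yourself. Docker ships commands that list local images and show how one was built up, layer by layer (the image name below is the hypothetical one from the earlier example):

    # List the images available locally
    docker images

    # Show the layers that make up a specific image, newest first
    docker history myapp:1.0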


We dockerise our system or app (to “dockerise” is a new industry term for creating a Docker image) by writing the instructions that Docker will read as a “recipe” to build it, as mentioned before.
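
As an illustration, a minimal Dockerfile for a hypothetical Node.js web app might look like this – the base image, file names and port are assumptions for the sake of the example, not a prescription:

    # Start from an official Node.js base image
    FROM node:12-alpine

    # Set the working directory inside the image
    WORKDIR /app

    # Copy dependency manifests and install dependencies
    COPY package*.json ./
    RUN npm install

    # Copy the application source code
    COPY . .

    # Document the port the app listens on
    EXPOSE 8080

    # Command executed when a container starts from this image
    CMD ["node", "server.js"]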


Docker also works with repositories, just as we do for our code, to store and distribute Docker images and make them available to everyone.
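
A sketch of that workflow against a registry (Docker Hub by default; the user and repository names are invented for illustration):

    # Tag the local image for a repository on the registry
    docker tag myapp:1.0 myuser/myapp:1.0

    # Upload the image so others can use it
    docker push myuser/myapp:1.0

    # On any other machine, download the very same image
    docker pull myuser/myapp:1.0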


When we decide to build a system, a platform or even a less complex application, we no longer need to choose the programming language, operating system flavour or application server based on the hardware and infrastructure we have. Docker helps us make the right choice for each case. Adopting a specific technology stack isn't a nightmare anymore. In the microservices world it's common to combine services implemented in different ways, and Docker has improved the integration and communication between those services.
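
A hedged sketch of such a combination, assuming two images have already been built – shop-api (say, a Python service) and shop-web (say, a Node.js front end); the names and ports are invented for illustration:

    # Create a network so containers can reach each other by name
    docker network create shop-net

    # Start the hypothetical Python-based API service
    docker run -d --network shop-net --name api shop-api:1.0

    # Start the hypothetical Node.js front end, which can reach the API simply as "api"
    docker run -d --network shop-net --name web -p 80:3000 shop-web:1.0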


Containers in general, and Docker in particular, have also become popular due to the level of isolation they provide and how easily they accommodate these services, allowing companies to make the best use of the resources available while answering increasingly demanding time-to-market and build requirements.
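
Part of that density comes from being able to cap what each container may consume. A quick sketch of the relevant flags (the limits and names below are arbitrary examples):

    # Cap a container at 256 MB of RAM and half a CPU core
    docker run -d --memory=256m --cpus=0.5 --name small-svc myapp:1.0

    # Watch live resource usage across running containers
    docker stats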


Many systems are now very complex, with a never-ending number of programming languages, tools, libraries, frameworks, platforms, etc. All these artifacts were created with the goal of being adopted by everyone and, somehow, of making our work easier. So, can we live with all these technologies at the same time? Yes, with Docker.
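
For instance, the official language images let us try several runtimes side by side without installing any of them on the host (the image tags below are just examples):

    # Run a one-off Python snippet without installing Python on the host
    docker run --rm python:3 python -c "print('hello from Python')"

    # Run a one-off Node.js snippet the same way, side by side
    docker run --rm node:12 node -e "console.log('hello from Node')"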


by Rui Nascimento

Senior Engineer

Tags
Docker