Simplify Development with Docker

Ivan Montiel
Sep 5, 2017 · 4 min read


Starting a new project can be daunting; sharing the configuration for that project with your coworkers can be downright soul-crushing.

That’s why when I start new projects that need any sort of service or machine configuration, I use Docker and Docker Compose.

Before Docker

Time to start a new project. I immediately create a new README file in my favorite editor (it’s Atom, for now) and start writing down my dependencies. I already have nvm installed, so I’d better write that down before I forget. I already have Postgres installed too, so let’s write that down as well, along with the version of Postgres I’m running.

I have the environment set up, so now it’s time to start writing code. I commit and push up my initial set of code. A coworker pulls it down, but it doesn’t work on his machine. He asks me about it, of course, since I’m the one who wrote it and checked in code that doesn’t work. It turns out all he was missing was a Postgres user for the database.

I decide that this is something that may happen to others, so I write a little bash script to create a user using psql. Satisfied, I continue writing new features. But now I have to change the initial schema I wrote. Being a good developer, I write a migration. I push up my code and continue with my day.

Now another coworker asks why she is getting 500 Internal Server Error responses. I forgot to let everyone know about the migration. I let her know directly and then tell everyone else working on the project to run the migration as well.

And this continues ad nauseam.

The Pain Points

That story took a while to get to the point, but it illustrates a lot of the pain points of a typical development environment:

  1. Developers have to keep meticulous track of dependencies, especially for services.
  2. It’s very hard to communicate changes with the team.
  3. Special scripts have to be written (and developers have to remember to run them!)

And this doesn’t even touch on getting CI set up or deploying the environment to servers. Now the team has to make sure the server, CI, and local machine environments are all the same!

There Has to Be a Better Way

The main cause of the above pain points is having to share service installation instructions, libraries, and configuration between multiple people and systems.

Traditionally, the best way to handle this would be to create a virtual machine and share the VM with developers, the CI, and the servers. But updates to the VM are cumbersome, distribution is worse, and maintaining consistency between VMs is almost impossible without tools like Vagrant and Ansible. And as the complexity of your system rises, the complexity of your VM(s) rises as well.

Just Use Docker

These days I set up Docker and Docker Compose early on in the setup process for any new project. That’s not to say that old projects can’t benefit from Docker; it just may be harder to retrofit an existing project into Docker.

Docker is a container platform — it’s a way to package software “in a format that can run isolated on a shared operating system”. This lets developers circumvent the dreaded “works on my machine” problem.

Containers are also lightweight. These self-contained systems are easy to share, reproduce, and change.

The best part about Docker is that each container is set up through a Dockerfile, which gives every machine the exact configuration and services that you need. Docker Compose lets you spin up many different containers, built from your own Dockerfiles or from public images. This lets you create an entire infrastructure locally on your machine to develop against, or on your CI server to run a full integration test.
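
Here’s a minimal sketch of what one of those Dockerfiles could look like, say for a NodeJS service. The node:8 base image, the port, and the server.js entry point are assumptions for illustration, not the setup from any particular project:

# Hypothetical Dockerfile for a NodeJS service (illustrative sketch)
FROM node:8
WORKDIR /usr/src/app
# Install dependencies first so this layer is cached between builds
COPY package.json package-lock.json ./
RUN npm install
# Copy the rest of the source and start the server
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]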

An Example

Using Docker Compose makes it easy to work with other developers on a project. In this example, the project needs nginx, redis, postgres, a NodeJS server, and a React front end.

The README for this project basically boils down to:

  1. Install Docker
  2. Get your secrets set up in an untracked settings.json file or env vars
  3. Run ./launch.sh

The launch.sh file is just a shortcut for the following commands:

# Create some SSL certs since our front end uses WebRTC
bash ./create-ssl-certs.sh
# Build in case there were any changes to the Docker Compose or Dockerfiles
docker-compose build
# Run!
docker-compose up

The docker-compose.yml file is just as simple: it lists the services that Docker will pull down, plus some additional configuration for ports.
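
As a rough sketch, a docker-compose.yml for a stack like this might look something like the following. The service names, image tags, ports, and credentials here are placeholders for illustration, not the real project’s values:

# Hypothetical docker-compose.yml (illustrative sketch)
version: '3'
services:
  web:
    build: .                      # built from the project's Dockerfile
    ports:
      - "3000:3000"
    environment:
      - DATABASE_URL=postgres://app:app@postgres:5432/app
    depends_on:
      - postgres
      - redis
  postgres:
    image: postgres:9.6
    environment:
      - POSTGRES_USER=app
      - POSTGRES_PASSWORD=app
  redis:
    image: redis:3.2
  nginx:
    image: nginx:1.13             # serves the React build and proxies to web
    ports:
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
      - ./certs:/etc/nginx/certs:ro

With a file like that checked into the repository, docker-compose build and docker-compose up (or the launch.sh wrapper above) bring up the whole stack the same way on every machine.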

Wrapping Up

Docker makes the startup and onboarding processes for projects much more straightforward and self-documenting. Developers don’t have to worry about communicating subtle changes to the architecture and can focus on more important work.

Docker also helps developers keep a consistent state across their machines. Changes to the infrastructure require changes to the Dockerfiles. If you need a new service, the best way to test it is to add it to the Docker Compose file. This makes complex systems much more straightforward for developers to work on.
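
For instance, trying out a new service usually means adding just a few lines to docker-compose.yml. This hypothetical snippet adds a background worker built from its own Dockerfile (the worker name and directory are made up for the example):

# Hypothetical addition to docker-compose.yml, nested under the existing services: key
  worker:
    build: ./worker               # assumes a worker/ directory with its own Dockerfile
    depends_on:
      - redis
      - postgres

The next docker-compose build and docker-compose up pick it up, and everyone on the team gets the new service without any extra setup instructions.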

Thanks for reading! If you liked this article, feel free to follow me on Twitter.
