Hello, I'm Alan

Docker speedup tips

Friday, February 27, 2015

There are two distinct parts to speeding up a container: the building of the image, and the pushing of bytes across the network.

When building, Docker caches commands as they are run, so only the first modified line of a Dockerfile (and every line after it) will be re-run.

COPY is also cached, so a COPY will only be run again if the contents of the file change. Usually I have two main sections in a Dockerfile: commands that set up the container and rarely change (installing requisite packages and such), and then the commands that build a version of the container for an application deployment.
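As a sketch of that two-section layout (the image name, package list, and paths here are illustrative, not from the original post):

```dockerfile
# Section 1: rarely changes, so these layers stay cached across builds
FROM ruby:2.1.5
RUN apt-get update && apt-get install -y build-essential libpq-dev

# Section 2: changes on every deployment, so it sits at the bottom
# where a cache miss invalidates as little as possible
ADD . /app
WORKDIR /app
CMD ["bundle", "exec", "rails", "server"]
```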

I also make sure to set up a .dockerignore to exclude the .git directory (you shouldn’t need it in a deployment).
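A minimal .dockerignore along those lines might look like this (the entries beyond .git are suggestions, not from the original post):

```
.git
log/
tmp/
```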

Ruby Speedups

The first trick I tried is:

COPY Gemfile /gems/
COPY Gemfile.lock /gems/
WORKDIR /gems
RUN bundle install --deployment --path /gems

ADD . /app

This way bundle install only gets re-run if you change your gems, which is a great speedup; but if you have a lot of gems, it still takes a very long time to run from scratch.

I now run a script that installs gems into a directory mounted from the local filesystem, and then COPY that directory into the container being built.

GEM_PATH=$PWD/.gems
mkdir -p "$GEM_PATH"
cp -p Gemfile* "$GEM_PATH"
# chown because some things get written with mode 0600
docker run --rm -it -v "$GEM_PATH":/gems -w /gems ruby:2.1.5 /bin/sh -c "bundle install --deployment --path /gems && chown -R $(id -u) /gems/*"

I then just need a couple of lines in my Dockerfile to COPY the gems in:

COPY .gems /gems
RUN bundle install --deployment --path /gems

The bundle install here just writes some configuration files so my Rails app runs correctly; it’s very quick because Bundler realises it already has all the gems it expects.
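Concretely, what that second bundle install produces is a small .bundle/config inside the app. With Bundler 1.x and the flags above it looks roughly like this (a sketch from memory, not taken from the post):

```yaml
---
BUNDLE_FROZEN: "true"
BUNDLE_PATH: "/gems"
```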

Try to keep the number of things in your container small, and keep the steps that change often small (so a deployment has a smaller diffset). Go delete those unused files!
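As a rough way to spot deletion candidates, something like this (a sketch, not from the original post) lists the largest files under the build context:

```shell
# List the ten largest files under the current directory:
# candidates for deletion or a .dockerignore entry before building.
find . -type f -exec du -k {} + | sort -rn | head -n 10
```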