I was toying with blogging about Step 5 to React: Introduction to Redux, but I decided to write something closer to my heart: using ecs and terraform to deploy docker react apps (can I squeeze in any more buzzwords?). This is not an intro to docker, so I assume you are familiar with the basics.
My plan was to use docker to achieve continuous integration and, ultimately, continuous delivery and deployment. But I like to start small, so the first step is to create a docker image on every master merge and save that image.
This image is a stable and deployable package which can be deployed and run on any environment. This is the "golden" build concept - a single build that is used on all steps of the deployment pipeline: uat, stage and prod. This is a good practice to adopt because the exact same build that has gone through all the QA steps is the one that's going out to production.
It gives you confidence the prod deployment at the end will work as expected. This is only possible if the package that comes out of your CI build is immutable - that's where docker comes in. I don't know about you but I get really turned on by this kind of stuff! Let's get to it!
By the end of this blog, we want to be able to create a docker image containing our react app, be able to run lint, tests and the actual app on that image.
We will be using the codebase from my previous blog on react router. It's a minimal react spa with routing, you should be able to easily substitute your own codebase and follow the steps here to use docker.
I'm on a mac so download docker for mac from here. It's a 110mb download so stop reading and do it first, continue reading later. There are some hardware & os requirements. The important ones are:
If you have a previous install of docker-machine, mac docker will ask if you want to migrate data from that install to your new install. I said no because I don't have anything important to migrate as I'll be starting from scratch.
Once installation is done, open terminal and check the version:

docker --version

You should get something like:

Docker version 1.12.0, build 8eab29e
Now we are ready to rock!
We need to create a Dockerfile first. This is the sequence of instructions you tell docker to execute to create the image. It's akin to you manually entering a sequence of shell commands on the terminal of a new linux box when deploying your app. Except docker runs it for you automatically, and then saves the resultant state of that linux box as an image.
So right click on the root directory of your project, add a new file and call it Dockerfile. It should look like this:
# We need a base image to build upon. Use the latest node image from
# dockerhub as the base image so we get node and npm for free
FROM node:latest

MAINTAINER Yus Ng

# Store all our app code in the /src folder, starting from package.json
# first. Why copy package.json first? So we can take advantage of
# the docker build cache. More below.
COPY package.json /src/package.json

# Once we have package.json, do npm install (restricting the loglevel
# to minimise noise)
RUN cd /src && npm install --loglevel error

# Copy all our code (yes including package.json again) to /src.
COPY . /src

# Change directory into the /src folder so we can execute npm commands
WORKDIR /src

# This is the express port on which our app runs
EXPOSE 3000

# This is the default command to execute when docker run is issued. Only
# one CMD instruction is allowed per Dockerfile.
CMD npm start
For more information on docker build cache check the official doco here.
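To see the build cache at work, try rebuilding after touching only application code. The tag names below are just illustrative; the point is which steps docker reports as "Using cache":

```shell
# First build: every instruction runs, including the slow npm install.
docker build -t reactjunkie:v1 .

# Now change a source file (but NOT package.json) and rebuild.
# The FROM, COPY package.json and RUN npm install layers are unchanged,
# so docker reports "Using cache" for them and skips npm install entirely;
# only the "COPY . /src" layer and everything after it are rebuilt.
docker build -t reactjunkie:v2 .
```

This is exactly why the Dockerfile copies package.json on its own before copying the rest of the code: the expensive npm install layer is only invalidated when the dependencies themselves change.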
We need one more file before we can build our image. Go ahead and add a new file to the root directory of your project and call it .dockerignore
Docker will exclude files and directories specified here from the image. It should look like this:
.git
.gitignore
node_modules
npm-debug.log
Let's do it! Go to terminal, cd into your root project folder where your Dockerfile resides and type the following (NOTE the "." at the end is very important!):
docker build -t reactjunkie:v1 .
Docker will build an image named "reactjunkie:v1" using the Dockerfile in the current directory (represented by the "." at the end). You can see it by issuing the command:

docker images
You should see two images: the latest node base image, which was downloaded when docker built our image, and our reactjunkie:v1 image.
Now we have an image, we can start a container based on that image and run our app!
docker run -d -p 8080:3000 reactjunkie:v1
This command tells docker to run the default CMD command specified in the last line of our Dockerfile above. As we will see shortly we can override this default by issuing our own commands.
The -d flag tells docker to detach from the container process after issuing the command so we regain control of our terminal window. The -p flag maps port 8080 on your mac (the host) to port 3000 in the container. Hit [http://localhost:8080](http://localhost:8080) and you should see the app running!
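Since we detached with -d, it's worth confirming the container is actually up and the port mapping took effect. A quick sanity check from the host:

```shell
# List running containers; the PORTS column should show the mapping,
# something like 0.0.0.0:8080->3000/tcp
docker ps

# Fetch the app from the host side of the mapping
curl -i http://localhost:8080
```

If the container isn't listed, drop the -d flag (or run docker logs with the container id from docker ps -a) to see what went wrong on startup.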
The previous step demonstrated how we can run our webapp in a docker container. However, in a CI environment we want to build our image, run lint and tests against it, and only then save the image, prior to ever starting the web app.
I've setup eslint and jest in this project (available on github). To run eslint and tests on our container, type the following:
docker run -i --rm reactjunkie:v1 npm run lint
docker run -i --rm reactjunkie:v1 npm t
The -i flag tells docker to run in "interactive" mode so we can see eslint console output from the container.
The --rm flag tells docker to automatically clean up the container and remove its file system when it exits.
Then npm run lint and npm t are the commands that override the default CMD instruction in our Dockerfile. Docker will start a container based on our image, run the given command, and then clean up and remove the container when the command finishes.
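Putting the pieces so far together, a CI build step could be a small script along these lines (a sketch only; the image tag is a placeholder and your CI tool would likely inject a real version number):

```shell
#!/bin/sh
set -e  # abort on the first failing step

IMAGE=reactjunkie:v1  # placeholder tag; use your build number in CI

# Build the candidate "golden" image
docker build -t "$IMAGE" .

# Lint and test inside throwaway containers built from that exact image
docker run -i --rm "$IMAGE" npm run lint
docker run -i --rm "$IMAGE" npm t

# Reaching this point means the image passed all checks and is
# safe to keep and push to a registry
echo "Build $IMAGE verified"
```

Because set -e stops the script on any non-zero exit code, a lint or test failure fails the build before the image is ever published.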
Now that we have the docker image on our local machine, we need a way to push it to a central place so it can be shared with other developers, build systems and so on. Docker has dockerhub, which does exactly that, but I use ECR, aws' offering.
All the code in this blog is available on github
To be continued...