I know I'm going to need to install/set up the following to get my Dockerized application working within Jenkins:
- The `aws` command line tool, to grab files from S3 (needed in my build for the staging/production `.env` files)
- An instance of the Docker private registry, which the build script uploads to after building a production Docker image
1. AWS command line tool
My AWS server already has permission to talk to the relevant S3 bucket through the magic of instance profiles within AWS. All I need to do is install the `aws` command line tool that my build script uses.
```bash
# Install Python and pip, then update pip and grab virtualenv
sudo apt-get install -y python python-pip
sudo pip install -U pip virtualenv

# Switch to the jenkins user and install awscli into a virtualenv
# in its home directory, so builds can activate and use it
sudo su jenkins
cd ~/
virtualenv .venv
source .venv/bin/activate
pip install awscli
```
And that's it!
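With the CLI installed, the build script can pull the environment files down from S3 before building. Here's a minimal sketch of that step; the bucket and key names are hypothetical, so adjust them to your own setup:

```bash
# Put the virtualenv's `aws` command on the PATH (it was installed as the jenkins user)
source ~/.venv/bin/activate

# Copy the environment files from S3 into the build workspace
# (the bucket name "my-app-secrets" is a placeholder)
aws s3 cp s3://my-app-secrets/staging.env ./staging.env
aws s3 cp s3://my-app-secrets/production.env ./production.env
```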
2. Docker Registry
I also create a private Docker Registry instance to run within Docker on the Jenkins server. In the Shipping Docker series, I go into more detail on this, including setting it up to be backed by an S3 bucket so we don't lose our images if this container crashes (a minimal S3-backed variant is sketched after the basic setup below).
```bash
# Start a container running the Registry image
sudo docker run -d -p 5000:5000 --name registry --restart unless-stopped registry:2

# Test it out; an empty registry returns {"repositories":[]}
curl localhost:5000/v2/_catalog
```
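As mentioned above, the registry can be backed by S3 so images survive the container. Here's a hedged sketch using the registry's environment-variable configuration; the bucket and region are placeholders, and I'm assuming the registry's S3 storage driver picks up credentials from the server's instance profile when no access keys are set:

```bash
# Variant: run the registry with the S3 storage driver instead of local disk
# ("my-registry-bucket" and the region are placeholders; with no access keys
#  set, the driver falls back to the instance's IAM role for credentials)
sudo docker run -d -p 5000:5000 --name registry --restart unless-stopped \
    -e REGISTRY_STORAGE=s3 \
    -e REGISTRY_STORAGE_S3_BUCKET=my-registry-bucket \
    -e REGISTRY_STORAGE_S3_REGION=us-east-1 \
    registry:2
```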
Then I just have Blue Ocean re-run the last job, and we can see that our jobs finish fully using the project's `Jenkinsfile`.
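For context, the "upload to the registry" step inside that build amounts to tagging the production image with the registry's address and pushing it. A quick sketch, with `my-app` standing in for whatever the project's image is actually called:

```bash
# Tag the freshly built production image for the local registry and push it
# ("my-app" is a hypothetical image name)
sudo docker tag my-app:latest localhost:5000/my-app:latest
sudo docker push localhost:5000/my-app:latest

# The catalog should now list the image
curl localhost:5000/v2/_catalog
```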
3. Webhooks
A webhook should be going to our Jenkins server automatically; however, Blue Ocean didn't set one up for me. In the video, we see how to set up a GitHub webhook that fires to our Jenkins installation whenever we push to a repository. This is an organization-level webhook, not one specific to any single repository within the organization.
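Once the webhook is configured, you can sanity-check that Jenkins is reachable by sending it a fake delivery. This assumes the GitHub plugin's default `/github-webhook/` endpoint and a placeholder hostname; a healthy setup should answer with an HTTP 200:

```bash
# Simulate a GitHub "ping" delivery to the Jenkins webhook endpoint
# (jenkins.example.com is a placeholder for your Jenkins URL)
curl -i -X POST https://jenkins.example.com/github-webhook/ \
    -H 'Content-Type: application/json' \
    -H 'X-GitHub-Event: ping' \
    --data '{"zen": "test", "hook_id": 1}'
```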
If you're interested in learning more about Docker and how I use Jenkins with a Docker workflow, check out the Shipping Docker series!