In the previous chapter, we successfully deployed the Django application on an EC2 instance. However, most of the deployment was done manually, and we didn’t check for regressions when pushing a new version of the application. Fortunately, the whole deployment can be automated using GitHub Actions.
In this chapter, we will use GitHub Actions to deploy automatically on an AWS EC2 instance so that you don’t have to do it manually. We will explore how to write a configuration file that runs tests on the code to avoid regressions, then connects via Secure Socket Shell (SSH) to a server and executes a script that pulls the latest version of the code, rebuilds the images, and restarts the containers. To recapitulate, we will cover CI/CD concepts, the GitHub Actions workflow file, and the server configuration needed for automatic deployment.
The code for this chapter can be found at https://github.com/PacktPublishing/Full-stack-Django-and-React/tree/chap14. If you are using a Windows machine, ensure that you have the OpenSSH client installed on your machine as we will generate SSH key pairs.
Before going deeper into GitHub Actions, we must understand the terms CI and CD. In this section, we will define each term and explain the differences between them.
CI is the practice of automating the integration of code changes from multiple collaborators into a single project. It also concerns the ability to reliably release changes made to an application at any time. Without CI, we would have to manually coordinate the deployment, the integration of changes into the application, and security and regression checks.
Here’s a typical CI workflow:

1. A developer pushes changes to a shared repository.
2. The CI server builds the project.
3. The automated test suite runs against the build.
4. The team is notified of the result, and the changes are merged if the pipeline passes.
You can find many tools for CI pipeline configurations. You have tools such as GitHub Actions, Semaphore, Travis CI, and a lot more. In this book, we will use GitHub Actions to build the CI pipeline, and if the CI pipeline passes, we can deploy it on AWS. Let’s now learn more about CD.
CD is related to CI but most of the time represents the next step after a successful CI pipeline passes. The quality of the CI pipeline (builds and tests) will determine the quality of the releases made. With CD, the software is automatically deployed to a staging or production environment once it passes the CI step.
An example of a CD pipeline could look like this:

1. The CI pipeline builds and tests the code.
2. If the pipeline passes, a release artifact is produced.
3. The artifact is automatically deployed to a staging or production environment.
GitHub Actions and the other tools mentioned for CI also support CD. With a better understanding of CI and CD, let’s define the workflow that we will configure for the backend.
Important note
You will also hear about continuous delivery if you dive deeper into CI/CD; it is closely related to continuous deployment. With continuous delivery, every change that passes the pipeline is kept in a releasable state, but the release to production is triggered manually as part of a release strategy; with continuous deployment, the release to production happens automatically.
Before deploying an application as we did in the previous chapter, we need to write out the steps we will follow, along with the tools needed for the deployment. In this chapter, we will automate the deployment of the backend on AWS. Basically, each time a push is made to the main branch of the repository, the code should be updated on the server and the containers should be rebuilt and restarted.
Again, let’s define the flow, as follows:

1. A push is made to the main branch of the repository.
2. The containers are built and the test suite is run against the pushed code.
3. If the tests pass, we connect to the EC2 instance via SSH.
4. On the server, we pull the latest code, rebuild the images, and restart the containers.
The following diagram illustrates a typical CI/CD workflow:
Figure 14.1 – CI/CD workflow
That is a lot of things to do manually, and thankfully, GitHub provides an interesting feature called GitHub Actions. Now that we have a better idea about the deployment strategy, let’s explore this feature more.
GitHub Actions is a service built and developed by GitHub for automating builds, testing, and deployment pipelines. Using GitHub Actions, we can easily implement the CI/CD workflow shown in Figure 14.1. Before continuing, make sure that your project is hosted on GitHub.
GitHub Actions configurations are made in a file that must be stored in a dedicated directory in the repository called .github/workflows. For a better workflow, we will also use GitHub secrets to store deployment information such as the IP address of the server, the SSH passphrase, and the server username. Let’s start by understanding how to write a GitHub Actions workflow file.
Workflow files are stored in a dedicated directory called .github/workflows. The syntax used for these files is YAML syntax, hence workflow files have the .yml extension.
Let’s dive deeper into the syntax of a workflow file:
name: Name of the Workflow
on: push
jobs:
  build-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Listing files in a directory
        run: ls -a
In our GitHub Actions workflow, we will have two jobs:

- build-test: This builds the containers and runs the test suite
- deploy: This connects to the EC2 instance via SSH and runs the deployment script
The deployment of the application will depend on the failure or success of the build-test job. It’s a good way to prevent code from failing and crashing in the production environment. Now that we understand the GitHub Actions workflow, YAML syntax, and the jobs we want to write for our workflow, let’s write the GitHub Actions file and configure the server for automatic deployment.
In the previous sections, we discussed the syntax of a GitHub Actions file and the jobs we must write to add CI and CD to the Django application. Let’s write the GitHub Actions file and configure the backend for automatic deployment.
At the root of the project, create a directory called .github, and inside this directory, create another directory called workflows. Inside the workflows directory, create a file called ci-cd.yml. This file will contain the YAML configuration for the GitHub Actions workflow. Let’s start by defining the name and the events that will trigger the running of the workflow:
.github/workflows/ci-cd.yml
name: Build, Test and Deploy Postagram

on:
  push:
    branches: [ main ]
The workflow will run every time there is a push on the main branch. Let’s go on to write the build-test job. For this job, we will follow three steps:

1. Injecting the environment variables needed for testing
2. Building the containers
3. Running the tests
Let’s get started with the steps:
.github/workflows/ci-cd.yml
jobs:
  build-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Injecting env vars
        run: |
          echo "SECRET_KEY=test_foo
          DATABASE_NAME=test_coredb
          DATABASE_USER=test_core
          DATABASE_PASSWORD=12345678
          DATABASE_HOST=test_postagram_db
          DATABASE_PORT=5432
          POSTGRES_USER=test_core
          POSTGRES_PASSWORD=12345678
          POSTGRES_DB=test_coredb
          ENV=TESTING
          DJANGO_ALLOWED_HOSTS=127.0.0.1,localhost
          " >> .env
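The env-injection step can be sketched locally to see what it produces: a single multi-line echo appends all the variables to a .env file. This minimal version (with only two illustrative variables) confirms a variable can be read back from the file:

```shell
# Minimal local sketch of the env-injection step: write variables to a
# .env file with one multi-line echo, then read one back with grep.
cd "$(mktemp -d)"
echo "SECRET_KEY=test_foo
ENV=TESTING" >> .env
grep '^ENV=' .env   # prints: ENV=TESTING
```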
The tests will probably fail because we haven’t defined the GitHub secret called TEST_SECRETS.
Figure 14.2 – Testing Github secrets
.github/workflows/ci-cd.yml
      - name: Building containers
        run: |
          docker-compose up -d --build
.github/workflows/ci-cd.yml
      - name: Running Tests
        run: |
          docker-compose exec -T api pytest
Great! We have the first job of the workflow fully written.
Push the changes to trigger the workflow:

git push
Figure 14.3 – Running GitHub Actions
Figure 14.4 – Successful GitHub Action job
Great! We have the build-test job running successfully, which means that our code can be deployed in a production environment. Before writing the deploy job, let’s configure the server first for automatic deployment.
It’s time to go back to the EC2 instance and make some configurations to ease the automatic deployment. Here’s the list of tasks to do so that GitHub Actions can automatically handle the deployment for us:

- Generating SSH credentials (a key pair)
- Adding the public key to the server’s authorized keys
- Registering the private key, the passphrase, the server IP address, and the username as GitHub secrets
- Writing the deployment script on the server
This looks like a lot of steps, but here’s the good thing: you just need to do that once. Let’s start by generating SSH credentials.
The best practice for generating SSH keys is to generate them on the local machine, not the remote machine. We will use terminal commands in the following steps; if you are working on a Windows machine, make sure you have the OpenSSH client installed. The following commands are executed on a Linux machine. Let’s get started with the steps:
ssh-keygen -t rsa -b 4096 -f ~/.ssh/postagramapi -C "your_email@example.com"
Figure 14.5 – Generating SSH keys
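The key generation can also be sketched non-interactively in a throwaway directory. The empty passphrase here is for the demo only; use a real passphrase for the actual deployment key, and the email is illustrative:

```shell
# Local sketch: generate a throwaway RSA key pair non-interactively.
# -N "" sets an empty passphrase (demo only); -q suppresses output.
cd "$(mktemp -d)"
ssh-keygen -t rsa -b 2048 -C "demo@example.com" -f postagramapi -N "" -q
ls postagramapi postagramapi.pub
```

This produces the private key (postagramapi) and the public key (postagramapi.pub) in the current directory.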
cat .ssh/postagramapi.pub | ssh username@hostname_or_ipaddress 'cat >> .ssh/authorized_keys'
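What the piped command does on the remote side can be simulated locally: it simply appends the public key line to the server’s ~/.ssh/authorized_keys file. The key material below is a fake placeholder:

```shell
# Local simulation of the remote side of the piped command: append a
# public key line to an authorized_keys file. The key is fake.
cd "$(mktemp -d)"
mkdir -p .ssh
echo "ssh-rsa AAAAB3FAKEKEY demo@example.com" > postagramapi.pub
cat postagramapi.pub >> .ssh/authorized_keys
wc -l < .ssh/authorized_keys
```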
Figure 14.6 – Registering the private key into GitHub Secrets
You also need to do the same for the passphrase, EC2 server IP address, and username for the OS of the EC2 machine:
Figure 14.7 – Repository secrets
Great! We have the secrets configured on the repository; we can now write the deploy job on the GitHub action.
The benefit of using GitHub Actions is that you can find preconfigured actions on GitHub Marketplace and use them instead of reinventing the wheel. For the deployment, we will use the ssh-action GitHub action, which allows developers to execute remote commands via SSH. This perfectly fits our needs.
Let’s write the deploy job inside our GitHub action workflow and write a deployment script on the EC2 instance:
.github/workflows/ci-cd.yml
  deploy:
    name: Deploying on EC2 via SSH
    if: ${{ github.event_name == 'push' }}
    needs: [build-test]
    runs-on: ubuntu-latest
    steps:
      - name: Deploying Application on EC2
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.SSH_EC2_IP }}
          username: ${{ secrets.SSH_EC2_USER }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          passphrase: ${{ secrets.SSH_PASSPHRASE }}
          script: |
            cd ~/.scripts
            ./docker-ec2-deploy.sh
The script run on the EC2 instance is the execution of a file called docker-ec2-deploy.sh. This file will contain Bash code for pulling code from the GitHub repository and building the containers.
Let’s connect to the EC2 instance and add the docker-ec2-deploy.sh code.
#!/usr/bin/env bash
TARGET='main'
cd ~/api || exit
ACTION_COLOR='\033[1;90m'
NO_COLOR='\033[0m'
echo -e "${ACTION_COLOR}Checking if we are on the target branch${NO_COLOR}"
BRANCH=$(git rev-parse --abbrev-ref HEAD)
if [ "$BRANCH" != "${TARGET}" ]
then
  exit 0
fi
# Checking if the repository is up to date.
git fetch
HEAD_HASH=$(git rev-parse HEAD)
UPSTREAM_HASH=$(git rev-parse ${TARGET}@{upstream})
if [ "$HEAD_HASH" == "$UPSTREAM_HASH" ]
then
  echo -e "${ACTION_COLOR}The current branch is up to date with origin/${TARGET}.${NO_COLOR}"
  exit 0
fi
Once this is done, we check whether the repository is up to date by comparing the HEAD hash and the upstream hash. If they are the same, the repository is up to date and the script exits.
# If there are new changes, we pull them.
git pull origin main

# We can now build and start the containers
docker compose up -d --build

exit 0
Great! We can now give execution permission to the script:
chmod +x docker-ec2-deploy.sh
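To check that the permission change works as expected, you can try the same pattern with a tiny stand-in script (the file contents here are illustrative, not the real deployment script):

```shell
# Sketch: create a tiny stand-in script, mark it executable, and run it.
cd "$(mktemp -d)"
printf '#!/usr/bin/env bash\necho deployed\n' > docker-ec2-deploy.sh
chmod +x docker-ec2-deploy.sh
./docker-ec2-deploy.sh   # prints: deployed
```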
And we are done. You can push the changes made on the GitHub workflow and the automatic deployment job will start.
Important note
Depending on the type of repository (private or public), you might need to enter your GitHub credentials for every remote Git command executed, such as git push or git pull. Ensure you have your credentials configured using SSH or HTTPS. You can learn how at https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token
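One way to avoid credential prompts on the server is to point the origin remote at the repository’s SSH URL so pulls authenticate with a key. A local sketch, with an illustrative repository URL:

```shell
# Sketch: switch a repository's origin remote from HTTPS to SSH so
# pulls authenticate with an SSH key instead of prompting for
# credentials. The repository URL is illustrative.
cd "$(mktemp -d)"
git init -q
git remote add origin https://github.com/user/repo.git
git remote set-url origin git@github.com:user/repo.git
git remote get-url origin   # prints: git@github.com:user/repo.git
```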
Ensure you have a .env file at the root of the project on the AWS server. Here is an example of a .env file you can use for deployment. Don’t forget to change the values of the database credentials and secret keys:
SECRET_KEY=foo
DATABASE_NAME=coredb
DATABASE_USER=core
DATABASE_PASSWORD=wCh29&HE&T83
DATABASE_HOST=localhost
DATABASE_PORT=5432
POSTGRES_USER=core
POSTGRES_PASSWORD=wCh29&HE&T83
POSTGRES_DB=coredb
ENV=PROD
DJANGO_ALLOWED_HOSTS=EC2_IP_ADDRESS,EC2_INSTANCE_URL
Ensure you replace EC2_IP_ADDRESS and EC2_INSTANCE_URL with the values of your EC2 instance. You will also need to allow TCP connections on port 80 so the EC2 instance can accept HTTP requests and the whole configuration works.
Figure 14.8 – Allowing HTTP requests
You can also remove the port 8000 rule, as NGINX handles the redirection of HTTP requests to 0.0.0.0:8000 automatically.
With the concept of CI/CD understood and the GitHub Actions workflow explained and written, you now have all the tools you need to automate deployment on EC2 instances, or on any server. Now that the backend is deployed, we can move on to deploying the React frontend, not on an EC2 instance but on AWS Simple Storage Service (S3).
In this chapter, we have finally automated the deployment of the Django application on AWS using GitHub Actions. We have explored the concepts of CI and CD and how GitHub Actions allow the configuration of such concepts.
We have written a GitHub Actions file with jobs to build the containers and run the test suites. If these steps succeed, we run the deploy job, which connects to the EC2 instance and runs a script to pull the changes, build new images, and restart the containers.
In the next chapter, we will learn how to deploy the React application using a service such as AWS S3.