In today’s fast-paced world, automating the deployment of applications has become an essential practice for DevOps engineers and developers. Docker containers are an increasingly popular way to package and deploy applications because of their portability, consistency, and efficiency. When combined with the powerful automation capabilities of GitHub Actions, you can streamline your workflow to automatically push Docker images to AWS EC2 instances.
This guide will walk you through automating Docker image deployment to an EC2 instance using GitHub Actions. By the end, you’ll have a fully automated workflow that builds a Docker image, pushes it to a repository, and deploys it to an EC2 instance.
Why Automate with GitHub Actions?
GitHub Actions provides the following benefits:
- Automation: Save time by automatically triggering workflows on code changes or pull requests.
- Customization: Build highly customized workflows using pre-built actions or create your own.
- CI/CD Integration: Seamlessly integrate with continuous integration (CI) and continuous deployment (CD) pipelines.
- Ease of Use: Simple YAML syntax makes creating and managing workflows intuitive and straightforward.
Prerequisites
Before starting, make sure you have the following:
- AWS Account: You need an active AWS account to create and manage EC2 instances.
- GitHub Repository: A GitHub repository to host your project.
- Docker: Docker installed on your local machine.
- EC2 Instance: An EC2 instance running on AWS, properly configured with security groups, SSH keys, and other essential settings.
- GitHub Actions Secrets: Store sensitive data like AWS credentials as GitHub Secrets.
Step 1: Create a Dockerfile
The first step is to create a `Dockerfile` in the root directory of your project. This file defines the Docker image for your application.
```dockerfile
# Use an official Python runtime as a parent image
FROM python:3.12-slim

# Set the working directory
WORKDIR /app

# Copy the current directory contents into the container
COPY . .

# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Make port 80 available to the world outside this container
EXPOSE 80

# Define an environment variable
ENV NAME=World

# Run app.py when the container launches
CMD ["python", "app.py"]
```
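Before wiring this into CI, it helps to build and run the image locally to catch problems early. A minimal sketch, assuming Docker is installed on your machine; `my-app` is a hypothetical image name, so use whatever name suits your project:

```shell
# Build the image from the Dockerfile in the current directory.
docker build -t my-app .

# Run it in the background, mapping container port 80 to local port 8080.
docker run -d --name my-app-test -p 8080:80 my-app

# Smoke-test the app, then clean up the test container.
curl -f http://localhost:8080/
docker rm -f my-app-test
```

If `curl` returns your app's response, the image is ready for the automated pipeline.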
Step 2: Launch an EC2 Instance
Ensure you have an EC2 instance running and accessible over SSH. If you don’t have one, follow these steps:
- Go to the AWS Management Console.
- Navigate to EC2 Dashboard and click Launch Instance.
- Choose an Amazon Machine Image (AMI). For this example, use Amazon Linux 2.
- Choose the instance type (e.g., t2.micro for free-tier usage).
- Configure instance details, including Security Groups to allow inbound SSH and HTTP access.
- Launch the instance and note down the public IP address.
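If you prefer the command line, the console steps above can be sketched with the AWS CLI. The AMI ID, key-pair name, and security-group ID below are placeholders you must replace with your own values. Note that the instance also needs Docker installed before the deployment step can work; the last three commands (run on the instance over SSH) show one way to do that on Amazon Linux 2:

```shell
# Launch a t2.micro instance (all IDs here are placeholders).
aws ec2 run-instances \
  --image-id ami-xxxxxxxxxxxxxxxxx \
  --instance-type t2.micro \
  --key-name my-key-pair \
  --security-group-ids sg-xxxxxxxx \
  --count 1

# Then, on the instance itself (via SSH), install and start Docker:
sudo yum install -y docker
sudo systemctl enable --now docker
sudo usermod -aG docker ec2-user   # log out and back in for this to take effect
```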
Step 3: Store AWS Credentials in GitHub Secrets
For GitHub Actions to interact with AWS services, you need to store your AWS credentials securely. Follow these steps:
- Navigate to your GitHub repository.
- Go to Settings > Secrets and Variables > Actions > New Repository Secret.
- Add the following secrets:
  - `AWS_ACCESS_KEY_ID`
  - `AWS_SECRET_ACCESS_KEY`
  - `EC2_INSTANCE_IP` (the public IP address of your EC2 instance)
  - `EC2_USER` (usually `ec2-user` for Amazon Linux or `ubuntu` for Ubuntu AMIs)
  - `SSH_PRIVATE_KEY` (the private key used to SSH into the instance)
  - `DOCKERHUB_USERNAME` and `DOCKERHUB_TOKEN` (your Docker Hub credentials, which the workflow also uses)
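The `SSH_PRIVATE_KEY` value that the workflow uses is easiest to manage as a dedicated deploy key rather than your personal key. A minimal sketch (the file name `deploy_key` is arbitrary): generate a key pair, authorize the public half on the EC2 instance, and paste the private half into the secret.

```shell
# Generate a dedicated key pair for deployments (no passphrase, for CI use).
ssh-keygen -t ed25519 -f deploy_key -N "" -C "github-actions-deploy"

# The public half goes into ~/.ssh/authorized_keys on the EC2 instance:
cat deploy_key.pub

# The private half (the deploy_key file) is what you paste into the
# SSH_PRIVATE_KEY repository secret.
```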
Step 4: Set Up GitHub Actions Workflow
Create a `.github/workflows/deploy.yml` file in your repository to define your GitHub Actions workflow. The following workflow builds the Docker image, pushes it to Docker Hub, and deploys it to your EC2 instance.
```yaml
name: Deploy Docker Image to EC2

on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to Docker Hub
        run: echo "${{ secrets.DOCKERHUB_TOKEN }}" | docker login -u "${{ secrets.DOCKERHUB_USERNAME }}" --password-stdin

      - name: Build and push Docker image
        run: |
          docker build -t your-docker-image-name .
          docker tag your-docker-image-name your-dockerhub-username/your-docker-image-name:latest
          docker push your-dockerhub-username/your-docker-image-name:latest

      - name: Install SSH key
        uses: webfactory/ssh-agent@v0.9.0
        with:
          ssh-private-key: ${{ secrets.SSH_PRIVATE_KEY }}

      - name: Deploy Docker image to EC2
        run: |
          ssh -o StrictHostKeyChecking=no ${{ secrets.EC2_USER }}@${{ secrets.EC2_INSTANCE_IP }} << 'EOF'
          docker pull your-dockerhub-username/your-docker-image-name:latest
          docker stop $(docker ps -a -q) || true
          docker run -d -p 80:80 your-dockerhub-username/your-docker-image-name:latest
          EOF
```
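One caveat in the deploy step above: `docker stop $(docker ps -a -q)` stops every container on the host. If your instance runs anything besides this app, a narrower variant that replaces only this app's container by name is safer; `my-app` below is a placeholder container name:

```shell
# Replace only this application's container, leaving other containers untouched.
docker pull your-dockerhub-username/your-docker-image-name:latest
docker rm -f my-app || true
docker run -d --name my-app -p 80:80 your-dockerhub-username/your-docker-image-name:latest
```

Using a fixed `--name` also makes follow-up commands like `docker logs my-app` straightforward.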
Step 5: Trigger Deployment
Once the workflow is set up, any push to the `main` branch will automatically trigger it. The steps in the workflow include:
- Checkout the Code: Checks out your repository code.
- Build Docker Image: Builds your Docker image using the `Dockerfile`.
- Push Docker Image: Pushes the built image to Docker Hub (using the Docker credentials stored in GitHub Secrets).
- Install SSH Key: Sets up SSH to connect to your EC2 instance.
- Deploy Image: Pulls the latest image from Docker Hub onto your EC2 instance and runs the container.
Conclusion
By integrating Docker, EC2, and GitHub Actions, you’ve successfully automated the deployment of Docker images to AWS EC2. This process enables continuous deployment, saving time and reducing the chance of human error. With GitHub Actions, you can further customize this workflow, add stages like testing, or even extend it to more complex environments like Kubernetes clusters or multi-instance deployments.
Connect with Me:
- YouTube ► S3 CloudHub Channel
- Facebook ► S3 CloudHub Page
- Medium ► S3 CloudHub Blog
- Demo Reference ► GitHub Repository
- Blog ► S3 CloudHub Blogspot
- Dev ► S3 CloudHub on Dev.to