Install Docker on AWS Linux in 2023

Docker: It’s more than just a buzzword. In today’s cloud-centric world, containerization, led primarily by Docker, has redefined the way applications are developed, shipped, and run. By ensuring consistency across different stages of the application lifecycle, Docker containers make applications portable across various environments, be it a developer’s local setup or a cloud infrastructure like AWS.

AWS Linux: Tailored for the cloud, AWS Linux provides a secure, stable, and high-performance execution environment to develop and run cloud and enterprise applications. Marrying Docker’s container efficiency with AWS Linux’s cloud optimization can prove to be a game changer for many enterprises.

Prerequisites

Before diving into the Docker installation on AWS Linux, it’s crucial to set the stage right. Here’s what you’ll need:

  • AWS Account: If you don’t already have one, sign up for an AWS account on the AWS website. Note that setting up an EC2 instance may incur charges.
  • EC2 instance with AWS Linux: You’ll need an EC2 instance running AWS Linux. If unsure about setting one up, AWS provides a comprehensive guide to help you get started.
  • Basic Linux Command Line Knowledge: This guide assumes you have a foundational understanding of the Linux command line. If not, there are plenty of resources available online to help you get started.
  • Recommended EC2 Instance Specifications: For Docker to run smoothly, consider using a t2.micro instance or higher. Ensure you have at least 1GB of RAM and 8GB of storage.

Setting Up the AWS Linux EC2 Instance

The efficiency of Docker on AWS largely depends on how well you’ve set up your EC2 instance. Here’s how to do it right:

Choosing the Right Instance Type: While t2.micro is suitable for small applications, if you anticipate heavier workloads, consider opting for more powerful instance types. Assess your application’s requirements and select accordingly.

Securing the Instance: Your EC2 instance should be a fortress. Begin by setting up a VPC (Virtual Private Cloud) for added isolation. Use security groups to define what traffic can reach your instance. Always use a key pair (private/public) for authentication; avoid password-based logins.
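
For example, if you manage your instance with the AWS CLI, a minimal sketch of a locked-down setup might look like the following. The group name, VPC ID, and CIDR range are placeholders for your own values, and the commands assume the AWS CLI is already installed and configured:

aws ec2 create-security-group --group-name docker-host-sg --description "SSH access only" --vpc-id vpc-0123456789abcdef0
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 22 --cidr 203.0.113.0/24

This allows inbound SSH (port 22) only from the specified address range; all other inbound traffic is denied by default.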

Accessing the EC2 Instance: Once your instance is up and running, access it using SSH. Typically, the command looks like this:

ssh -i path/to/your-key.pem ec2-user@your-ec2-ip-address

Make sure you replace path/to/your-key.pem with the path to your private key and your-ec2-ip-address with the public IP address of your EC2 instance.
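
If SSH rejects the key with an “unprotected private key file” warning, tighten the key file’s permissions first:

chmod 400 path/to/your-key.pem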

Updating the System

Before we proceed with the Docker installation, it’s essential to ensure that your AWS Linux system is updated with the latest packages. This not only secures your system but also ensures compatibility with newer software.

sudo yum update -y

This command updates all installed packages to their latest versions. Once it completes, reboot the instance so that kernel and core library updates take effect.

sudo reboot

After the reboot, log back into your instance and you’re all set for the next steps.

Installing Docker on AWS Linux

With an updated system, you’re now poised to get Docker up and running:

Install Docker:

AWS Linux makes it straightforward to install Docker. Use the following command:

sudo yum install docker -y
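
Note: on some Amazon Linux 2 AMIs, Docker is shipped through the extras repository. If yum cannot find the docker package, you can try installing it that way instead:

sudo amazon-linux-extras install docker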

Start the Docker Service:

Once Docker is installed, start the Docker service to make sure it runs in the background.

sudo service docker start

Verify Docker Installation:

To ensure that Docker was installed correctly and is running, execute the following:

sudo docker --version

This command should display the Docker version, indicating a successful installation.
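
You can also confirm that the Docker daemon itself is running and responding:

sudo docker info

If the daemon is up, this prints details about the Docker server; if it isn’t, the command reports that it cannot connect to the Docker daemon.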

Configuring Docker User Permissions

By default, running Docker commands requires root privileges. However, typing sudo for every Docker command can be cumbersome. Let’s fix that:

Add User to the Docker Group:

Add your user to the docker group, which is normally created automatically when the Docker package is installed.

sudo usermod -aG docker $USER
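
If this fails because the docker group does not exist, create it first and then rerun the command above:

sudo groupadd docker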

Activate the Changes:

For the group change to take effect, either log out and log back in, or start a new shell session with the group applied:

newgrp docker

Test Docker Without ‘sudo’:

Now, you should be able to run Docker commands without sudo. Try running:

docker run hello-world 

This should pull the ‘hello-world’ Docker image and run it, displaying a welcome message.

Starting and Enabling the Docker Service

Now that Docker is installed, it’s crucial to ensure that it runs seamlessly in the background and starts automatically after system reboots:

Start Docker Service: If you haven’t started the Docker service yet, you can do so with the following command:

sudo systemctl start docker

Enable Docker to Start on Boot:

To make sure Docker starts automatically after the system reboots, use:

sudo systemctl enable docker 

With this, Docker will automatically start every time your AWS Linux instance boots up.
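
You can verify both at any time:

sudo systemctl status docker
systemctl is-enabled docker

The first command shows whether the service is currently active; the second should print “enabled”.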

Testing Docker Installation

Before diving deep into Docker’s world, it’s wise to ensure everything’s set up correctly:

Run the ‘Hello World’ Image:

This is a basic test to ensure Docker can pull and run containers.

docker run hello-world 

Upon execution, you should see a message from Docker, indicating that your installation appears to be working correctly.

Troubleshooting Common Issues:

Permission Denied: If you get a message about the Docker daemon not having permission, make sure your user has been added to the docker group as described in the Configuring Docker User Permissions section above.

Cannot Connect to Docker Daemon: Ensure that the Docker service is running (sudo systemctl status docker).

No Space Left: Docker images can take up significant space. Regularly prune unused data with:

docker system prune
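
By default this removes stopped containers, unused networks, dangling images, and build cache. To also remove every image not used by at least one container, add the -a flag:

docker system prune -a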

Setting Up Docker Compose (Optional)

While Docker is fantastic for running individual containers, Docker Compose allows you to define and run multi-container Docker applications. Here’s how to set it up:

Download Docker Compose:

Fetch the latest version of Docker Compose using curl. Replace 1.29.2 with the version number you want (or check the latest version on Docker’s GitHub page).

sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose

Apply Executable Permissions:

sudo chmod +x /usr/local/bin/docker-compose

Verify Installation: Check the installed version:

docker-compose --version 

This should display the Docker Compose version, confirming that it’s installed correctly.
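
To see Compose in action, here is a minimal, hypothetical docker-compose.yml that runs a single nginx container and maps it to port 8080 on the host:

version: "3.8"
services:
  web:
    image: nginx:latest
    ports:
      - "8080:80"

Save it in an empty directory and run docker-compose up -d from that directory to start the stack, and docker-compose down to stop and remove it.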

Basic Docker Commands for Beginners

Now that Docker is up and running, here’s a quick primer on some basic commands to help you get started (a short end-to-end example follows this list):

List Running Containers:

docker ps

List All Containers (including stopped ones):

docker ps -a

Run a Container:

docker run [image-name] 

For example, to run an nginx server:

docker run nginx

Stop a Container:

docker stop [container-id or container-name]

Remove a Container:

docker rm [container-id or container-name]

List Docker Images:

docker images

Pull a Docker Image:

docker pull [image-name]

Remove a Docker Image:

docker rmi [image-name]
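
Putting a few of these together, here is a short example session: it starts an nginx container in the background, confirms it is running, then stops and removes it. The container name web-test is just an illustrative choice.

docker run -d --name web-test -p 8080:80 nginx
docker ps
docker stop web-test
docker rm web-test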

Securing Docker Installation

Security is paramount, especially when deploying applications in production. Here are some tips to secure your Docker setup:

Regularly Update Docker: Always ensure that Docker is updated to the latest version to benefit from the latest security patches.

sudo yum update docker

Use Trusted Images: Only pull Docker images from trusted sources or official repositories; unvetted images may contain vulnerabilities.

Limit Container Privileges: Avoid running containers with the --privileged flag unless necessary.
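
For instance, rather than using --privileged, you can drop all Linux capabilities and add back only what the workload actually needs; whether a particular image still runs under these restrictions depends on the image:

docker run --rm --cap-drop ALL --cap-add NET_BIND_SERVICE [image-name]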

Implement Docker Bench for Security: It’s a script that checks for dozens of common best practices around deploying Docker containers in production.

git clone https://github.com/docker/docker-bench-security.git 
cd docker-bench-security 
sudo sh docker-bench-security.sh

Enable Docker Content Trust: This ensures that all operations using a remote Docker registry are using signed images.

export DOCKER_CONTENT_TRUST=1
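
To make this setting persist across shell sessions, append it to your shell profile, for example:

echo 'export DOCKER_CONTENT_TRUST=1' >> ~/.bashrc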

Conclusion

Congratulations! You’ve successfully installed Docker on AWS Linux, learned the basics, and taken steps to ensure its security. With Docker and AWS at your fingertips, you’ve unlocked a powerful combo for developing and deploying applications. Whether you’re looking to containerize a small web app or scale a large microservices architecture, your foundation is solid.

Remember, the world of Docker is vast and ever-evolving. Always stay updated, experiment with new tools, and keep optimizing your workflow.

Atiqur Rahman

I am MD. Atiqur Rahman, a BUET graduate and an AWS-certified solutions architect. I have earned six AWS certifications, including Cloud Practitioner, Solutions Architect, SysOps Administrator, and Developer Associate, and I have more than 8 years of experience as a DevOps engineer designing complex SaaS applications.