Managing databases efficiently is crucial in the ever-evolving landscape of modern software development. PostgreSQL, a powerful open-source relational database, has become a go-to choice for many developers due to its robustness, extensibility, and adherence to SQL standards. However, setting up and managing a database can often be cumbersome, especially when ensuring consistent environments across development, testing, and production. Enter Docker, the game-changer in containerization. In this post, we'll explore how Docker simplifies PostgreSQL deployment, making it a breeze to spin up a database, persist data, and manage it with a user-friendly GUI. Whether you're a seasoned developer or just starting, by the end of this guide, you'll have a PostgreSQL instance running in a Docker container, complete with data persistence and the lightweight Adminer interface at your fingertips.
What are the Prerequisites?
To get the most out of this blog post, you should have:
Docker Basics: A fundamental understanding of Docker is essential. If you haven't used Docker before or need a refresher, check out our previous blog post where we covered:
- What containers are and how they differ from virtual machines
- Installing Docker on your system (Windows, macOS, or Linux)
- Basic Docker commands like docker run, docker ps, docker stop, and docker rm
- Understanding Docker images and how to pull them from Docker Hub
Docker Compose: While not strictly necessary (you could do everything with docker run commands), Docker Compose simplifies the process. We'll use it in this tutorial, so make sure you have it installed. It usually comes bundled with Docker Desktop for Windows and macOS; on Linux, you might need to install it separately. You can quickly verify both installations with the commands shown below.
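If you want to double-check your setup before continuing, these standard Docker CLI commands confirm that Docker and Docker Compose are available (the exact output depends on your installation):

# Check that the Docker CLI is installed and the daemon is reachable
docker --version
docker info

# Check Docker Compose (v2 plugin syntax first, standalone v1 binary second)
docker compose version
docker-compose --version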
How to set up the Postgres DB using Docker?
To use the Postgres DB in your project, you just need a connection string, plus Adminer to visualize the tables. First, create a db folder, and inside it create a docker-compose.yml file with the following configuration:
version: '3'
services:
  db:
    image: postgres
    restart: always
    volumes:
      - ./data/db:/var/lib/postgresql/data
    ports:
      - 5432:5432
    environment:
      - POSTGRES_DB=Tutorial
      - POSTGRES_USER=admin
      - POSTGRES_PASSWORD=admin@123
  adminer:
    image: adminer
    restart: always
    ports:
      - 8080:8080
Here's what each part of this configuration does:

- version: '3': Specifies the Docker Compose file format version. Version 3 is widely supported.
- services: Defines the services (containers) you want to run.
- The db service:
  - image: postgres: Pulls the latest PostgreSQL image. For reproducibility it's better to pin a version, e.g. postgres:13 (see the snippet after this list).
  - restart: always: Ensures the container restarts if it crashes or if Docker restarts.
  - volumes: Maps the local ./data/db directory to the container's data directory, ensuring your data persists across container restarts.
  - ports: Maps container port 5432 to host port 5432 so you can connect from your machine.
  - environment: Sets up the initial database name, user, and password.
- The adminer service:
  - image: adminer: Pulls the latest Adminer image. You can pin a version here as well.
  - restart: always: Keeps Adminer available.
  - ports: Exposes Adminer's web interface on localhost:8080.
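For example, pinning the image tags is a one-line change per service. Only the image lines change and the rest of the file stays the same; the tags below are just illustrative, so use whichever versions you have tested against:

services:
  db:
    image: postgres:13   # pinned PostgreSQL release instead of the floating latest tag
  adminer:
    image: adminer:4     # Adminer can be pinned the same way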
To start both containers and their services at once, run the following command from the db folder:
docker-compose up -d
Docker Compose reads the .yml configuration and starts every service it defines; the -d flag runs the containers in detached (background) mode.
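Once the stack is up, a couple of related Docker Compose commands are worth knowing (standard commands, shown here with the service name from the file above):

# Follow the logs of the db service to watch PostgreSQL start up
docker-compose logs -f db

# Stop and remove the containers; the ./data/db directory on the host is kept
docker-compose down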
To check whether the database is running correctly, open localhost:8080 in your browser to reach Adminer. You can also list all running containers with the following command:
docker ps
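If you'd rather verify the database directly from the terminal, you can open a psql session inside the running container. Here's a quick sketch using the credentials defined in the compose file:

# Open an interactive psql shell in the db service
docker-compose exec db psql -U admin -d Tutorial

# Or run a one-off query, e.g. list all databases, and exit immediately
docker-compose exec db psql -U admin -d Tutorial -c '\l'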
Now, to use this database in your project, all you need is a connection string, which will look like this:
'postgresql://admin:admin%40123@localhost:5432/Tutorial'
Note that %40 is simply the URL-encoded form of the @ character in the password admin@123.
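As a quick illustration, here is a minimal sketch of connecting with that string from Python, assuming the psycopg2-binary package is installed (any PostgreSQL client library in your language of choice works the same way):

import psycopg2

# Connect using the connection string from above (libpq URI format)
conn = psycopg2.connect("postgresql://admin:admin%40123@localhost:5432/Tutorial")

# Run a trivial query to confirm the connection works
with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])

conn.close()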
And there you have it! Your PostgreSQL database is now running smoothly in a Docker container, with data safely persisted and easily managed through Adminer. This setup isn't just a time-saver; it's a gateway to modern DevOps practices. You've gained skills in containerization and database management that will serve you in projects big and small. So, dive into your code, build those APIs, and craft amazing data-driven applications. And remember, every bit of knowledge shared makes our dev community stronger. So, spread the word, help a fellow coder, and keep the learning journey alive. Happy coding, and may your next project be your best yet!