Maintaining code quality across a growing project is a nightmare if you’re relying solely on manual peer reviews. In my experience, catching a critical bug during a PR is a win, but catching it before the PR via static analysis is a superpower. That is where SonarQube comes in.
If you are looking for a way to get your environment up and running without wrestling with complex installation scripts, this sonarqube docker compose tutorial will show you exactly how to orchestrate SonarQube and its required PostgreSQL database using Docker.
Whether you are a solo dev or part of a small team, having a local quality gate ensures that technical debt doesn’t accumulate unnoticed. If you’re completely new to the tool, I highly recommend checking out my sonarqube tutorial for beginners before diving into the infrastructure side.
Prerequisites
Before we start the deployment, ensure you have the following installed on your machine:
- Docker Desktop: Ensure it is running and has at least 4GB of RAM allocated (SonarQube is resource-intensive).
- Docker Compose: Included by default in Docker Desktop for Windows and Mac.
- Basic Terminal Knowledge: Ability to navigate directories and run shell commands.
Step-by-Step SonarQube Docker Compose Setup
Step 1: Create the Project Structure
I always suggest keeping your infrastructure files organized. Create a dedicated folder for your SonarQube setup to avoid permission issues with Docker volumes.
```shell
mkdir sonarqube-setup
cd sonarqube-setup
mkdir sonarqube_data sonarqube_extensions sonarqube_logs postgres_data
```
Step 2: Configure the Docker Compose File
The magic happens in the docker-compose.yml file. SonarQube requires a database to store its analysis results; I prefer PostgreSQL for its reliability and compatibility. Create a file named docker-compose.yml and paste the following configuration:
```yaml
version: '3.8'

services:
  sonarqube:
    image: sonarqube:community
    depends_on:
      - db
    ports:
      - "9000:9000"
    networks:
      - sonarnet
    environment:
      - SONAR_JDBC_URL=jdbc:postgresql://db:5432/sonar
      - SONAR_JDBC_USERNAME=sonar
      - SONAR_JDBC_PASSWORD=sonar
    volumes:
      - ./sonarqube_data:/opt/sonarqube/data
      - ./sonarqube_extensions:/opt/sonarqube/extensions
      - ./sonarqube_logs:/opt/sonarqube/logs

  db:
    image: postgres:15
    networks:
      - sonarnet
    environment:
      - POSTGRES_USER=sonar
      - POSTGRES_PASSWORD=sonar
      - POSTGRES_DB=sonar
    volumes:
      - ./postgres_data:/var/lib/postgresql/data

networks:
  sonarnet:
    driver: bridge
```
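Since YAML indentation is significant, it is worth parse-checking the file before starting anything. A small helper, assuming the Compose CLI's `config --quiet` validation mode (available in both docker-compose v1 and the v2 plugin):

```shell
# Parse-check docker-compose.yml without starting any containers.
# Exits non-zero on malformed YAML (bad indentation, unknown keys)
# and reports the offending line on stderr.
validate_compose() {
  docker-compose -f docker-compose.yml config --quiet
}
```

Usage: `validate_compose && echo "compose file OK"`.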
Step 3: Launch the Infrastructure
Run the following command in your terminal to start the services in detached mode:
```shell
docker-compose up -d
```
Wait about 60-90 seconds. SonarQube takes a moment to initialize the database and start its embedded Elasticsearch engine. You can monitor the progress by running `docker-compose logs -f sonarqube`.
Once the logs indicate the server is up, navigate to `http://localhost:9000` in your browser. The default credentials are `admin` / `admin`; you will be prompted to change the password immediately after your first login.
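Rather than eyeballing the logs, you can poll the server's status endpoint. A minimal sketch, assuming the public `/api/system/status` web API endpoint; the 60-attempt budget and 5-second interval are arbitrary choices:

```shell
# Extract the value of the "status" field from the JSON on stdin
# (expected values include STARTING, UP, DOWN, DB_MIGRATION_NEEDED).
sonar_status() {
  grep -o '"status":"[A-Z_]*"' | head -n1 | cut -d'"' -f4
}

# Poll until SonarQube reports UP, or give up after 60 attempts.
wait_for_sonarqube() {
  for _ in $(seq 1 60); do
    if [ "$(curl -sf http://localhost:9000/api/system/status | sonar_status)" = "UP" ]; then
      echo "SonarQube is up"
      return 0
    fi
    sleep 5
  done
  echo "Timed out waiting for SonarQube" >&2
  return 1
}
```

Run `wait_for_sonarqube` right after `docker-compose up -d` to block until the server is ready.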
Step 4: Running Your First Analysis
Now that the server is running, you need a scanner to send your code to the dashboard. You can use the SonarScanner CLI or integrate it into your pipeline. The UI will guide you through creating a project and generating a token for authentication.
If you want to automate this in a real-world scenario, you should look into setting up a github actions code quality workflow to trigger these scans on every push.
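As a sketch of what a CLI scan looks like, the scanner can be run from its official `sonarsource/sonar-scanner-cli` Docker image. The token and project key are placeholders, and `host.docker.internal` is a Docker Desktop convenience (on Linux, substitute `--network host` and `localhost`):

```shell
# Run the SonarScanner CLI against the current directory.
# $1 = authentication token generated in the UI, $2 = project key.
run_sonar_scan() {
  docker run --rm \
    -e SONAR_HOST_URL="http://host.docker.internal:9000" \
    -e SONAR_TOKEN="$1" \
    -v "$(pwd):/usr/src" \
    sonarsource/sonar-scanner-cli \
    -Dsonar.projectKey="$2"
}
```

Usage: `run_sonar_scan YOUR_GENERATED_TOKEN my-project`.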
Pro Tips for SonarQube Performance
- Increase Virtual Memory: On Linux, SonarQube often fails to start because `vm.max_map_count` is too low. Run `sudo sysctl -w vm.max_map_count=262144` to fix this.
- Use Named Volumes: For production-like setups, use named Docker volumes instead of bind mounts (`./data`) to improve I/O performance.
- Resource Limits: In your compose file, I recommend adding `deploy: resources: limits: memory: 2G` to prevent SonarQube from consuming all your system RAM.
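The resource-limit tip can be expressed as a fragment merged into the `sonarqube` service definition (the 2G ceiling is a suggestion to tune for your machine, not a SonarQube requirement):

```yaml
# Merge into the existing sonarqube service in docker-compose.yml.
# deploy.resources.limits is honored by 'docker compose up' in
# recent Compose versions.
services:
  sonarqube:
    deploy:
      resources:
        limits:
          memory: 2G
```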
Troubleshooting Common Issues
| Issue | Cause | Solution |
|---|---|---|
| Container exits immediately | Insufficient RAM or `vm.max_map_count` too low | Increase Docker RAM to 4GB and check the sysctl setting. |
| Connection Refused (Port 9000) | Database not ready yet | Wait a minute or check `docker-compose logs db`. |
| Permission Denied on volumes | Linux file ownership mismatch | Run `chmod -R 777 sonarqube_data` (local only!). |
What’s Next?
Now that you have your local instance running, the next step is to define your Quality Gates. A Quality Gate is a set of boolean conditions that a project must meet before it can be merged. For example, you can mandate that “New Code must have 0 Critical Bugs” and “Coverage must be above 80%”.
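To see how a gate verdict surfaces programmatically, here is a hedged sketch against the `/api/qualitygates/project_status` web API endpoint; the token and project key passed in are placeholders:

```shell
# Fetch the quality gate result for a project.
# $1 = user token (sent as HTTP basic auth user), $2 = project key.
fetch_gate() {
  curl -sf -u "$1:" \
    "http://localhost:9000/api/qualitygates/project_status?projectKey=$2"
}

# Succeeds only if the JSON on stdin reports an overall status of OK.
gate_passed() {
  grep -o '"projectStatus":{"status":"[A-Z]*"' | grep -q '"OK"'
}
```

Usage: `fetch_gate YOUR_TOKEN my-project | gate_passed && echo "gate passed"`.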
If you’re feeling overwhelmed by the number of rules, start by disabling the ones that don’t fit your team’s style and gradually tighten the screws. Happy scanning!