In this post, I will show you how to properly debug a real-world project. Debugging is not just adding print or log statements and relying on their output; the underlying cause of a bug is often not visible until you actually step through the project with a debugger.

We will set up a debugger for the project by testdriven.io that processes asynchronous tasks with FastAPI and Celery. You can clone the repository below:

GitHub - testdrivenio/fastapi-celery: Example of how to handle background processes with FastAPI, Celery, and Docker
https://github.com/testdrivenio/fastapi-celery

Identifying Dependencies

Every project has dependencies such as databases, queues, message brokers, or other external services that need to be considered when setting up debugging, because the core of our application has to interact with these services for the whole flow to work.

The project at hand needs to interact with Celery and Redis, so we can use Docker containers to bring the required services up in seconds and have them ready to use.

That’s the beauty of Docker: it lets you run multiple services isolated from your environment, and it fits local development setups perfectly.

There is already a docker-compose.yml file inside the project which manages the containers and makes it easy to set up the local environment. The following services are defined in the compose file:

  • web - the FastAPI container that runs the core application logic
  • worker - Celery workers
  • redis - Redis
  • dashboard - Flower dashboard to monitor and manage Celery jobs and workers.

Preparing Dependencies

First, when we open the project we can see a folder named project that holds the FastAPI application (the core logic) and a docker-compose.yml file to run the containers.

Since we want to debug the internal logic of our core FastAPI application, we will exclude the web service and run it through the debugger instead. FastAPI will be launched from the VSCode debugger while all external dependencies keep running inside Docker, and we will map the ports of the dependency services to our local machine so we can reach them from outside the Docker network.

We also want to debug the Celery worker, which means the worker container likewise has to be replaced by a process launched from the debugger.

Let’s create a separate docker compose file for debugging purposes.

docker-compose-dev.yml

version: '3.8'

services:

  redis:
    image: redis:7
    ports:
      - 6379:6379

  dashboard:
    build: ./project
    command: celery --broker=redis://redis:6379/0 flower --port=5555
    ports:
      - 5556:5555
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/0
    depends_on:
      - redis

In this compose file we only have Redis, with its port mapped to our local machine, and the Flower dashboard for task monitoring.

Now you can start the required services with the following command:

docker-compose -f docker-compose-dev.yml up -d
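
You can verify that the containers are up; the Flower dashboard should then be reachable at http://localhost:5556:

docker-compose -f docker-compose-dev.yml ps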

Setting up VSCode Debugger

Now that we have the containers running, it’s time to set up the debugger for the project.

Navigate to the debugging view of VSCode (located in the left sidebar) and you will see a create a launch.json file option, which automatically generates a starter configuration based on the project.

Once launch.json is created, replace the entire configuration with the one below:

{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python Debugger: FastAPI",
            "type": "debugpy",
            "request": "launch",
            "module": "uvicorn",
            "args": [
                "main:app",
                "--reload"
            ],
            "env": {
                "PYTHONPATH": "${workspaceRoot}/project",
                "CELERY_BROKER_URL": "redis://localhost:6379/0",
                "CELERY_RESULT_BACKEND": "redis://localhost:6379/0"
            },
            "jinja": true,
            "justMyCode": false
        },
        {
            "name": "Celery Worker",
            "type": "python",
            "request": "launch",
            "module": "celery",
            "console": "integratedTerminal",
            "justMyCode": true,
            "env": {
                "PYTHONPATH": "${workspaceRoot}/project",
                "CELERY_BROKER_URL": "redis://localhost:6379/0",
                "CELERY_RESULT_BACKEND": "redis://localhost:6379/0"
            },
            "args": [
                "-A",
                "worker.celery",
                "worker",
                "--loglevel=info",
                "--logfile=project/logs/celery.log"
            ]
        }
    ]
}

There are a few key points to consider when creating the debug configurations:

  • Define the PYTHONPATH environment variable to point at the location of the application’s Python modules. The folder named project is the parent module, so we set it as the Python path.
  • The application internally uses environment variables to connect to its dependencies, so we need to set all the required environment variables for the app to work (see the sketch after this list). We mapped Redis to port 6379 on our local machine, which is why we do not use the container name directly: the debugger launches from our local machine and knows nothing about the internal Docker network.
  • The args hold the arguments that are defined in the docker-compose file and passed to the main command that runs each service. Both FastAPI and the Celery worker use exactly the same arguments as in the compose file, just expressed the way the debugger configuration expects.
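
For context, the application reads these variables roughly like this (a sketch of project/worker.py; check the file in the repository for the exact code):

import os

from celery import Celery

# The broker and result backend URLs come from the environment, which is why
# the debugger configuration must point them at localhost instead of the
# "redis" hostname that only exists inside the Docker network.
celery = Celery(__name__)
celery.conf.broker_url = os.environ.get("CELERY_BROKER_URL", "redis://localhost:6379")
celery.conf.result_backend = os.environ.get("CELERY_RESULT_BACKEND", "redis://localhost:6379")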

Starting Debugger and Refactoring Codebase

The last steps are to create a virtual environment with all the packages from requirements.txt installed and to make a slight tweak inside the app so it runs properly.
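
Something along these lines works (assuming requirements.txt lives under project/; adjust the path if your layout differs):

python -m venv venv
source venv/bin/activate
pip install -r project/requirements.txt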

Now, if you start the FastAPI debugger, it will throw an error:

RuntimeError: Directory 'static' does not exist

That means Python can’t find the static directory even though we defined PYTHONPATH in the debugger configuration. So, let’s take a look at the code:

project/main.py

app = FastAPI()
app.mount("/static", StaticFiles(directory="static"), name="static")
templates = Jinja2Templates(directory="templates")

It seems the StaticFiles class cannot initialise properly. However, it works fine when running the app from Docker (the web service) because of the volume mapping (take a look at the docker-compose file): the directory ./project on the host machine is mounted into /usr/src/app within the container, so just passing a directory name is enough to find it inside the container.
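
For reference, the relevant part of the project’s docker-compose.yml looks roughly like this (the exact command, ports and other details may differ in your copy):

  web:
    build: ./project
    # Runs uvicorn inside the container; "static" resolves because the working
    # directory is /usr/src/app, where ./project is mounted.
    command: uvicorn main:app --host 0.0.0.0 --reload
    volumes:
      - ./project:/usr/src/app
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/0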

For debugging purposes, we will slightly refactor the code and provide the correct path to the static directory on our local machine:

import os

app = FastAPI()
# Build an absolute path from PYTHONPATH (set in launch.json) so the static directory is found regardless of the working directory.
app.mount("/static", StaticFiles(directory=f"{os.environ.get('PYTHONPATH')}/static"), name="static")
templates = Jinja2Templates(directory="templates")

Concatenating PYTHONPATH and /static does the trick and produces an absolute path to the required folder. Start the FastAPI debugger again and it will work smoothly. Then we can also run the Celery Worker configuration to debug the Celery worker.
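
As an alternative that does not rely on PYTHONPATH being set (not part of the original project, just an option), you could resolve the paths relative to main.py itself, which works both locally and inside the container:

from pathlib import Path

from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates

# Resolve assets relative to this file, so the app finds them no matter
# which working directory it is launched from.
BASE_DIR = Path(__file__).resolve().parent

app = FastAPI()
app.mount("/static", StaticFiles(directory=BASE_DIR / "static"), name="static")
templates = Jinja2Templates(directory=BASE_DIR / "templates")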

That’s it: our application is ready to debug, just set breakpoints in the places you need.
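
For example, with both debuggers running you can set a breakpoint inside a task in project/worker.py and enqueue it from a Python shell (the task name create_task below is an assumption; check worker.py for the actual name):

# Run from the project/ directory with Redis up and the worker debugger attached.
from worker import create_task  # task name assumed; see worker.py

# Enqueue one task; the worker should stop at any breakpoint set inside the
# task body, and the result also shows up in Flower at http://localhost:5556.
result = create_task.delay(1)
print(result.id)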

Wrap up

We created a debugging environment for a random project from the internet. Real-world applications are similar: you need to identify the dependencies, set up the debugger configuration with the proper environment variables, launch types, commands and arguments, and make slight refactorings in the core logic if necessary.

The main goal is to make sure the part or service you are debugging communicates properly with its dependencies.

This post is cross-published on OnePublish.