- Explain why global package installations cause dependency conflicts and create a virtual environment using venv
- Use pip and requirements.txt to install, freeze, and reproduce a project's dependencies
- Write a pyproject.toml file that defines a distributable Python package with its metadata and dependencies
- Compare modern dependency management tools (pip-tools, Poetry, uv) and select the appropriate tool for a project
- Describe how the src layout and editable installs work together to produce a clean development workflow
Every Python project works perfectly on the machine where it was written. The problems start when someone else tries to run it. They have a different version of a library, or the library your code needs conflicts with a library their other project needs, or they install your package and it overwrites a dependency that breaks something else entirely. Dependency management is not glamorous, but it is the difference between software that works and software that works only on your laptop.
The Dependency Problem
Imagine you have two projects. Project A needs requests version 2.28. Project B needs requests version 2.31. If both install into the same global Python environment, one of them will break — there is only one requests package installed at a time. This is the dependency conflict problem, and it gets worse with every project you maintain.
The global site-packages directory is a shared, mutable resource. Every pip install modifies it. Every project that installs there is implicitly coupled to every other project. Change one dependency and you might break five unrelated programs.
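You can see the shared directory in question from any interpreter via the standard library's site module — a quick sketch:

```python
import site
import sys

# sys.prefix is the root of the current Python installation;
# site.getsitepackages() lists the site-packages directories that
# every `pip install` in this interpreter writes into.
print(sys.prefix)
print(site.getsitepackages())
```

Every project that installs into those directories shares them with every other project on the machine.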
The solution is isolation. Each project gets its own copy of the packages it needs, completely independent of every other project. Python's tool for this is the virtual environment.
Creating Virtual Environments with venv
A virtual environment is a self-contained directory that holds a Python interpreter and its own site-packages. The venv module, built into Python since version 3.3, creates one:
python3 -m venv .venv
This creates a .venv directory containing a copy of the Python interpreter and an empty site-packages. To use it, you activate the environment:
# macOS / Linux
source .venv/bin/activate
# Windows (cmd); in PowerShell use .venv\Scripts\Activate.ps1
.venv\Scripts\activate
Your shell prompt changes to show the active environment. Now pip install puts packages into the environment's own site-packages (under .venv/lib/ on macOS and Linux, .venv\Lib\ on Windows), not the global one, and python runs the interpreter from .venv/bin/. Your project is isolated.
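Code can also detect whether it is running inside a virtual environment. When one is active, sys.prefix points at the environment while sys.base_prefix still points at the base installation:

```python
import sys

def in_virtualenv() -> bool:
    # Inside a venv the two prefixes differ; outside one they are equal.
    return sys.prefix != sys.base_prefix

print(in_virtualenv())
```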
When you are done, deactivate returns you to the global environment:
deactivate
The .venv directory should be added to .gitignore — it is local, reproducible, and can be regenerated from your dependency specification. Never commit it to version control.
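The command `python3 -m venv .venv` is just a thin wrapper over the venv module's API, which you can also call programmatically — a minimal sketch (with_pip=False skips bootstrapping pip to keep the example fast; real projects want with_pip=True, the default for the command-line form):

```python
import tempfile
import venv
from pathlib import Path

# Create a virtual environment in a temporary directory --
# the programmatic equivalent of `python3 -m venv .venv`.
target = Path(tempfile.mkdtemp()) / ".venv"
venv.create(target, with_pip=False)

# pyvenv.cfg marks the directory as a virtual environment and
# records which base interpreter created it.
print((target / "pyvenv.cfg").read_text().splitlines()[0])
```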
pip and requirements.txt
With a virtual environment active, pip is your package installer:
pip install requests
pip install flask==3.0.0 # specific version
pip install "numpy>=1.24,<2" # version range
To record exactly what is installed — so someone else can reproduce your environment — use pip freeze:
pip freeze > requirements.txt
This writes every installed package and its exact version to a file:
certifi==2024.2.2
charset-normalizer==3.3.2
flask==3.0.0
requests==2.31.0
urllib3==2.2.1
Anyone can then recreate your environment:
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
The file requirements.txt is a snapshot, not a specification. It includes every transitive dependency (the dependencies of your dependencies), which makes it precise but noisy. You often want two files: a hand-written requirements.in listing your direct dependencies, and a generated requirements.txt with the pinned versions. The tool pip-tools automates this — pip-compile resolves and pins, pip-sync installs exactly what is specified.
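Under the hood, pip freeze just reads the metadata of every installed distribution. A rough sketch of the same idea using only the standard library:

```python
from importlib.metadata import distributions

# Emit pinned name==version lines, like `pip freeze` does.
pinned = sorted(
    f"{dist.metadata['Name']}=={dist.version}"
    for dist in distributions()
    if dist.metadata["Name"] is not None  # skip broken metadata entries
)
for line in pinned[:5]:
    print(line)
```

This is a sketch, not a replacement: the real pip freeze also handles editable installs, direct URLs, and other special cases.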
pyproject.toml: The Modern Standard
For years, Python packaging used setup.py — a Python script that called setuptools.setup() with your project metadata. It worked, but executing arbitrary Python during installation was fragile and insecure. The modern replacement is pyproject.toml, a declarative configuration file:
[project]
name = "my-awesome-lib"
version = "1.0.0"
description = "A library that does something useful"
requires-python = ">=3.10"
dependencies = [
"requests>=2.28",
"click>=8.0",
]
[project.optional-dependencies]
dev = [
"pytest>=8.0",
"coverage>=7.0",
]
[build-system]
requires = ["setuptools>=68.0"]
build-backend = "setuptools.build_meta"
The [project] table defines your package: its name, version, description, and dependencies. The [build-system] table tells Python which tool builds the package. This separation of concerns — what the project is versus how to build it — is the key insight of pyproject.toml.
Optional dependencies let you group development tools separately. Install them with pip install -e ".[dev]" — more on that -e flag shortly.
The src Layout and Editable Installs
The recommended project structure uses a src/ directory:
my-project/
├── src/
│ └── mypackage/
│ ├── __init__.py
│ └── core.py
├── tests/
│ └── test_core.py
├── pyproject.toml
└── README.md
The src/ directory prevents a subtle bug: without it, Python can import your package directly from the project root, even if it is not installed. This means your tests might pass locally but fail when a user installs the package, because the installed version has different import paths.
During development, you install your package in editable mode:
pip install -e .
This creates a link from site-packages to your source directory. Changes to your code take effect immediately without reinstalling. The -e flag stands for "editable" — it is the development workflow equivalent of a symlink.
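After an editable install, imports of your package resolve to the files in your src/ tree rather than to copies in site-packages. You can check where any importable name actually resolves from (the stdlib json package stands in here for a real project package):

```python
import importlib.util

# find_spec locates a module without importing it; spec.origin is the
# path of the file the import system would load.
spec = importlib.util.find_spec("json")
print(spec.origin)
```

For an editable install of mypackage, spec.origin would point into your project's src/mypackage/ directory.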
Building and Publishing a Package
When you are ready to distribute your code, build it into a distributable format:
pip install build
python -m build
This creates two files in a dist/ directory: a source distribution (.tar.gz) and a wheel (.whl). The wheel is the preferred format — it is pre-built and installs faster.
Publishing to PyPI (the Python Package Index) uses twine:
pip install twine
twine upload dist/*
You will need a PyPI account and an API token. After uploading, anyone in the world can install your package with pip install my-awesome-lib. For testing, use TestPyPI first — it is a separate instance where you can experiment without polluting the real index.
Not every project needs to be on PyPI. Many packages are internal tools, shared within a team via a private index or a Git URL. But understanding the packaging pipeline means you can distribute your code properly when you need to.
The Dependency Management Landscape
pip and requirements.txt work, but the ecosystem has evolved. Several tools offer more sophisticated dependency management:
pip-tools adds a compilation step: you write requirements.in with your direct dependencies, and pip-compile generates a fully pinned requirements.txt. This separates intent from implementation.
Poetry provides an all-in-one solution: dependency resolution, virtual environment management, building, and publishing. It uses pyproject.toml with its own [tool.poetry] section and a poetry.lock file for reproducible installs.
uv is the newest contender — a Rust-based tool that is dramatically faster than pip. It can create virtual environments, install packages, and resolve dependencies, all in a fraction of the time. It aims to be a drop-in replacement for pip, pip-tools, and virtualenv.
Which should you use? For learning and small projects, venv + pip + requirements.txt is perfectly adequate. For team projects, pip-tools or Poetry adds valuable reproducibility. For speed-sensitive workflows or large dependency trees, uv is increasingly hard to ignore.
Docker for Reproducibility
Virtual environments isolate Python packages, but they do not isolate the Python interpreter itself, the operating system, or system-level libraries. If your project depends on a specific OS package or a particular Python build, you need a deeper level of isolation.
Docker solves this by packaging your entire application — Python, packages, system libraries, and configuration — into a container. A minimal Dockerfile for a Python project might look like this:
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY src/ src/
CMD ["python", "-m", "mypackage"]
Docker is beyond the scope of this chapter, but know that it exists at the top of the isolation hierarchy: venv isolates packages, Docker isolates everything.
Dependency management is the unglamorous foundation beneath every successful Python project. It is not exciting to create a virtual environment, pin your dependencies, and write a proper pyproject.toml. But it is exciting to hand your project to a colleague and have it work on the first try. It is exciting to come back to a project after six months and reproduce the environment in thirty seconds. The tools are simple. The discipline is what matters.