Virtual Environments & Dependency Management Best Practices
Master professional Python dependency management, virtual environments, and reproducible installs.
What You'll Learn
This lesson teaches how to build professional Python applications with:
- Virtual environments for project isolation
- Clean dependency management
- Reproducible installs across machines
- Modern tooling (pip-tools, Poetry, UV)
- Security and supply-chain safety
- CI/CD integration
This separates "it works on my laptop" scripts from real, deployable software.
Python Download & Setup
Download Python from: python.org/downloads
A recent version is recommended (3.11 or newer).
Part 1: Virtual Environment Fundamentals
1. Why You Should Never Rely on "System Python"
On most machines you'll have two Pythons: the system Python (used by OS tools) and your project Python (the one you control).
If you install packages globally with pip install requests, you risk:
| Problem | What Happens | Virtual Env Solution |
|---|---|---|
| Version conflict | Project A needs Django 3.2, Project B needs Django 5.0 | Each project has its own Django version |
| OS breaks | You update a package the OS depends on | System Python stays untouched |
| Dependency chaos | Can't remember which packages belong to which project | Each project has its own requirements.txt |
Rule #1: Each serious project gets its own isolated environment.
2. What a Virtual Environment Actually Is
A virtual environment is:
- A folder containing its own Python interpreter
- With its own site-packages directory
- Isolated from global packages
Two projects can now safely require:
- Project A → Django==3.2
- Project B → Django==5.0
No conflict, because they use different environments.
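A quick way to see this isolation in action: the sketch below creates two empty environments in a throwaway directory (no packages installed) and shows that each one carries its own interpreter and its own private site-packages location.

```shell
# Create two side-by-side environments in a temporary directory.
workdir="$(mktemp -d)"
python3 -m venv "$workdir/project_a/.venv"
python3 -m venv "$workdir/project_b/.venv"

# Each environment's interpreter reports its own, separate prefix,
# which is where its private site-packages directory lives.
"$workdir/project_a/.venv/bin/python" -c 'import sys; print(sys.prefix)'
"$workdir/project_b/.venv/bin/python" -c 'import sys; print(sys.prefix)'
```

Installing Django 3.2 into the first and Django 5.0 into the second could never collide, because each install lands in its own site-packages.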
3. Creating a Virtual Environment (venv)
Standard library way (no extra installs):
```bash
# Use the Python version you want (3.10 shown here)
python3.10 -m venv .venv   # creates a .venv/ folder in your project
```

Activating it:
macOS / Linux / WSL:
```bash
source .venv/bin/activate
```

Windows (PowerShell):
```powershell
.venv\Scripts\Activate.ps1
```

Your prompt usually changes to:

```text
(.venv) $
```

From now on:
- python = the one inside .venv
- pip = the one inside .venv
Deactivating:
```bash
deactivate
```

4. Best Practices for Environment Layout
Recommended pattern per project:
```text
my_project/
├── .venv/                 # virtual environment (not committed)
├── src/ or app/           # your code
├── tests/
├── requirements.txt       # pinned dependencies
├── requirements-dev.txt
├── pyproject.toml         # optional, for modern tooling
└── README.md
```

Why .venv?
- Consistent across all your projects
- Editors like VS Code automatically detect it
- Easy to add to .gitignore
.gitignore should include:
```text
.venv/
__pycache__/
*.pyc
```

Never commit your virtual environment to Git.
5. Installing Packages the Right Way
Once your environment is activated:
```bash
python -m pip install requests fastapi uvicorn
```

Why python -m pip?
- Guarantees you're using the pip tied to that Python
- Avoids weird "wrong pip" issues
Check installed packages:
```bash
python -m pip list
```

Avoid installing packages with pip outside an activated environment.
6. requirements.txt & Pinning Versions
Think of requirements.txt as a recipe with exact measurements: "1 cup flour" instead of "some flour". Pinned versions ensure everyone gets the same result every time. For reproducibility, you want to freeze exactly which versions you're using.
Create a requirements.txt:
```bash
python -m pip freeze > requirements.txt
```

This file might look like:
```text
fastapi==0.115.0
uvicorn==0.32.0
pydantic==2.9.0
requests==2.32.3
```

Now anyone can recreate your environment:
```bash
python -m venv .venv
source .venv/bin/activate   # or the Windows equivalent
python -m pip install -r requirements.txt
```

This is critical for:
- Deploying to servers
- Running on a teammate's machine
- Long-term maintenance
7. Semantic Versioning & Safer Constraints
Not all dependencies need to be fully pinned, but you should understand version ranges:
| Syntax | Meaning | When to Use |
|---|---|---|
| ==1.4.3 | Exact version only | Production apps (safest) |
| >=1.4,<2.0 | Any 1.x version from 1.4 up | Libraries you publish |
| ~=1.4.3 | Compatible release (>=1.4.3, <1.5) | Balanced approach |
For libraries you publish: allow ranges (e.g. >=1.4,<2.0)
For apps you deploy: pinned versions (==) are safest
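To make the table concrete, here is a sketch of how the three styles look in a requirements file (package names and versions are illustrative):

```text
fastapi==0.115.0       # exact pin: safest for deployed apps
requests>=2.31,<3.0    # range: any 2.x from 2.31 upward
pydantic~=2.9.0        # compatible release: >=2.9.0 and <2.10
```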
8. Separating Runtime vs Dev Dependencies
Don't ship your test tools to production. Use two files:
requirements.txt (runtime):
```text
fastapi==0.115.0
uvicorn==0.32.0
pydantic==2.9.0
```

requirements-dev.txt (development):
```text
-r requirements.txt
pytest==8.3.0
black==24.10.0
ruff==0.7.0
mypy==1.13.0
```

Then install dev tools like:
```bash
python -m pip install -r requirements-dev.txt
```

Your production environment only installs requirements.txt, keeping it:
- Lighter
- More secure
- Faster to deploy
9. Using pip-tools / Poetry / UV (Modern Workflows)
As projects get bigger, plain pip freeze becomes messy. Three common modern approaches:
1) pip-tools
Workflow:
```bash
pip install pip-tools

# requirements.in lists only top-level deps:
#   fastapi
#   uvicorn

# Compile a fully pinned requirements.txt:
pip-compile requirements.in

# Install exactly what the lockfile says, removing anything extra:
pip-sync requirements.txt
```

This ensures your environment matches the file exactly.
2) Poetry
Uses pyproject.toml + poetry.lock. You define high-level deps; Poetry resolves & locks everything.
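As a rough sketch (project name and versions here are made up, and the exact layout depends on your Poetry version), the pyproject.toml side looks like:

```toml
[tool.poetry]
name = "my-project"
version = "0.1.0"
description = "Example app"

[tool.poetry.dependencies]
python = "^3.11"
fastapi = "^0.115"

[tool.poetry.group.dev.dependencies]
pytest = "^8.3"
```

Running poetry lock resolves these ranges into poetry.lock, and poetry install reproduces the locked environment on any machine.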
3) UV / Rye / Hatch
Newer tools that combine:
- Env creation
- Dependency resolution
- Locking
- Scripts
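For a feel of what that looks like, here is a rough sketch of a uv-style workflow (command names assume a recent uv release; check its documentation for your version):

```text
uv init               # create a project with pyproject.toml
uv add fastapi        # add a dependency and update uv.lock
uv sync               # create/refresh .venv from the lockfile
uv run python main.py # run inside the managed environment
```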
10. Global Python Tools: Use pipx, Not Your Project venv
Some Python packages are tools, not libraries, e.g.:
- httpie
- black
- ruff
- pre-commit
These are better installed with pipx globally:
```bash
pip install pipx
pipx install httpie
```

This:
- Keeps them isolated from projects
- Avoids interfering with your app dependencies
- Lets you use tools from your shell, no activation needed
11. Environment Rebuild Strategy
A good habit whenever things feel "weird":
```bash
# 1. Delete the old environment
rm -rf .venv

# 2. Re-create it
python -m venv .venv
source .venv/bin/activate

# 3. Reinstall from requirements.txt
python -m pip install -r requirements.txt
```

If the problem disappears, it was likely:
- Stale dependencies
- Partial upgrades
- Broken wheels
This rebuild pattern is common in real teams.
Part 2: Advanced Dependency Management
12. Solving Dependency Conflicts Like a Pro
You'll eventually hit something like:
```text
ERROR: Cannot install PackageA because PackageB requires version <2.0.0
```

This is a dependency conflict: two packages require incompatible versions of a shared dependency.
How to diagnose the conflict:
Install a dependency tree visualizer:
```bash
pip install pipdeptree
pipdeptree
```

Example output:

```text
fastapi==0.115.0
  - pydantic==2.9.0
old-tool==1.3.0
  - pydantic==1.10.0
```

How to fix it:
- Identify which package is outdated
- Upgrade or downgrade the conflicting package
- Reinstall
- If successful, refresh the lockfile: pip freeze > requirements.txt
13. Lockfiles & Reproducible Installs
With basic pip:
requirements.txt itself acts as the lockfile: pin every version so all machines get identical installs. After any upgrade, re-freeze:

```bash
pip install --upgrade fastapi
pip freeze > requirements.txt
```

With pip-tools (recommended):
Keep human-friendly requirements in requirements.in:
```text
fastapi
uvicorn
```

Compile into a fully pinned lockfile:
```bash
pip-compile requirements.in
```

It produces requirements.txt with all sub-dependencies pinned.
The principle:
Keep one simple file for what you WANT, another for the exact resolved versions you MUST use.
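For illustration, the compiled file also annotates each pin with where it came from, which keeps the lockfile auditable (versions shown are illustrative):

```text
fastapi==0.115.0
    # via -r requirements.in
pydantic==2.9.0
    # via fastapi
uvicorn==0.32.0
    # via -r requirements.in
```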
14. Per-Environment Dependencies (Dev, Test, Production)
Large apps usually have:
- dev environment (testing, debugging tools)
- production environment (runtime packages only)
- testing environment (for CI)
A clean structure looks like:
```text
requirements/
├── base.in
├── dev.in
├── prod.in
├── base.txt
├── dev.txt
└── prod.txt
```

Example:
requirements/base.in:
```text
fastapi
sqlalchemy
pydantic
```

requirements/dev.in:
```text
-r base.in
pytest
mypy
black
```

requirements/prod.in:
```text
-r base.in
gunicorn
prometheus-client
```

Compile:
```bash
pip-compile requirements/base.in
pip-compile requirements/dev.in
pip-compile requirements/prod.in
```

Install:
```bash
pip install -r requirements/dev.txt    # development machine
pip install -r requirements/prod.txt   # production server
```

15. Internal Libraries & Private Packages
Businesses often create reusable internal libraries. These can be installed:
1. As editable local packages:
```bash
pip install -e ./core_engine
```

2. From a private PyPI server:
```bash
pip install core-engine --extra-index-url https://packages.acmetools.com/simple
```

Best practices:
- Use semantic versioning (1.0.0, 1.1.0, etc.)
- Pin specific versions in your main project
- Avoid installing directly from main branches
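Putting those rules together, a consuming project's requirements file might pin the internal package like this (reusing the example index URL from above; the version number is illustrative):

```text
--extra-index-url https://packages.acmetools.com/simple
core-engine==1.2.0
```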
16. Security & Supply-Chain Safety
Dependencies are attack vectors. Follow these rules:
1. Scan dependencies regularly:

```bash
pip install pip-audit
pip-audit
```

2. Stick with reputable packages
Prefer libraries that have:
- Lots of downloads
- Active maintenance
- Frequent updates
- Reliable documentation
3. Pin versions in production
Floating version ranges can pull in a bad update.
4. Audit new dependencies
Ask:
- Do I really need this?
- Is this package trustworthy?
- Could I write this functionality myself?
Small dependency trees = safer, easier to maintain.
17. Using Virtual Environments Inside Docker
Docker isolates processes, but venvs add clarity and consistency.
Example Dockerfile:
```dockerfile
FROM python:3.11-slim

WORKDIR /app

RUN python -m venv /app/.venv
ENV PATH="/app/.venv/bin:$PATH"

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python", "main.py"]
```

Local and Docker both use a .venv, so setup stays consistent.
18. Clean Install Testing in CI Pipelines
In CI tools like GitHub Actions or GitLab:
- Create fresh venv
- Install dependencies
- Run tests
Example:
```yaml
- uses: actions/setup-python@v5
  with:
    python-version: "3.11"

- name: Create venv
  run: python -m venv .venv

- name: Install deps
  run: |
    source .venv/bin/activate
    pip install -r requirements-dev.txt

- name: Run tests
  run: |
    source .venv/bin/activate
    pytest
```

This ensures your project:
- Installs cleanly
- Has no hidden local dependencies
- Is production-ready
19. Workflow & Naming Conventions
Clean conventions prevent chaos:
- Always name the virtual environment .venv
- Use requirements.in + requirements.txt OR Poetry (don't mix them)
- Document setup in README.md
- Never use the global interpreter for real apps
- Recreate the environment after major upgrades
Example recommended README snippet:
```text
1. python -m venv .venv
2. source .venv/bin/activate
3. pip install -r requirements-dev.txt
```

20. The Master Mental Model
Everything you do around dependencies fits into this model:
Python version → Virtual Environment → Dependencies → Lockfile
- Change the Python version → everything can break
- Skip the virtual env → dependencies collide
- Forget to freeze versions → inconsistent behavior
- Skip the lockfile → unpredictable installs
Master this, and your projects become:
- Stable
- Reproducible
- Portable
- Easy to debug
- Easy to deploy
Part 3: Professional-Grade Patterns & Production
21. Example: Clean Setup Workflow for a New Project
A solid, repeatable pattern for any new Python project:
```bash
# 1. Create the project folder
mkdir awesome-project
cd awesome-project

# 2. Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate      # macOS / Linux
# .venv\Scripts\Activate.ps1   # Windows (PowerShell)

# 3. Install core dependencies
pip install fastapi uvicorn pydantic

# 4. Freeze versions
pip freeze > requirements.txt

# 5. Install dev-only tools
pip install pytest black mypy
pip freeze > requirements-dev.txt
```

Example structure:
```text
awesome-project/
├── .venv/
├── app/
│   ├── __init__.py
│   └── main.py
├── tests/
├── requirements.txt
├── requirements-dev.txt
└── README.md
```

Anyone who clones this can run:
```bash
python -m venv .venv
source .venv/bin/activate
pip install -r requirements-dev.txt
```

and be ready to work.
22. Example: Multi-Service Environment Structure
For a slightly bigger setup (e.g. API + worker + shared library):
```text
project-root/
├── services/
│   ├── api/
│   │   ├── app/
│   │   └── requirements.txt
│   └── worker/
│       ├── worker_app/
│       └── requirements.txt
├── libs/
│   └── shared_lib/
│       └── setup.py or pyproject.toml
└── README.md
```

Each service can have its own virtual environment:
```bash
cd services/api
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

The shared library can then be installed into each service's environment in editable mode (for example, pip install -e ../../libs/shared_lib from services/api).
This keeps boundaries clear, especially when different services need different versions of frameworks.
24. Handling OS-Specific and Optional Dependencies
Some packages are only needed on certain platforms or for extra features.
Optional dependencies pattern:
```text
# Base requirements:
fastapi
uvicorn

# Dev extras:
pytest
black

# Optional "redis" extra:
redis
```

Keep separate requirements-*.txt files:
```text
requirements.txt         # core
requirements-dev.txt     # dev tools
requirements-redis.txt   # extra feature
```

Install only what's needed:
```bash
pip install -r requirements.txt -r requirements-redis.txt
```

25. Troubleshooting Common Virtualenv & Dependency Issues
Problem 1: "It works on my machine but not on theirs"
Likely causes:
- No pinned versions (requirements.txt missing or incomplete)
- Different Python version
- Some packages installed globally by accident
Fix:
- Ensure everyone uses the same Python major/minor (e.g. 3.11)
- Use a venv and requirements*.txt files
- Regenerate lockfiles after big upgrades
Problem 2: "ModuleNotFoundError even though I installed it"
Usually:
- Wrong interpreter selected in IDE
- venv not activated in the terminal
- Installed into global Python instead of the venv
Checklist:
Check which interpreter is actually running:

```python
import sys
print("Current Python:", sys.executable)
```

Then confirm what the shell resolves:

```bash
which python   # 'where python' on Windows
which pip      # 'where pip' on Windows
```

In VS Code, select the interpreter pointing at .venv/bin/python.

Problem 3: "pip install works locally but fails in CI"
Possible causes:
- OS differences (Linux vs Windows)
- Missing system libraries (for packages with C extensions)
- Extra private indexes not configured in CI
Fix:
- Use the same Python version as CI locally (e.g. via pyenv)
- Install system packages in CI (e.g. libpq-dev, build-essential)
- Make sure CI has the same PIP_INDEX_URL / extra-index-url
26. Structuring Requirements for Long-Term Maintainability
A common and very maintainable pattern:
```text
requirements/
├── base.in
├── dev.in
├── prod.in
├── base.txt
├── dev.txt
└── prod.txt
```

This gives:
- A clear separation of intent (*.in) vs exact resolved versions (*.txt)
- Easy upgrades by editing .in and re-compiling
- Stable installs on all machines/environments
27. Mental Models to Keep Your Environments Sane
A few simple rules keep everything under control:
- One project โ one virtual environment
Don't reuse venvs across unrelated projects.
- Never install app dependencies globally
Global Python is for tooling at most, not for full stacks.
- Pin versions for anything serious
Experiments can be loose, but production/teaching projects should be pinned.
- Document setup steps once, reuse everywhere
A short "Setup" section in your README saves hours of debugging.
- Treat requirements.txt or poetry.lock as part of your code
They're a versioned record of your environment: commit them and keep them updated.
28. Quick Checklist for a "Professional" Environment Setup
If you can answer yes to these, your dependency game is solid:
- Does each project have its own .venv?
- Is .venv listed in .gitignore?
- Are versions pinned for anything you deploy?
- Are runtime and dev dependencies kept in separate files?
- Are the setup steps documented in README.md?
- Does CI install everything from a clean environment?
If any are "no", that's the next thing to improve.
Final Summary
You've now mastered professional Python dependency management and virtual environments.
You can now:
- Create isolated, reproducible Python environments
- Manage dependencies with requirements.txt and modern tools
- Structure multi-environment projects (dev/test/prod)
- Debug common dependency issues
- Integrate virtual environments with Docker and CI/CD
- Follow security best practices for supply-chain safety
These skills are essential for building production-ready Python applications used by professional teams worldwide.
Quick Reference: Virtual Environments
| Command | What it does |
|---|---|
| python -m venv .venv | Create a virtual environment |
| source .venv/bin/activate | Activate (Linux/macOS) |
| .venv\Scripts\Activate.ps1 | Activate (Windows PowerShell) |
| pip freeze > requirements.txt | Save installed packages |
| pip install -r requirements.txt | Install from requirements file |
Great work! You've completed this lesson.
You can now create isolated environments, manage dependencies cleanly, and use modern tools like pip-tools and Poetry.
Up next: Packaging, where you'll publish your own Python libraries to PyPI for others to use.