New problem. Diane's warehouse tool depends on three third-party packages — openpyxl for Excel export, python-dateutil for delivery dates, tabulate for pretty-printing reports. You pip install them globally and everything works. Two weeks later, a colleague needs an older version of openpyxl for a different project. He upgrades it globally. Your warehouse tool breaks. What do you do?
Check which version I was using, try to pin it somehow... honestly I'd just reinstall things until it worked again. Which is a terrible answer.
It's the answer most people give before learning virtual environments. And it means you've been one pip install --upgrade away from a broken tool this entire time.
Okay. That's uncomfortable. What's the fix?
Think about how the warehouse handles Client A and Client B. Client A runs a strict inventory policy — every item logged in triplicate, restocking thresholds fixed, no exceptions. Client B is looser — spot checks, flexible thresholds. Do you run both clients out of the same warehouse with the same rules?
Never. You'd give each client their own warehouse. Different shelves, different procedures, completely isolated.
That's a virtual environment. One isolated Python installation per project. Client A's rules — its specific package versions — live in one warehouse. Client B's rules live in another. They never interfere:
# Create the isolated environment
python -m venv warehouse-env
# Activate it — rewrites PATH so 'python' and 'pip' point to the isolated copies
source warehouse-env/bin/activate # macOS/Linux
# warehouse-env\Scripts\activate # Windows
# (your prompt shows: (warehouse-env) $ )
pip install openpyxl==3.1.2
pip install python-dateutil
pip install tabulate
# When done, exit the environment
deactivate
A virtual environment is just a folder with its own site-packages directory? Not a full Python installation copy?
Almost. It points to a shared Python interpreter but has its own site-packages — its own shelf space. Creating one takes three seconds and uses almost no disk space. The packages you install inside it exist only there. Your colleague's upgrade changes his site-packages, not yours.
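You can see this split for yourself with a quick check (illustrative only, not part of Diane's tool): inside an activated venv, sys.prefix points at the venv folder, while sys.base_prefix points at the shared interpreter the venv was created from. Outside a venv, the two are equal:

```python
import sys

# Inside an activated venv, sys.prefix is the venv directory;
# sys.base_prefix is the base interpreter it was built from.
# When they differ, you're running inside a venv.
in_venv = sys.prefix != sys.base_prefix
print("running inside a venv:", in_venv)
```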
But when I give this project to someone else, they don't have my warehouse-env. And I shouldn't commit it to Git either, right? It's hundreds of megabytes of binaries.
Never commit the venv to Git. Add it to .gitignore. Instead you give them a requirements.txt:
pip freeze > requirements.txt
That captures every installed package with its exact version:
openpyxl==3.1.2
python-dateutil==2.9.0
tabulate==0.9.0
et-xmlfile==1.1.0
six==1.16.0
And they do pip install -r requirements.txt in their own venv and get the exact same environment?
Exactly. The -r flag means "read from file." The requirements.txt is the packing slip — it says which items, which specifications. The receiving warehouse follows the slip and stocks accordingly.
I've copy-pasted pip install -r requirements.txt from READMEs for two years without knowing what -r meant.
Welcome to understanding the commands you've been typing. Now — requirements files have version operators beyond just ==:
openpyxl==3.1.2 # exact version — this specific release only
requests>=2.28.0 # minimum — anything at or above this
tabulate~=0.9.0 # compatible release — 0.9.x only, no minor-version jumps
python-dateutil!=2.8.1 # exclusion — anything except this buggy release
What does ~=0.9.0 actually mean? "Compatible release" sounds vague.
~=0.9.0 means "at least 0.9.0, but less than 0.10.0" — patches are allowed, breaking changes are not. It's shorthand for "I want bug fixes but not new APIs that might break my code." pip freeze always writes == because it captures exactly what you have. The other operators appear when humans write requirements files by hand.
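The ~=0.9.0 rule can be sketched with plain version tuples — a toy illustration of the semantics, not how pip actually implements it (pip uses a full specifier parser):

```python
def compatible_with_0_9_0(version: str) -> bool:
    """Toy check for '~=0.9.0': at least 0.9.0, strictly below 0.10.0."""
    parts = tuple(int(p) for p in version.split("."))
    return (0, 9, 0) <= parts < (0, 10)

print(compatible_with_0_9_0("0.9.5"))   # True  — patch release, allowed
print(compatible_with_0_9_0("0.10.0"))  # False — minor bump, excluded
```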
So pip freeze is for locking — saying exactly what works today. The other operators are for specifying what range is acceptable. Different use cases.
That's a senior developer distinction. Locked == for applications — deploy exactly this. Flexible ranges for libraries that others will install alongside other packages.
This is the thing I've been missing. I've been treating pip as a global package manager — install things and hope for the best. Venvs make it systematic and reproducible.
Reproducible is the word. Now — parsing requirements files is a clean string processing problem. Each valid line is a package name optionally followed by a version operator and version string. The operators are ==, >=, <=, !=, ~=, >, <. Try sketching the parsing logic.
I'd split on the operator. But seven possible operators — some two characters, some one. I can't just .split("=") — that would break >=. I'd need to check for each operator, longest first.
Why longest first? What goes wrong if you check = before >=?
If I check = first on >=2.28.0, it finds the = inside >= and splits there — I lose the >. Checking two-character operators first means >= matches before anything tries to match just =.
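That failure mode is easy to reproduce in a couple of lines (illustrative only):

```python
line = "requests>=2.28.0"

# Splitting on the one-character "=" first breaks the operator apart:
name, version = line.split("=", 1)
print(name, version)   # requests> 2.28.0 — the '>' leaks into the name

# Checking the two-character operator first keeps the split clean:
name, version = line.split(">=", 1)
print(name, version)   # requests 2.28.0
```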
That ordering is the whole trick. Check ~=, !=, ==, >=, <= before > and <. The rest is filtering comments and blank lines:
OPERATORS = ["~=", "!=", "==", ">=", "<=", ">", "<"]

def parse_requirements(requirements_text: str) -> list[dict]:
    result = []
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        for op in OPERATORS:
            if op in line:
                name, version = line.split(op, 1)
                result.append({"name": name.strip(), "operator": op, "version": version.strip()})
                break
        else:
            result.append({"name": line, "operator": None, "version": None})
    return result
The for/else again — the else on the for loop runs if no break fired. So if no operator matched, the bare package name goes in with None values. Elegant.
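A quick exercise of the parser on a small sample (repeated here in full so the snippet runs on its own; the package names are just examples):

```python
OPERATORS = ["~=", "!=", "==", ">=", "<=", ">", "<"]

def parse_requirements(requirements_text: str) -> list[dict]:
    result = []
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        for op in OPERATORS:
            if op in line:
                name, version = line.split(op, 1)
                result.append({"name": name.strip(), "operator": op, "version": version.strip()})
                break
        else:  # no operator matched: bare package name
            result.append({"name": line, "operator": None, "version": None})
    return result

sample = """\
# core dependencies
openpyxl==3.1.2
requests>=2.28.0
tabulate~=0.9.0

colorama
"""
for entry in parse_requirements(sample):
    print(entry)
# {'name': 'openpyxl', 'operator': '==', 'version': '3.1.2'}
# {'name': 'requests', 'operator': '>=', 'version': '2.28.0'}
# {'name': 'tabulate', 'operator': '~=', 'version': '0.9.0'}
# {'name': 'colorama', 'operator': None, 'version': None}
```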
Same for/else from Day 18 — "I searched the whole thing and didn't find it." This time it's "I tried all seven operators and none matched — bare name."
I remember feeling confused by for/else when I first saw it. Now I'm reaching for it naturally. Week 3 payoff.
I've been waiting for you to say that. One last thing before tomorrow. Venvs solve local isolation. They don't fully solve production deployment — that needs Docker, or pip-tools' lockfiles, or a package manager like poetry. For the inventory tool we're building, requirements.txt is exactly right. But know the ceiling exists.
So venvs are right for development and small projects. Larger production systems need something on top?
Right. And knowing the difference is what separates someone who ships code from someone who just writes it. Tomorrow is Day 28 — the capstone. Every piece from the last four weeks.
Every piece from four weeks. Tuples, sets, unpacking, function signatures, match statements, modules, packages, imports. I've been waiting for this since Day 1.
A virtual environment is an isolated Python environment with its own site-packages directory. Packages installed inside one virtual environment are invisible to other environments and to the global Python installation.
# Create a virtual environment named .venv (conventional name)
python -m venv .venv
# Activate (macOS/Linux)
source .venv/bin/activate
# Activate (Windows)
.venv\Scripts\activate
# Install packages (only into this environment)
pip install openpyxl==3.1.2 requests>=2.28.0
# Capture exact installed versions to a file
pip freeze > requirements.txt
# Restore the same environment elsewhere
pip install -r requirements.txt
# Exit the environment
deactivate
requirements.txt version operators:
| Operator | Meaning | Example |
|---|---|---|
| == | Exact version | flask==2.3.0 |
| >= | Minimum version | requests>=2.28.0 |
| <= | Maximum version | pytest<=7.4.0 |
| != | Exclude version | numpy!=1.24.0 |
| ~= | Compatible release | tabulate~=0.9.0 (≥0.9.0, <0.10.0) |
| > | Strictly greater | setuptools>65.0 |
| < | Strictly less | urllib3<2.0 |
pip freeze always writes == — it captures what you have right now. Other operators appear in hand-written requirements files for libraries.
Parsing strategy: Try operators longest-first (~=, !=, ==, >=, <= before >, <) to avoid matching a single = inside >=. Split on the matched operator, strip whitespace. If no operator matches, the line is a bare package name.
Pitfall 1: Committing the virtual environment to Git. The .venv/ folder is hundreds of megabytes of binaries — not source code. Add it to .gitignore. Commit requirements.txt instead.
Pitfall 2: Installing globally instead of in the venv. If you pip install before activating the venv, packages go to the global Python. Check your prompt — active venvs show the environment name.
Pitfall 3: pip freeze capturing development tools. pip freeze lists everything in your environment, including pytest, linters, and Jupyter. Production deployments don't need these. Separate requirements.txt (runtime) from requirements-dev.txt (development tools).
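One common layout (a convention, not a pip requirement — the pinned versions below are placeholders): keep the runtime file lean, and have the dev file pull it in with pip's -r include syntax:

```text
# requirements.txt — runtime dependencies only
openpyxl==3.1.2
python-dateutil==2.9.0
tabulate==0.9.0

# requirements-dev.txt — dev tools, plus everything from the runtime file
-r requirements.txt
pytest==8.2.0
```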
pip-tools: Generates a locked requirements.txt from a hand-written requirements.in with flexible specifiers. Resolves the full dependency tree and pins every transitive dependency — more reproducible than pip freeze.
poetry: A full dependency manager that handles venv creation, dependency resolution, and package publishing from a single pyproject.toml. Standard in many modern Python projects.
pyproject.toml: The modern replacement for setup.py, requirements.txt, and tool configs. PEP 517/518 define it as the standard build system interface. Most new Python tooling reads from pyproject.toml.