You've got tuples, sets, flexible functions, pattern matching — the whole toolkit. But the inventory script is 312 lines now. What happens when you search for the normalize function?
I scroll for three minutes and still can't find it. I know I wrote it. It's somewhere in there mixed in with the menu logic and the report formatter and four helper functions that help the other helpers.
You've hit the wall. Not a bug — code that outgrew its container. One enormous room with no doors. The solution is to stop thinking in scripts and start thinking in modules.
So instead of one 312-line file, I'd split by concern? Data loading in one file, reporting in another?
Exactly. And Python's import system is how those files talk to each other. When you write import inventory_loader, Python runs that file, makes its functions available, and caches it so importing it twice doesn't double-execute anything. That's week four, day one.
What about the if __name__ == '__main__' thing? I've seen it everywhere but I never fully understood it.
That's day two. It's the pattern that lets a file be both a reusable module and a standalone script at the same time — the file knows whether someone ran it directly or imported it from elsewhere. By the end of the week you'll have a proper package structure for the inventory tool.
With real folders and __init__.py and everything?
And a virtual environment so your dependencies don't bleed into other projects. You would not have asked that question four weeks ago.
Every experienced Python developer has written a script that eventually became unmaintainable. The script grew because it kept working — each new feature got appended rather than organized, and at some point the file became a liability. Week four is about escaping that pattern before it becomes permanent.
Modules are ordinary Python files. When you write import utils, Python finds utils.py, executes every top-level statement in it immediately, and makes its names available under the utils. prefix. Execution happens once — subsequent imports use Python's module cache. This means top-level side effects (prints, file reads, network calls) run at import time, which is why production modules keep their top level clean: only definitions, no actions.
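The run-once behavior is easy to see for yourself. This sketch builds a throwaway module on disk (the name utils_demo and its contents are invented for illustration) and imports it twice; the top-level print fires only on the first import, because the second hits the module cache in sys.modules.

```python
import sys
import tempfile
from pathlib import Path

# Write a tiny module whose top level has a visible side effect.
tmp = Path(tempfile.mkdtemp())
(tmp / "utils_demo.py").write_text(
    "print('utils_demo: top level running')\n"
    "loaded_count = 1\n"
    "def double(x):\n"
    "    return x * 2\n"
)
sys.path.insert(0, str(tmp))

import utils_demo   # executes every top-level statement: the print runs here
import utils_demo   # cache hit: nothing re-executes, no second print

assert utils_demo.double(21) == 42
assert "utils_demo" in sys.modules   # this dict is the module cache
```

The print statement in utils_demo is exactly the kind of top-level side effect the paragraph warns about: it runs the moment anyone imports the module, whether they wanted output or not.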
from module import name and import module as alias give you fine-grained control over what you pull in and how you name it. Both work. The difference is clarity vs. brevity: from reporter import generate_report removes the prefix, import analyzer as az shortens a long name, and combining both in one file is perfectly valid. Python doesn't care which form you use — style guides have opinions but none of them are enforced by the interpreter.
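Both forms side by side, using only standard-library modules so the sketch stands alone:

```python
from math import sqrt        # pull one name in, no prefix needed
import statistics as stats   # alias a longer module name

assert sqrt(16) == 4.0              # bare name, thanks to from-import
assert stats.mean([2, 4, 6]) == 4   # prefixed by the alias
```

Mixing the two in one file, as here, is exactly the "perfectly valid" combination the paragraph describes.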
if __name__ == '__main__' is the dual-mode pattern. Python sets __name__ to '__main__' when a file is run directly and to the module's name when it's imported. Wrapping your script's entry point in this check makes the file safe to import: another module can pull in your functions without triggering your menu loop or your test suite.
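A minimal dual-mode file might look like this (generate_report and main are hypothetical names, not the course's actual inventory code):

```python
def generate_report(items):
    # Reusable logic: safe for other modules to import and call.
    return f"{len(items)} items in inventory"

def main():
    # Entry point: interactive behavior lives here, not at top level.
    print(generate_report(["bolt", "nut", "washer"]))

if __name__ == "__main__":
    # True only when run as `python report.py`; False when imported,
    # so an importer gets generate_report without triggering main().
    main()
```

Run directly, the file prints its report; imported, it quietly offers generate_report and nothing else happens.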
Packages are directories with an __init__.py file. They let you organize related modules under a single importable name — from inventory_tool import loader rather than a flat file. The __init__.py runs when the package is first imported and can expose a curated public API, hiding internal implementation details from callers.
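As a sketch, here is a tiny package built on the fly. The layout and names (inventory_tool, loader, load_items) are illustrative, not the course's real structure; the point is that __init__.py re-exports load_items so callers never touch the loader module directly.

```python
import sys
import tempfile
from pathlib import Path

# inventory_tool/
#     __init__.py   <- curated public API
#     loader.py     <- internal implementation
root = Path(tempfile.mkdtemp())
pkg = root / "inventory_tool"
pkg.mkdir()
(pkg / "__init__.py").write_text(
    "from inventory_tool.loader import load_items\n"
)
(pkg / "loader.py").write_text(
    "def load_items():\n"
    "    return ['bolt', 'nut']\n"
)

sys.path.insert(0, str(root))
from inventory_tool import load_items   # exposed via __init__.py

assert load_items() == ["bolt", "nut"]
```

Because __init__.py chooses what to export, loader.py stays an implementation detail: its name can change later without breaking anyone who imports from inventory_tool.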
Virtual environments and pip close the week. A virtual environment is an isolated Python installation for a single project. Packages installed with pip go into the venv, not the system Python — so your inventory tool's dependencies don't conflict with other projects on your machine. requirements.txt documents what the project needs so any team member can recreate the environment with one command. This is the infrastructure every professional Python project runs on.
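The whole workflow fits in a few commands. This is a sketch of the usual POSIX shell session; the .venv directory name is a common convention, not a requirement.

```shell
python3 -m venv .venv            # create an isolated environment for this project
. .venv/bin/activate             # activate it (on Windows: .venv\Scripts\activate)
pip freeze > requirements.txt    # snapshot the installed dependencies
# A teammate recreates the same environment with:
#   python3 -m venv .venv && . .venv/bin/activate && pip install -r requirements.txt
```

Once activated, pip and python resolve to the copies inside .venv, which is why installs stop touching the system Python.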