You've been writing import json and import csv since the first day you touched warehouse data. What do you think actually happens when Python runs that line?
Python loads the module? It becomes available under that name? I've never thought about the mechanics. I just know it works.
That's more honest than most people admit. Let me give you the mental model. Think about Diane's supplier catalog — the big binder in the front office with every approved vendor and every product they stock.
The one nobody updates but everyone uses when a new shipment question comes up?
The very one. You don't manufacture everything the warehouse sells. When you need zip ties, you don't set up a zip-tie production line — you look up the supplier in the catalog and order them. import is exactly that. You're ordering json from the Python standard library's catalog.
So import json is calling the supplier and saying "send me the json module." And json becomes a name I can use — like a local variable pointing at the whole module.
Exactly right. json in your local scope is a module object. It has attributes — json.loads, json.dumps, json.JSONDecodeError — and you access them with dot notation the same as any other attribute access in Python.
A module is just an object? Like every other object in Python?
Every other object. A module is an instance of ModuleType. Its attributes are whatever names are defined in that file. When you write import json, Python finds the file, runs it, and hands you back the resulting module object. Which means you can inspect it like anything else:
import json
print(type(json)) # <class 'module'>
print(json.__name__) # json
print(json.__file__) # /path/to/json/__init__.py
print(dir(json)) # all names defined inside the module
Calling dir() on a module. That would have saved me so many trips to the documentation — I could just see what's available at the REPL.
One of the most useful REPL tricks there is. Now — the catalog has subfolders. Big suppliers organize their products: electronics in section 3A, fasteners in 7B. Python packages work the same way. The dot in import os.path means "inside the os package, give me the path submodule."
So os is a package — a folder — and path is a module inside it?
A package is a directory with an __init__.py file in it. The dot in os.path is a path separator — folder navigation, not method access. Same dot you use to access attributes in regular code, but here it's navigating the file system hierarchy. Two different jobs, identical syntax.
In import os.path the dots navigate the file system. In os.path.join("a", "b") after the import, the dots are attribute lookups. Same character, different meaning depending on context.
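That two-jobs-one-syntax point fits in a few lines (the file names here are just illustrations):

```python
import os.path  # this dot navigates the package hierarchy: the os package contains path

# After the import, dots are ordinary attribute lookups on module objects
p = os.path.join("data", "inventory.csv")  # hypothetical path components
print(p)
```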
That sharp distinction is what most people never articulate. Now — three import forms you'll see everywhere, with real differences:
# Form 1: import the module, access via its name
import json
data = json.loads('{"sku": "SKU-1001"}')
# Form 2: import the module under an alias
import json as j
data = j.loads('{"sku": "SKU-1001"}')
# Form 3: import a specific name from the module
from json import loads
data = loads('{"sku": "SKU-1001"}')
Form 1 is the safest — you always know where loads came from. Form 3 is the shortest — but if you import loads from multiple modules you'll have a collision.
from X import Y puts Y directly in your local namespace. Two modules both exporting loads — the second import wins and silently overwrites the first. Form 1 avoids this entirely because the module name is the namespace. Form 3 is fine for commonly understood names — from typing import Optional won't confuse anyone.
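Here's a concrete sketch of that collision, using two standard-library modules that both export a load function:

```python
import json
import pickle

from json import load
from pickle import load  # second import silently rebinds the name 'load'

# 'load' now points at pickle's version; json's is shadowed with no warning
print(load is pickle.load)  # True
print(load is json.load)    # False
```

With Form 1 (import json; import pickle), json.load and pickle.load can never collide, because each module name acts as its own namespace.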
What about aliases? I've seen import numpy as np and import pandas as pd everywhere.
Convention for heavy-use libraries with long names. numpy is five keystrokes you'd type hundreds of times a day, so the community standardized np. The alias is just a name — you could write import json as banana and Python wouldn't care.
I'm writing import json as banana as soon as this session ends.
Put it in production and I will disavow knowing you. Now — how Python decides where to look for a module:
import sys
print(sys.path) # ordered list of directories Python searches
A search path? It looks in each directory in order until it finds a match?
First match wins. Search order: current working directory, PYTHONPATH environment variable, standard library directories, site-packages — where third-party installs like requests or pandas live. import json finds it in the standard library. import my_inventory_module finds a file called my_inventory_module.py in the current directory.
So I could create inventory_utils.py in my project folder, write my functions in it, and just do import inventory_utils? No configuration, no registry — just the file?
Just the file. The filename is the module name. That's the entire mechanic. Professional projects add packages and virtual environments for organization, but the fundamental rule is: file exists, file is importable. You've been writing modules every time you created a .py file — you just hadn't thought of them that way.
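You can watch "file exists, file is importable" happen end to end. This sketch writes a throwaway module to a temporary directory and imports it; inventory_utils and restock_level are made-up names for the demo:

```python
import os
import sys
import tempfile

# Write a tiny module to a temp directory — the file IS the module
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "inventory_utils.py"), "w") as f:
    f.write("def restock_level(on_hand, target):\n")
    f.write("    return max(target - on_hand, 0)\n")

sys.path.insert(0, tmp)   # put the directory on Python's search path
import inventory_utils    # the filename (minus .py) is the module name

print(inventory_utils.restock_level(3, 10))  # 7
```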
The Monday report script was a module. I could have imported its functions from another script instead of copy-pasting them.
You absolutely could have. And that's exactly where Week 4 is heading — organizing real projects across multiple files, with clear responsibilities, so the 500-line blob never happens.
Including protecting future-me from past-me's code. Past-me writes some interesting things.
Tomorrow you'll see something in every Python script you've ever opened but never questioned: if __name__ == "__main__":. After today — after understanding that a module is just a file Python runs — you're ready to understand exactly what problem that conditional solves.
I've seen that pattern dozens of times. I always skipped past it. I assumed it was required boilerplate that I didn't need to understand.
Boilerplate, yes. But boilerplate that does something specific. Tomorrow's the day you stop skipping it.
When Python executes import json, it:
1. Searches sys.path (an ordered list of directories) for a file named json.py or a package named json/
2. Executes that file top to bottom
3. Binds the resulting module object to the name json
A module is an ordinary Python object — an instance of types.ModuleType. You can inspect it like any other object:
import json
print(type(json)) # <class 'module'>
print(json.__name__) # json
print(json.__file__) # /path/to/json/__init__.py
print(dir(json)) # all names defined inside the module
Three import forms:
import json # module object bound to name 'json'
import json as j # module object bound to alias 'j'
from json import loads # attribute 'loads' pulled into local namespace
from json import loads, dumps # multiple names at once
Dot notation in imports vs regular code:
import os.path — dots are path separators navigating the package hierarchy (file system)
os.path.join("a", "b") — dots are attribute lookups on objects
These are visually identical but semantically different.
Module search order (sys.path):
1. Current working directory
2. Directories listed in the PYTHONPATH environment variable
3. Standard library directories
4. site-packages (where third-party installs like requests or pandas live)
First match wins.
A file named my_module.py in the current directory is importable as import my_module. No registration required — the filename is the module name.
Pitfall 1: from X import Y namespace collision. If two modules export the same name, the second from import silently overwrites the first. Use import X and access names as X.Y to avoid this.
Pitfall 2: Circular imports. Module A imports from module B, module B imports from module A. Both fail to load completely. Restructure by extracting shared code to a third module, or delay one import to function scope.
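The function-scope fix can be demonstrated with two throwaway modules written to a temporary directory; module_a and module_b are made-up names, and this is a sketch of the pattern rather than a recommended layout:

```python
import os
import sys
import tempfile

# module_a imports module_b at the top level; module_b defers its import
# of module_a into a function body, so the cycle never bites at load time
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "module_a.py"), "w") as f:
    f.write("import module_b\n")
    f.write("def greet():\n    return 'hello from a'\n")
with open(os.path.join(tmp, "module_b.py"), "w") as f:
    f.write("def call_a():\n")
    f.write("    import module_a  # runs at call time, not load time\n")
    f.write("    return module_a.greet()\n")

sys.path.insert(0, tmp)
import module_b
print(module_b.call_a())  # hello from a
```

If module_b also imported module_a at the top level, loading either one would trigger the other mid-initialization and you'd get a partially built module.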
Pitfall 3: Shadowing standard library modules. A file named json.py in your project directory shadows the standard library json. Python finds yours first via sys.path. Name project files descriptively to avoid this.
sys.modules cache: After a module is imported once, Python stores it in sys.modules. Subsequent import statements return the cached object instantly — modules are not re-executed. importlib.reload(module) forces a re-execution when needed.
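The cache is easy to observe directly:

```python
import sys
import json

# The first import stored the module object in the sys.modules cache
print("json" in sys.modules)  # True
cached = sys.modules["json"]

import json  # cache hit — the module file is NOT re-executed

print(json is cached)  # True: the exact same module object
```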
__all__: A list defined at module level that controls what from X import * exports. Without it, import * brings in everything not prefixed with _. With it, only listed names are exported.
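A small sketch of __all__ in action. catalog_demo and its functions are made up for the demo, and exec is used only because from X import * is legal solely at module level, so we run it in a fresh namespace to see what it binds:

```python
import os
import sys
import tempfile

# A throwaway module whose __all__ exports only 'lookup'
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "catalog_demo.py"), "w") as f:
    f.write("__all__ = ['lookup']\n")
    f.write("def lookup(sku):\n    return sku.upper()\n")
    f.write("def helper():\n    pass\n")
    f.write("def _internal():\n    pass\n")

sys.path.insert(0, tmp)
ns = {}
exec("from catalog_demo import *", ns)  # star-import into a fresh namespace
print("lookup" in ns)  # True — listed in __all__
print("helper" in ns)  # False — defined in the module but not exported
```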
Relative imports: Inside a package, from . import utils imports utils.py from the same directory. from .. import config imports from the parent package. Relative imports prevent accidental shadowing and make package internals self-contained.