Packages and Modules

Medium · 35 min read

Modules

Why Modules & Packaging Matter

The Problem: Single-file scripts cap out around 500 lines. Past that, navigation, testing, and reuse fall apart.

The Solution: Python's module system, packages, and pip turn your project into a reusable library that can be tested, versioned, published, and depended on.

Real Impact: The modern pyproject.toml + venv + uv/pip stack is the foundation every serious Python project rests on. Get it right early and everything you build afterward benefits.

Real-World Analogy

Think of packages as a layered filing system:

  • Module = a single folder of related papers
  • Package = a labelled filing cabinet containing multiple folders
  • __init__.py = the index card at the front of the cabinet
  • pyproject.toml = the official catalog entry — what the cabinet is, what it depends on
  • Virtual environment = a desk that only sees the cabinets you brought with you

Any .py file is a module. Importing runs the file top-to-bottom once per process and caches it in sys.modules.

# math_utils.py
def square(n):
    return n * n

PI = 3.14159

# main.py
import math_utils
print(math_utils.square(5), math_utils.PI)

from math_utils import square, PI
from math_utils import square as sq      # alias
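The once-per-process caching mentioned above can be observed directly. This sketch uses the stdlib json module so it runs anywhere:

```python
import importlib
import sys

import json                        # first import: runs the module body, caches it

print("json" in sys.modules)       # True — the module object is now cached
print(__import__("json") is json)  # True — a second import is a cache lookup,
                                   # not a re-execution of the file

importlib.reload(json)             # the only way to force the body to run again
```

This is why module-level side effects (prints, connections) fire only once per process, no matter how many files import the module.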

The if __name__ == "__main__" Idiom

# When run directly: __name__ is '__main__'
# When imported:     __name__ is the module name

def main():
    ...

if __name__ == "__main__":
    main()

Packages

A package is a directory containing an __init__.py file (even an empty one), and sub-packages nest. Since Python 3.3, a directory without __init__.py also imports as a "namespace package", but an explicit __init__.py remains the norm for regular packages.

myapp/
├── __init__.py        # marks the directory as a package
├── core.py
├── cli.py
└── db/
    ├── __init__.py
    ├── models.py
    └── queries.py

# Absolute imports
from myapp.core import run
from myapp.db.models import User

# Relative imports — only allowed inside packages
from .core import run          # same package
from ..db import models         # parent package

What goes in __init__.py

# Common patterns:

# 1. Empty — just marks the package

# 2. Re-export public API
from .core import run, Pipeline
from .db.models import User
__all__ = ["run", "Pipeline", "User"]

# 3. Set package version
__version__ = "1.0.0"

The Import System

How Python Finds Modules

Python searches the directories in sys.path, in order. The first match wins.

import sys
print(sys.path)
# [ '',                              # current directory (script dir)
#   '/usr/lib/python3.12',          # standard library
#   '/usr/lib/python3.12/lib-dynload',
#   '/path/to/.venv/lib/python3.12/site-packages' ]

⚠️ Don't name modules after built-ins or stdlib

A file called random.py in your project will shadow Python's random module and cause confusing import errors. Same for email.py, json.py, logging.py, etc.
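When an import behaves strangely, you can ask Python which file it actually loaded; the printed path below is illustrative and varies by system:

```shell
$ python -c "import random; print(random.__file__)"
/usr/lib/python3.12/random.py    # a stdlib path is good; a path inside your
                                 # project means a local file is shadowing it
```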

Common Import Pitfalls
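The classic pitfall is a circular import: two modules that import each other at the top level fail with "cannot import name ... from partially initialized module". The sketch below reproduces the error with two hypothetical throwaway modules, a and b, written to a scratch directory, then applies the lazy-import fix:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Two modules that import each other at the top level (hypothetical names a/b).
pkg = Path(tempfile.mkdtemp())
(pkg / "a.py").write_text("from b import helper\ndef setup(): return helper()\n")
(pkg / "b.py").write_text("from a import setup\ndef helper(): return 42\n")

broken = subprocess.run([sys.executable, "-c", "import a"],
                        cwd=pkg, capture_output=True, text=True)
print("ImportError" in broken.stderr)   # True: partially initialized module

# Fix (lazy import): move b's import inside the function that needs it,
# so it resolves at call time, after both modules have finished loading.
(pkg / "b.py").write_text(
    "def helper():\n"
    "    from a import setup   # lazy: resolved at call time\n"
    "    return 42\n"
)
fixed = subprocess.run([sys.executable, "-c", "import a; print(a.setup())"],
                       cwd=pkg, capture_output=True, text=True)
print(fixed.stdout.strip())             # 42
```

The other two standard fixes are restructuring (move the shared code into a third module both can import) and typing-only imports guarded by `if TYPE_CHECKING:`.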

pip and Virtual Environments

pip basics

$ pip install fastapi             # install latest
$ pip install "fastapi>=0.100"    # version constraint
$ pip install -r requirements.txt # from file
$ pip install -e .                 # editable install (for local packages)
$ pip list                         # show installed
$ pip show fastapi                 # show metadata
$ pip uninstall fastapi
$ pip freeze > requirements.txt    # snapshot exact versions

venv per project

$ python3 -m venv .venv
$ source .venv/bin/activate         # macOS/Linux
$ .venv\Scripts\activate            # Windows
(.venv) $ pip install requests rich
(.venv) $ pip freeze > requirements.txt
(.venv) $ deactivate

Modern Alternative: uv

uv is a Rust-based pip+venv replacement that's 10-100× faster.

$ uv venv
$ uv pip install fastapi
$ uv pip sync requirements.txt

pyproject.toml — The Modern Standard

PEP 621 defines a unified format for project metadata, dependencies, and tool configuration. One file replaces setup.py, setup.cfg, and most tool config files.

[project]
name = "myapp"
version = "0.1.0"
description = "My awesome app"
requires-python = ">=3.10"
dependencies = [
    "fastapi>=0.100",
    "pydantic>=2",
]

[project.optional-dependencies]
dev = [
    "pytest",
    "ruff",
    "mypy",
]

[project.scripts]
myapp = "myapp.cli:main"           # installs a `myapp` shell command

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.ruff]
line-length = 100

[tool.pytest.ini_options]
testpaths = ["tests"]

$ pip install -e ".[dev]"          # install package + dev extras
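For context, a minimal myapp/cli.py that the [project.scripts] entry above could point at might look like this; the greeting logic is a placeholder. Note that the installed `myapp` command calls main() directly, so the function takes an optional argv for testability:

```python
# myapp/cli.py — a minimal main() for the `myapp = "myapp.cli:main"` entry point
import argparse


def main(argv=None) -> None:
    """Entry point invoked by the installed `myapp` shell command."""
    parser = argparse.ArgumentParser(prog="myapp", description="My awesome app")
    parser.add_argument("name", nargs="?", default="world", help="who to greet")
    args = parser.parse_args(argv)   # argv=None falls back to sys.argv[1:]
    print(f"hello, {args.name}")
```

After `pip install -e .`, running `myapp Ada` on the command line prints `hello, Ada`.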

Publishing to PyPI

$ pip install build twine
$ python -m build                 # produces dist/*.whl and dist/*.tar.gz
$ twine upload --repository testpypi dist/*  # test first
$ twine upload dist/*              # publish to real PyPI

Test on TestPyPI first

Use https://test.pypi.org to validate your package before pushing to the real PyPI. You can delete a release, but you can never re-upload the same version number, so treat every upload as permanent.

🎯 Practice Exercises

Exercise 1: Build a package

Create myapp/ with submodules core and cli. Add a __main__.py so python -m myapp works.

Exercise 2: Reproducible env

Start a project with venv + requirements.txt. Add 3 dependencies, freeze, deactivate, recreate from scratch.

Exercise 3: pyproject.toml

Convert a project with requirements.txt to pyproject.toml. Define a [project.scripts] entry point.

Exercise 4: Circular import fix

Construct a small circular-import error, then resolve it three ways: lazy import, restructure modules, or use a typing-only import.