# Python Project Structure

Standardized Python project structure with uv, Taskfile, direnv, and consistent tooling.

Category: Technical · Status: Production
## Problem

Python projects lack consistent structure across Ontopix repositories, leading to:
- Onboarding friction: Developers must learn each project's unique structure
- AI agent confusion: Agents cannot reliably navigate unfamiliar project layouts
- Environment management inconsistency: Different projects use different venv strategies
- CI/CD fragmentation: Duplicated workflow logic instead of standardized patterns
- Missing contracts: No guaranteed operational interface across Python projects
## Context

Use this pattern when:
- Creating a new Python project (library, application, or CLI tool)
- Migrating existing Python projects to Ontopix standards
- Working in monorepo or standalone repository contexts
- Building applications, libraries, or MCP servers in Python
Don't use when:
- One-off scripts that don't warrant project structure
- Projects with strong external framework conventions (e.g., Django projects may adapt)
- Non-Python projects
## Solution

Every Python project MUST follow a standardized structure with:
- uv for dependency management, virtual environments, and Python version management
- Taskfile as the operational contract
- direnv for automatic environment activation
- ruff + mypy for linting and type checking
- Standardized directory layout for sources, tests, and configuration
### Directory Structure

```text
my-project/
├── src/                    # Source code (src layout)
│   └── my_package/         # Main package
│       ├── __init__.py
│       ├── _internal.py    # Internal modules (underscore prefix)
│       └── public_api.py   # Public API modules
├── tests/                  # Test suite
│   ├── unit/               # Unit tests
│   ├── integration/        # Integration tests (optional)
│   ├── e2e/                # End-to-end tests (optional)
│   ├── fixtures/           # Reusable test data
│   └── conftest.py         # Shared pytest fixtures
├── pyproject.toml          # PEP 621 project metadata + tool configuration
├── uv.lock                 # Locked dependencies
├── Taskfile.yaml           # Operational contract
├── .env.example            # Environment variable template
├── .python-version         # Python version pin (managed by uv)
│   # Local files (gitignored, not committed):
│   # .envrc                # direnv environment activation (local)
├── .gitignore              # Git ignore patterns
├── README.md               # Human documentation
├── AGENTS.md               # AI agent entrypoint
└── .venv/                  # Virtual environment (gitignored)
```
### Project Type Variations

#### Library (Publishable Package)

```text
my-library/
├── src/
│   └── my_library/         # Package name matches distribution
│       ├── __init__.py     # Public API exports via __all__
│       ├── models.py       # Public: Data models
│       ├── client.py       # Public: Client interface
│       └── _helpers.py     # Private: Internal utilities
├── tests/
│   ├── unit/
│   ├── integration/
│   └── e2e/                # End-to-end tests (optional)
├── pyproject.toml          # [project.scripts] optional
└── ...
```
Library-specific considerations:

- Use the `src/` layout to prevent import confusion during development
- Export the public API explicitly via `__all__` in `__init__.py`
- Follow Python Module Naming for internal modules
#### Application (CLI or Service)

```text
my-app/
├── src/
│   └── my_app/
│       ├── __init__.py
│       ├── cli/            # CLI entrypoints (if CLI app)
│       │   ├── __init__.py
│       │   └── main.py     # Typer/Click app
│       ├── core/           # Business logic
│       └── _internal/      # Internal utilities
├── tests/
│   ├── unit/
│   ├── integration/
│   └── e2e/                # End-to-end tests (optional)
├── .infra/                 # Infrastructure (if applicable)
├── pyproject.toml          # [project.scripts] required for CLI
└── ...
```
Application-specific considerations:

- CLI apps MUST define an entry point in `[project.scripts]`
- Services MAY include an `.infra/` directory for Terraform deployments
- Follow Infrastructure Organization for deployment
## Implementation

### Step 1: Initialize Project with uv

```bash
# Create project directory
mkdir my-project && cd my-project

# Initialize with uv (creates pyproject.toml + src layout)
uv init --lib --name my-package --python ">=3.11"

# Pin Python version
uv python pin 3.11

# Create test directories
mkdir -p tests/{unit,integration,fixtures}
touch tests/conftest.py
```
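Step 1 leaves `tests/conftest.py` empty; a minimal shared fixture might look like the following sketch (the fixture name and data are illustrative):

```python
# tests/conftest.py — fixtures defined here are visible to every test
# in the tests/ tree without an import.
import pytest

@pytest.fixture
def sample_record() -> dict[str, str]:
    # Small inline test data; larger payloads belong in tests/fixtures/.
    return {"id": "rec-001", "name": "example"}
```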
### Step 2: Configure pyproject.toml

```toml
[project]
name = "my-package"
version = "0.1.0"
description = "Package description"
readme = "README.md"
authors = [
    { name = "Ontopix Engineering", email = "engineering@ontopix.com" },
]
license = { text = "MIT" }
requires-python = ">=3.11"
dependencies = [
    # Add project dependencies here
]

# For CLI applications only:
# [project.scripts]
# my-command = "my_package.cli.main:app"

[dependency-groups]
dev = [
    "pytest>=8.0",
    "pytest-cov>=4.0",
    "ruff>=0.4",
    "mypy>=1.10",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["src/my_package"]

# ─────────────────────────────────────────────────────────────
# Tool Configuration
# ─────────────────────────────────────────────────────────────

[tool.ruff]
target-version = "py311"
line-length = 100
src = ["src", "tests"]

[tool.ruff.lint]
select = [
    "E",   # pycodestyle errors
    "W",   # pycodestyle warnings
    "F",   # Pyflakes
    "I",   # isort
    "B",   # flake8-bugbear
    "C4",  # flake8-comprehensions
    "UP",  # pyupgrade
    "ARG", # flake8-unused-arguments
    "SIM", # flake8-simplify
]
ignore = [
    "E501", # line too long (handled by formatter)
    "B008", # function calls in defaults (common in FastAPI/Typer)
]

[tool.ruff.lint.isort]
known-first-party = ["my_package"]

[tool.mypy]
python_version = "3.11"
strict = true
warn_return_any = true
warn_unused_ignores = true
disallow_untyped_defs = true

[tool.pytest.ini_options]
testpaths = ["tests"]
pythonpath = ["src"]
addopts = "-v --tb=short"
markers = [
    "integration: marks tests as integration tests (deselect with '-m \"not integration\"')",
]

[tool.coverage.run]
source = ["src/my_package"]
branch = true

[tool.coverage.report]
exclude_lines = [
    "pragma: no cover",
    "if TYPE_CHECKING:",
    "raise NotImplementedError",
]
```
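The `integration` marker registered under `[tool.pytest.ini_options]` is applied per test; a minimal sketch (hypothetical file `tests/integration/test_example.py`):

```python
# Hypothetical tests/integration/test_example.py
import pytest

@pytest.mark.integration  # deselect in fast runs: pytest -m "not integration"
def test_round_trip() -> None:
    # A real integration test would call an external service here;
    # this trivial assertion only demonstrates the marker.
    assert 1 + 1 == 2
```

Because the marker is declared in `markers = [...]`, pytest will not warn about it being unknown.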
### Step 3: Create Taskfile.yaml

```yaml
version: '3'

# Python Project - Following Ontopix Taskfile Contract

vars:
  PACKAGE: my_package

tasks:
  default:
    desc: Show available tasks
    cmds:
      - task --list

  # ─────────────────────────────────────────────────────────────
  # Development Operations (dev:*)
  # ─────────────────────────────────────────────────────────────

  dev:install:
    desc: Install dependencies with uv
    cmds:
      - uv sync
    status:
      - test -d .venv

  dev:shell:
    desc: Activate virtual environment
    cmds:
      - echo "Run 'source .venv/bin/activate' or use direnv"

  dev:run:
    desc: Run application (customize as needed)
    cmds:
      - uv run python -m {{.PACKAGE}}
      # For CLI apps: uv run my-command --help

  dev:update:
    desc: Update dependencies to latest compatible versions
    cmds:
      - uv lock --upgrade
      - uv sync

  # ─────────────────────────────────────────────────────────────
  # Testing Operations (test:*)
  # ─────────────────────────────────────────────────────────────

  test:all:
    desc: Run all tests
    cmds:
      - uv run pytest

  test:unit:
    desc: Run unit tests only (fast)
    cmds:
      - uv run pytest tests/unit -m "not integration"

  test:integration:
    desc: Run integration tests (may require external services)
    cmds:
      - uv run pytest tests/integration -m integration
    preconditions:
      - sh: "test -f .env || test -n \"$CI\""
        msg: "Create .env file or run in CI environment"

  test:coverage:
    desc: Run tests with coverage report
    cmds:
      - uv run pytest --cov={{.PACKAGE}} --cov-report=term-missing --cov-report=html

  test:single:
    desc: "Run a single test file (usage: task test:single TEST=test_example.py)"
    cmds:
      - uv run pytest tests/{{.TEST}} -v
    preconditions:
      - sh: test -n "{{.TEST}}"
        msg: "TEST variable required (e.g., TEST=unit/test_example.py)"

  # ─────────────────────────────────────────────────────────────
  # Linting Operations (lint:*)
  # ─────────────────────────────────────────────────────────────

  lint:check:
    desc: Check code quality without fixing
    cmds:
      - uv run ruff check src tests

  lint:fix:
    desc: Fix auto-fixable lint issues
    cmds:
      - uv run ruff check --fix src tests

  lint:format:
    desc: Format code with ruff formatter
    cmds:
      - uv run ruff format src tests

  lint:typecheck:
    desc: Run mypy type checker
    cmds:
      - uv run mypy src

  lint:all:
    desc: Run all linting checks
    cmds:
      - task: lint:check
      - task: lint:typecheck

  # ─────────────────────────────────────────────────────────────
  # Validation Operations (validate:*)
  # ─────────────────────────────────────────────────────────────

  validate:all:
    desc: Run all checks (lint + typecheck + tests)
    cmds:
      - task: lint:all
      - task: test:all

  ci:
    desc: Run CI pipeline locally (same as GitHub Actions)
    cmds:
      - task: lint:check
      - task: lint:typecheck
      - task: test:all

  # ─────────────────────────────────────────────────────────────
  # Build Operations (build:*)
  # ─────────────────────────────────────────────────────────────

  build:dist:
    desc: Build distribution packages (wheel + sdist)
    cmds:
      - uv build

  build:docker:
    desc: Build Docker image (if applicable)
    cmds:
      - docker build -t {{.PACKAGE}}:latest -f .infra/docker/Dockerfile .
    preconditions:
      - sh: test -f .infra/docker/Dockerfile
        msg: "Dockerfile not found at .infra/docker/Dockerfile"

  # ─────────────────────────────────────────────────────────────
  # Utility Tasks
  # ─────────────────────────────────────────────────────────────

  clean:
    desc: Clean build artifacts and caches
    cmds:
      - rm -rf dist build *.egg-info .pytest_cache .mypy_cache .ruff_cache .coverage htmlcov
      - find . -type d -name __pycache__ -exec rm -rf {} + 2>/dev/null || true

  info:
    desc: Show project information
    cmds:
      - echo "Package: {{.PACKAGE}}"
      - uv run python --version
      - echo "Lock file: $(test -f uv.lock && echo 'present' || echo 'missing')"
```
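`task dev:run` invokes `python -m my_package`, which requires a `__main__.py` in the package. A minimal sketch (the printed text and exit-code convention are illustrative):

```python
# src/my_package/__main__.py — executed by `python -m my_package`.
def main() -> int:
    # Real applications would dispatch to the CLI or service entry point here.
    print("my_package: ok")
    return 0

if __name__ == "__main__":
    # Propagate the return value as the process exit code.
    raise SystemExit(main())
```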
### Step 4: Configure Environment Files

`.envrc` (local developer file, gitignored - for direnv activation):

```bash
# Load environment variables from .env file if it exists
dotenv_if_exists .env

# Activate uv virtual environment if it exists
if [[ -f .venv/bin/activate ]]; then
  source .venv/bin/activate
fi
```

`.env.example` (template for required variables):

```bash
# My Package Environment Variables
# Copy to .env and fill in values

# Required
# MY_API_KEY=your_api_key_here

# Optional
# MY_API_URL=https://api.example.com
# DEBUG=false
```
`.gitignore` (Python-specific):

```gitignore
# Virtual Environment
.venv/

# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# Testing
.pytest_cache/
.coverage
htmlcov/
.tox/
.nox/

# Type checking
.mypy_cache/

# Linting
.ruff_cache/

# IDE
.idea/
.vscode/
*.swp
*.swo

# Environment
.env
.envrc

# Local developer files
.local/
```
### Step 5: Set Up CI Workflow

Create `.github/workflows/ci.yml`:

```yaml
name: CI

on:
  push:
    branches: [master]
    paths:
      - 'src/**'
      - 'tests/**'
      - 'pyproject.toml'
      - 'uv.lock'
      - '.github/workflows/ci.yml'
  pull_request:
    paths:
      - 'src/**'
      - 'tests/**'
      - 'pyproject.toml'
      - 'uv.lock'
      - '.github/workflows/ci.yml'
  workflow_dispatch:

jobs:
  lint-and-test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5

      - name: Install Task
        uses: arduino/setup-task@v2
        with:
          version: '3.x'

      - name: Install dependencies
        run: task dev:install

      - name: Lint check
        run: task lint:check

      - name: Type check
        run: task lint:typecheck

      - name: Run tests
        run: task test:all
```
### Step 6: Create AGENTS.md

````markdown
# [Project Name] - Agent Context

## Project Overview

Brief description of what this project does.

## Development Setup

### Prerequisites

- Python 3.11+ (managed by uv)
- uv (`curl -LsSf https://astral.sh/uv/install.sh | sh`)
- direnv (`brew install direnv`)
- Task (`brew install go-task/tap/go-task`)

### First-Time Setup

```bash
# Enter directory (direnv activates automatically if configured)
cd my-project

# Copy environment template
cp .env.example .env

# Allow direnv
direnv allow

# Install dependencies (also installs Python if needed)
task dev:install
```

## Development Patterns

- uv: Dependency management, venv in `.venv/`, Python version management
- direnv: Environment auto-activates via `.envrc`
- Taskfile: Namespaced tasks following Ontopix contract

## Taskfile Commands (Ontopix Contract)

### Development

```bash
task dev:install   # Install dependencies with uv
task dev:run       # Run application
```

### Testing

```bash
task test:all       # Run all tests
task test:unit      # Run unit tests only
task test:coverage  # Run with coverage report
```

### Linting

```bash
task lint:check      # Check code quality
task lint:fix        # Fix auto-fixable issues
task lint:typecheck  # Run mypy type checker
task lint:all        # Run all linting checks
```

### Validation

```bash
task validate:all  # Run all checks (lint + tests)
task ci            # Run CI pipeline locally
```

## Key Files

| File | Purpose |
|---|---|
| `src/my_package/` | Main package source code |
| `tests/unit/` | Unit tests |
| `tests/integration/` | Integration tests |
| `tests/fixtures/` | Reusable test data |

## Environment Variables

| Variable | Required | Purpose |
|---|---|---|
| `MY_API_KEY` | Yes | API authentication |

## Related Patterns
````
## Monorepo Considerations

When using this pattern in a monorepo, uv workspaces provide native support for multi-project setups.

### Workspace Configuration

In the root `pyproject.toml`:

```toml
[tool.uv.workspace]
members = [
    "projects/agent/*",
    "projects/client/*",
    "projects/mcp/*",
]
```
### Project Placement

```text
monorepo/
├── projects/
│   ├── agent/              # AI agents
│   │   └── my-agent/       # Each agent is a Python project
│   ├── client/             # API clients
│   │   └── my-client/      # Each client is a Python project
│   └── mcp/                # MCP servers
│       └── my-mcp/         # Each MCP server is a Python project
├── pyproject.toml          # Workspace definition
├── uv.lock                 # Single lock file for the workspace
├── Taskfile.yaml           # Root orchestration only
├── AGENTS.md               # Points to project-specific AGENTS.md
└── .github/workflows/
    └── ci-my-project.yml   # Per-project CI with path triggers
```
### CI with Path-Based Triggers

```yaml
# .github/workflows/ci-my-project.yml
on:
  push:
    paths:
      - 'projects/client/my-project/**'
      - '.github/workflows/ci-my-project.yml'
  pull_request:
    paths:
      - 'projects/client/my-project/**'

jobs:
  test:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: projects/client/my-project
    # ... rest of workflow
```
## Applies Principles

- Consistency: Same structure across all Python projects
- Operational Contract: Taskfile provides discoverable interface
- AI Agent Compatibility: AGENTS.md enables automated operations
- Local-First Development: In-project venv with direnv activation
- CI/CD Alignment: Same commands locally and in CI
## Consequences

### Benefits

- Immediate operability: `task --list` shows all operations
- Consistent developer experience: Same patterns everywhere
- AI agent friendly: Standard structure enables automation
- Fast environment setup: direnv + in-project venv
- Type safety: mypy strict mode by default
- Code quality: ruff catches common issues
- Fast CI: uv resolves and installs dependencies 10-100x faster than Poetry
- Native monorepo support: `[tool.uv.workspace]` eliminates path dependency workarounds
- Standards compliance: PEP 621 metadata is portable across tools
### Trade-offs

- Tool prescription: uv/ruff/mypy required
- Initial setup cost: More boilerplate than minimal projects
- src layout: Slightly more nesting than flat layout
## Migration from Poetry

For projects currently using Poetry:

- Convert `[tool.poetry]` metadata to the PEP 621 `[project]` section
- Convert `[tool.poetry.dependencies]` to the `dependencies` array
- Convert `[tool.poetry.group.dev.dependencies]` to `[dependency-groups]`
- Replace `poetry.lock` with `uv.lock` (run `uv lock`)
- Update Taskfile commands from `poetry run` to `uv run`
- Update CI workflows to install uv instead of Poetry
- Remove `poetry.toml` and Poetry configuration files
## When to Adapt

- Django projects: Follow Django conventions, adapt Taskfile tasks
- FastAPI services: Add a `dev:serve` task, adjust structure for routes
- Lambda functions: May use a simplified structure without the src/ layout
## Related Patterns

- Python Module Naming - Internal module conventions
- Python Test Organization - Test directory structure
- Taskfile as Contract - Operational interface
- Repository Structure - Required files
- GitHub Actions Workflows - CI/CD patterns
- AWS CodeArtifact with GitHub Actions - Publishing
## References

- ADR-0008 — Python Dependency Management with uv — decision record
- uv Documentation
- PEP 621 — Storing project metadata in pyproject.toml
- Taskfile Documentation
- ruff Documentation
- mypy Documentation
- direnv Documentation
Last Updated: 2026-03-13
Applies to: All Python projects (libraries, applications, CLI tools)