email-classifier/docs/setup.md
Lennie S. 760b56bfd6 Add MkDocs documentation
Covers: overview, setup, API reference, configuration,
testing, deployment, and known quirks.
2026-04-09 20:24:49 +00:00

Setup & Installation

Prerequisites

  • Python 3.12+
  • uv package manager
  • An LLM backend (Ollama, LM Studio, MiniMax, OpenAI, or any OpenAI/Anthropic-compatible API)
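The first two prerequisites can be checked from the shell. This is a convenience sketch, not a script from the repository:

```shell
# Report whether Python 3.12+ and uv are available (convenience sketch,
# not part of the repository).
if python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 12) else 1)' 2>/dev/null; then
  PY_OK=yes
else
  PY_OK=no
fi

if command -v uv >/dev/null 2>&1; then
  UV_OK=yes
else
  UV_OK=no
fi

echo "python >= 3.12: $PY_OK, uv: $UV_OK"
```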

Quick Start

# Clone the repository
git clone https://git.danhenry.dev/daniel/email-classifier.git
cd email-classifier

# Install dependencies
uv sync

# Start the server
uv run uvicorn app.main:app --host 0.0.0.0 --port 7999

The API will be available at http://localhost:7999. Auto-generated API docs are at http://localhost:7999/docs (Swagger UI) and http://localhost:7999/redoc.
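To confirm the server came up, a quick probe of the Swagger UI page (assuming curl is installed) looks like this:

```shell
# Print the HTTP status of the Swagger UI page; expect 200 when the
# server is running, 000 if nothing is listening on port 7999.
STATUS=$(curl -s -o /dev/null -w '%{http_code}' http://localhost:7999/docs || true)
echo "Swagger UI status: $STATUS"
```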

Environment Variables

The service is configured entirely through environment variables. See Configuration for the full reference.

A minimal .env file for local development with Ollama (Ollama exposes an OpenAI-compatible endpoint, which is why LLM_PROVIDER is set to openai):

LLM_PROVIDER=openai
LLM_BASE_URL=http://localhost:11434/v1
LLM_API_KEY=none
LLM_MODEL=qwen2.5-7b-instruct.q4_k_m
LLM_TEMPERATURE=0.1
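Note that uvicorn does not read a .env file on its own; whether the app loads it internally depends on its settings code. One way to export the variables into the shell before starting the server is sketched below, using the values above:

```shell
# Write the minimal .env from the doc, then export its variables into the
# current shell (set -a marks every subsequently defined variable for export).
cat > .env <<'EOF'
LLM_PROVIDER=openai
LLM_BASE_URL=http://localhost:11434/v1
LLM_API_KEY=none
LLM_MODEL=qwen2.5-7b-instruct.q4_k_m
LLM_TEMPERATURE=0.1
EOF

set -a
. ./.env
set +a

echo "Provider: $LLM_PROVIDER, model: $LLM_MODEL"
```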

Using Docker

# Build the image
docker build -t email-classifier .

# Run the container
docker run -p 7999:7999 \
  -e LLM_PROVIDER=openai \
  -e LLM_BASE_URL=http://host.docker.internal:11434/v1 \
  -e LLM_API_KEY=none \
  -e LLM_MODEL=qwen2.5-7b-instruct.q4_k_m \
  email-classifier
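The same run can be captured in a compose file. The sketch below makes two assumptions not found in the repository: the service name and the build context. Also note that host.docker.internal resolves out of the box on Docker Desktop (macOS/Windows), while on Linux it needs the extra_hosts mapping shown:

```yaml
# docker-compose.yml — a sketch; the service name is a choice here,
# not taken from the repository.
services:
  email-classifier:
    build: .
    ports:
      - "7999:7999"
    environment:
      LLM_PROVIDER: openai
      LLM_BASE_URL: http://host.docker.internal:11434/v1
      LLM_API_KEY: none
      LLM_MODEL: qwen2.5-7b-instruct.q4_k_m
    extra_hosts:
      # Needed on Linux; Docker Desktop provides this hostname automatically.
      - "host.docker.internal:host-gateway"
```

With this in place, `docker compose up --build` replaces the two commands above.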

Dependency Management

This project uses uv for dependency management. Do not use pip directly.

# Add a new dependency
uv add <package>

# Sync dependencies (after pulling changes)
uv sync

# Run with uv (recommended)
uv run uvicorn app.main:app --reload