Add enriched classification output and Todoist dedupe sync

This commit is contained in:
Steve W
2026-04-09 18:14:11 +00:00
parent cb4eb43209
commit a1dcaf9a74
8 changed files with 502 additions and 100 deletions


@@ -1,21 +1,10 @@
# email-classifier
FastAPI service that classifies email using a configurable LLM backend.
## What changed
The classifier no longer hardcodes a single Ollama + OpenAI-compatible endpoint.
It now supports:
- OpenAI-compatible APIs
- Anthropic-compatible APIs
- per-request overrides for provider, model, endpoint, and temperature
- global defaults through environment variables
This makes it suitable for local Ollama, hosted OpenAI-compatible services, and MiniMax's recommended Anthropic-compatible API.
FastAPI service that classifies email using a configurable LLM backend, enriches the output for human review, and can upsert Todoist tasks without creating duplicates.
## Environment configuration
Defaults are loaded from environment variables:
LLM defaults:
```bash
export LLM_PROVIDER=openai
@@ -27,9 +16,7 @@ export LLM_TIMEOUT_SECONDS=60
export LLM_MAX_RETRIES=3
```
### MiniMax example
MiniMax recommends Anthropic-compatible integration.
MiniMax via Anthropic-compatible API:
```bash
export LLM_PROVIDER=anthropic
@@ -38,11 +25,21 @@ export LLM_API_KEY=your_minimax_key
export LLM_MODEL=MiniMax-M2.7
```
Optional Todoist sync:
```bash
export TODOIST_API_KEY=your_todoist_token
export TODOIST_PROJECT_ID=optional_project_id
export EMAIL_CLASSIFIER_DB_PATH=.data/email_classifier.db
```
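As a rough sketch of how these optional variables might be consumed (the dataclass, method names, and default database path here are illustrative assumptions, not the service's actual config code), sync simply stays disabled when no API key is present:

```python
import os
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class TodoistSettings:
    # Hypothetical settings holder; the real service's config code may differ.
    api_key: Optional[str]
    project_id: Optional[str]
    db_path: str

    @classmethod
    def from_env(cls) -> "TodoistSettings":
        return cls(
            api_key=os.environ.get("TODOIST_API_KEY"),
            project_id=os.environ.get("TODOIST_PROJECT_ID"),
            db_path=os.environ.get("EMAIL_CLASSIFIER_DB_PATH", ".data/email_classifier.db"),
        )

    @property
    def enabled(self) -> bool:
        # Todoist sync no-ops when no API key is configured.
        return self.api_key is not None
```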
## API
### POST /classify
Request body:
Backward-compatible top-level response fields are preserved.
Optional request metadata for dedupe and richer sync:
```json
{
@@ -50,16 +47,17 @@ Request body:
"subject": "Can you review this by Friday?",
"body": "Hi Daniel, please review the attached budget proposal."
},
"message_id": "<abc123@example.com>",
"thread_id": "thread-789",
"from_address": "sender@example.com",
"received_at": "2026-04-09T12:55:00Z",
"provider": "anthropic",
"base_url": "https://api.minimax.io/anthropic",
"model": "MiniMax-M2.7",
"temperature": 0.1
"model": "MiniMax-M2.7"
}
```
All override fields are optional. If omitted, the service uses the global env config.
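The override fallback can be sketched as follows; the default values and the helper name are hypothetical assumptions for illustration, not code from the repository:

```python
# Hypothetical defaults; in the service these come from the LLM_* env variables.
DEFAULTS = {
    "provider": "openai",
    "model": "MiniMax-M2.7",
    "base_url": None,
    "temperature": 0.2,
}

def effective_llm_config(request: dict) -> dict:
    """Merge optional per-request overrides over the global defaults."""
    cfg = dict(DEFAULTS)
    for key in ("provider", "model", "base_url", "temperature"):
        # Only an explicitly supplied override replaces the default.
        if request.get(key) is not None:
            cfg[key] = request[key]
    return cfg
```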
Response shape:
Response now includes optional enrichment and Todoist sync info:
```json
{
@@ -68,21 +66,56 @@ Response shape:
"priority": "high",
"task_description": "Review the budget proposal and respond by Friday",
"reasoning": "Direct request with a deadline requires follow-up",
"confidence": 0.91
"confidence": 0.91,
"details": {
"summary": "Budget proposal review requested with Friday deadline.",
"suggested_title": "Review budget proposal and respond by Friday",
"suggested_notes": "Requester asked for feedback on attached budget proposal before Friday.",
"deadline": "Friday",
"people": ["Daniel"],
"organizations": [],
"attachments_referenced": ["budget proposal"],
"next_steps": ["Review attachment", "Reply with feedback"],
"key_points": ["Deadline is Friday"],
"source_signals": ["request", "deadline"],
"dedupe_key": "..."
},
"todoist": {
"status": "created",
"task_id": "1234567890",
"comment_added": false,
"dedupe_match": "none",
"message": null
}
}
```
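Since the service normalizes malformed model output and falls back safely after retries, a minimal sketch of that normalization might look like this (the fallback defaults and function name are illustrative assumptions, not the actual `app/classifier.py` logic):

```python
import json

# Assumed safe defaults matching the documented top-level response fields.
SAFE_FALLBACK = {
    "needs_action": False,
    "priority": "low",
    "task_description": "",
    "reasoning": "Model output could not be parsed; defaulting to no action.",
    "confidence": 0.0,
}

def normalize_model_output(raw: str) -> dict:
    """Coerce raw model text into the response schema, falling back safely."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return dict(SAFE_FALLBACK)
    if not isinstance(data, dict):
        return dict(SAFE_FALLBACK)
    out = dict(SAFE_FALLBACK)
    for key in out:
        # Keep only known fields; missing ones retain their safe defaults.
        if key in data:
            out[key] = data[key]
    return out
```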
## Dedupe behavior
When Todoist sync is enabled and `needs_action=true`, existing tasks are matched in order:
- first by `message_id`
- then by `thread_id`
- finally by a normalized content fingerprint, as a fallback
Behavior:
- no existing task: create Todoist task
- existing task, same classification: do not duplicate, mark `unchanged`
- existing task, changed classification/context: update task in place
- add a Todoist comment only when material context changed
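The match order above can be sketched roughly like this; the store's attribute names and the fingerprint scheme are assumptions for illustration (the real logic lives in `app/sync.py` and `app/dedupe_store.py`):

```python
import hashlib
from typing import Optional, Tuple

class InMemoryDedupeStore:
    """Stand-in for the SQLite-backed store; attribute names are hypothetical."""
    def __init__(self) -> None:
        self.by_message_id: dict = {}
        self.by_thread_id: dict = {}
        self.by_fingerprint: dict = {}

def content_fingerprint(subject: str, body: str) -> str:
    # Normalize whitespace and case before hashing so cosmetic edits
    # still map to the same task.
    normalized = " ".join(f"{subject} {body}".lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_existing_task(
    store: InMemoryDedupeStore,
    message_id: Optional[str],
    thread_id: Optional[str],
    subject: str,
    body: str,
) -> Tuple[Optional[str], str]:
    # Lookup order mirrors the documented dedupe chain.
    if message_id and message_id in store.by_message_id:
        return store.by_message_id[message_id], "message_id"
    if thread_id and thread_id in store.by_thread_id:
        return store.by_thread_id[thread_id], "thread_id"
    fp = content_fingerprint(subject, body)
    if fp in store.by_fingerprint:
        return store.by_fingerprint[fp], "fingerprint"
    return None, "none"
```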
## Architecture
- `app/config.py`: global and per-request LLM settings
- `app/classifier.py`: classification orchestration and Todoist sync handoff
- `app/prompts.py`: richer extraction prompt
- `app/sync.py`: dedupe, task rendering, Todoist upsert logic
- `app/dedupe_store.py`: SQLite-backed mapping store
- `app/todoist.py`: Todoist REST client
- `app/llm_adapters.py`: provider adapters
- `app/classifier.py`: classification orchestration, retries, normalization
- `app/prompts.py`: system prompt
- `app/routers/classify_email.py`: thin API route
- `app/config.py`: LLM settings
## Notes
- OpenAI-compatible providers use the OpenAI SDK.
- Anthropic-compatible providers use the Anthropic SDK.
- Per-request `api_key` is supported, but excluded from response serialization.
- The service normalizes malformed model output and falls back safely after retry exhaustion.
- `/classify` remains backward compatible at the top level.
- New request metadata fields are optional.
- Todoist sync safely no-ops when `TODOIST_API_KEY` is not configured.
- SQLite is used for lightweight, production-safe dedupe tracking.