# LLM Service
Provider-agnostic LLM abstraction supporting Anthropic (Claude), OpenAI (GPT), OpenRouter (200+ models), Ollama (local), and a stub/no-op provider. SDKs are optional dependencies imported lazily.
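Because the SDKs are optional, imports must not fail at module load time. A minimal sketch of the lazy-import pattern (`load_optional` is an illustrative helper, not the service's actual API):

```python
import importlib


def load_optional(module_name: str):
    """Import an optional SDK lazily; return None if it is not installed."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        return None


# The service can fall back to the stub provider when the SDK is missing.
anthropic_sdk = load_optional("anthropic")
```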
## API
```python
class LLMService:
    def __init__(self, settings: Settings) -> None

    def status(self) -> dict[str, Any]  # {"configured": bool, "provider": str, "model": str}

    async def complete(self, prompt: str, system: str | None = None, max_tokens: int = 1024) -> str

    async def stream(self, prompt: str, system: str | None = None) -> AsyncIterator[str]

    async def embed(self, texts: list[str]) -> list[list[float]]
```
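For concreteness, here is a runnable sketch of a caller exercising this contract against a hand-rolled stand-in that mimics the stub/no-op provider (`StubLLMService` is illustrative, not the real class):

```python
from __future__ import annotations

import asyncio
from typing import Any, AsyncIterator


class StubLLMService:
    """Illustrative stand-in mirroring the LLMService contract above."""

    def status(self) -> dict[str, Any]:
        return {"configured": False, "provider": "stub", "model": ""}

    async def complete(self, prompt: str, system: str | None = None,
                       max_tokens: int = 1024) -> str:
        return ""  # the stub provider returns empty strings

    async def stream(self, prompt: str, system: str | None = None) -> AsyncIterator[str]:
        for chunk in ():  # the stub yields nothing
            yield chunk

    async def embed(self, texts: list[str]) -> list[list[float]]:
        return [[] for _ in texts]  # one empty vector per input text


async def demo():
    svc = StubLLMService()
    chunks = [c async for c in svc.stream("hi")]
    return svc.status(), await svc.complete("hi"), chunks, await svc.embed(["a", "b"])


status, completion, chunks, vectors = asyncio.run(demo())
```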
## Provider Mapping
| Provider | SDK | How | Embeddings |
|----------|-----|-----|------------|
| `anthropic` | `anthropic` | Direct SDK | Not supported (returns empty) |
| `openai` | `openai` | Direct SDK | `text-embedding-3-small` |
| OpenRouter | `openai` | `base_url="https://openrouter.ai/api/v1"` | Depends on model |
| Ollama | `openai` | `base_url="http://localhost:11434/v1"` | Depends on model |
| `stub`/`none` | — | No-op, returns empty strings | Returns empty vectors |
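Since three of the providers reuse the `openai` SDK with different base URLs, the routing can be sketched as a small mapping function (`client_config` is a hypothetical helper; the `"ollama"` placeholder key reflects the common convention that Ollama's OpenAI-compatible endpoint accepts any key):

```python
from __future__ import annotations


def client_config(provider: str, api_key: str = "") -> dict[str, str] | None:
    """Map a provider name to OpenAI-SDK client kwargs (illustrative sketch)."""
    if provider == "openrouter":
        return {"base_url": "https://openrouter.ai/api/v1", "api_key": api_key}
    if provider == "ollama":
        # Ollama's OpenAI-compatible endpoint does not check the key.
        return {"base_url": "http://localhost:11434/v1", "api_key": api_key or "ollama"}
    if provider == "openai":
        return {"api_key": api_key}
    return None  # anthropic uses its own SDK; stub/none needs no client
```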
## Configuration
Three layers, checked in order; DB settings take priority over the config file:

1. DB settings (via Settings page UI): `ai.provider`, `ai.apiKey`, `ai.model`, `ai.baseUrl`
2. Config file (`~/.pyrite/config.yaml`): `ai_provider`, `ai_api_key`, `ai_model`, `ai_api_base`
3. Environment variables: `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, `OPENAI_API_BASE`
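The layering above amounts to a first-non-empty-wins lookup; a sketch with a hypothetical `resolve_setting` helper (the real resolution logic may differ):

```python
from __future__ import annotations

import os


def resolve_setting(db: dict, cfg: dict, db_key: str, cfg_key: str,
                    env_key: str | None = None) -> str | None:
    """First non-empty value wins: DB settings, then config file, then env."""
    value = db.get(db_key) or cfg.get(cfg_key)
    if not value and env_key:
        value = os.environ.get(env_key)
    return value or None


# The DB value is empty here, so the config-file value wins.
api_key = resolve_setting(
    {"ai.apiKey": ""},                 # DB settings
    {"ai_api_key": "sk-from-config"},  # ~/.pyrite/config.yaml
    "ai.apiKey", "ai_api_key", "OPENAI_API_KEY",
)
```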
## DI Integration
The REST API exposes `get_llm_service()` in `api.py`, which route handlers inject as a FastAPI dependency:

```python
from ..api import get_llm_service

llm: LLMService = Depends(get_llm_service)
```
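Dependency providers like this typically memoize a single service instance so every request shares one client. A sketch of that pattern, assuming `get_llm_service` caches its result (the `Settings` and `LLMService` classes below are placeholders, not the real implementations):

```python
from functools import lru_cache


class Settings:  # placeholder for the real Settings object
    provider = "stub"


class LLMService:  # placeholder mirroring LLMService(settings)
    def __init__(self, settings: Settings) -> None:
        self.settings = settings


@lru_cache(maxsize=1)
def get_llm_service() -> LLMService:
    """Construct the service once; Depends() then reuses it across requests."""
    return LLMService(Settings())


first = get_llm_service()
second = get_llm_service()
```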