Replace the narrow `QueryExpansionService` with a general-purpose `LLMService` in `pyrite/services/llm_service.py`.
Interface:
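The exact interface is not spelled out here; a minimal sketch of what a general-purpose service could look like, using the stub backend mentioned under "Completed" (method and field names are assumptions, not the shipped API):

```python
from dataclasses import dataclass


@dataclass
class LLMResponse:
    """Hypothetical response wrapper; field names are illustrative."""
    text: str
    model: str


class LLMService:
    """Sketch of a provider-agnostic LLM service with a stub backend."""

    def __init__(self, provider: str = "stub", model: str = "stub-model"):
        self.provider = provider
        self.model = model

    def complete(self, prompt: str, max_tokens: int = 1024) -> LLMResponse:
        # Only the stub backend is wired in this sketch; real backends
        # would dispatch to the Anthropic or OpenAI SDK here.
        if self.provider == "stub":
            return LLMResponse(text=f"[stub] {prompt}", model=self.model)
        raise NotImplementedError(f"provider {self.provider!r} not wired in this sketch")
```

A stub backend like this is what lets the 19 tests run with mocked SDK calls and no API key.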
Provider support via the Anthropic and OpenAI SDKs.
Config: settings in `~/.pyrite/config.yaml` (`ai_provider`, `ai_model`, `ai_api_base`); API keys are read from environment variables.
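The key names come from the spec above; the values below are illustrative only:

```yaml
# ~/.pyrite/config.yaml (example values, not defaults)
ai_provider: anthropic          # or: openai, stub
ai_model: your-model-id         # provider-specific model identifier
ai_api_base: https://example.invalid/v1   # optional override for OpenAI-compatible gateways
```

API keys stay out of the file; the Anthropic and OpenAI SDKs read them from the standard `ANTHROPIC_API_KEY` and `OPENAI_API_KEY` environment variables.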
Migrate `QueryExpansionService` to use `LLMService` internally. Add a `GET /api/ai/status` endpoint.
See ADR-0007 for full rationale. This is the foundation all AI features build on.
Completed
Implemented in Wave 2: `LLMService` with Anthropic, OpenAI (plus OpenRouter/Ollama via `base_url`), and stub backends; lazy SDK imports with helpful error messages; the `GET /api/ai/status` endpoint; and 19 tests with mocked SDK calls.