A conversational AI interface that answers questions using KB content:
How it works:
1. User asks a question in the chat sidebar.
2. Backend retrieves relevant entries via search (FTS5, plus optional semantic search).
3. Retrieved entries are sent as context to the LLM along with the question.
4. The LLM responds with citations to specific entries.
5. Citations render as clickable links to the cited entries.
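The retrieve-then-answer flow above can be sketched as follows. This is a minimal illustration, not the project's actual code: the table schema, the `answer` helper, and the stub LLM callable are all hypothetical, and a real backend would sanitize the user's question before passing it to FTS5 MATCH and would call the llm-abstraction-service instead of a lambda.

```python
import sqlite3

def retrieve_entries(db, question, limit=3):
    # Step 2: FTS5 full-text search over KB entries (semantic search
    # would be layered on top in the real system).
    rows = db.execute(
        "SELECT rowid, title, body FROM entries WHERE entries MATCH ? LIMIT ?",
        (question, limit),
    ).fetchall()
    return [{"id": r[0], "title": r[1], "body": r[2]} for r in rows]

def build_prompt(question, entries):
    # Step 3: number each entry so the LLM can cite it as [1], [2], ...
    context = "\n".join(
        f"[{i + 1}] {e['title']}: {e['body']}" for i, e in enumerate(entries)
    )
    return (
        "Answer using only the context below. Cite entries as [n].\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

def answer(db, question, llm):
    entries = retrieve_entries(db, question)
    reply = llm(build_prompt(question, entries))  # steps 3-4
    # Step 5: map [n] citation markers back to entry ids so the UI
    # can render them as clickable links.
    citations = {f"[{i + 1}]": e["id"] for i, e in enumerate(entries)}
    return {"text": reply, "citations": citations}

# Demo with an in-memory FTS5 table and a stub LLM.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE entries USING fts5(title, body)")
db.execute(
    "INSERT INTO entries VALUES ('Backups', 'Nightly backups run at 02:00 UTC.')"
)
result = answer(db, "backups", lambda prompt: "Backups run nightly [1].")
```

Keeping the citation mapping (`{"[1]": rowid}`) in the API response, rather than asking the LLM for entry ids directly, means the UI can link citations reliably even if the model's wording varies.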
Implementation:
UI:
Depends on: llm-abstraction-service, web-ai-summarize-and-tag (shared /api/ai/ patterns)