feat: initial release - complete web chat interface with image generation
...
Implemented a modern web-based chat interface for multiple LLM backends
(LM Studio, Ollama, OpenRouter) with AI image generation via ComfyUI.

Features:
- Multi-server support with presets (Ollama, LM Studio, OpenRouter, Custom)
- API key management for cloud services
- Chat with streaming responses
- Image generation via /genimg command
- Lightbox for image zoom
- Code syntax highlighting with copy buttons
- LaTeX math rendering
- Multiple chat conversations with persistence
- Responsive mobile-friendly design
- Purple theme with smooth animations
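The streaming chat feature above can be sketched on the client side of the API. This is a minimal illustration, assuming Ollama's `/api/chat` endpoint, which streams newline-delimited JSON where each line carries a partial assistant message; the model name is a placeholder:

```python
import json

def extract_delta(ndjson_line: str) -> str:
    """Pull the partial assistant text out of one /api/chat stream line."""
    chunk = json.loads(ndjson_line)
    return chunk.get("message", {}).get("content", "")

def stream_chat(messages, model="llama3", base="http://localhost:11434"):
    """Yield response text deltas as they arrive (uses the `requests` dep)."""
    import requests  # third-party; listed under Technical below
    payload = {"model": model, "messages": messages, "stream": True}
    with requests.post(f"{base}/api/chat", json=payload, stream=True) as r:
        r.raise_for_status()
        for line in r.iter_lines():
            if line:  # skip keep-alive blank lines
                yield extract_delta(line.decode("utf-8"))
```

Appending each yielded delta to the message bubble as it arrives is what produces the typing effect in the UI.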

Backend:
- FastAPI MCP server (port 8085) for ComfyUI integration
- Seed randomization so identical prompts re-render instead of returning ComfyUI's cached image
- Comprehensive logging
- Start/stop scripts (start.sh, stop.sh)
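The seed-randomization step can be sketched as a small transform over a ComfyUI API-format workflow dict, where sampler nodes carry a `seed` field under `inputs`; the exact node layout here is illustrative:

```python
import random

def randomize_seeds(workflow: dict) -> dict:
    """Give every node with a `seed` input a fresh random value so ComfyUI
    treats the prompt as new work rather than serving a cached result."""
    for node in workflow.values():
        inputs = node.get("inputs", {})
        if "seed" in inputs:
            inputs["seed"] = random.randint(0, 2**32 - 1)
    return workflow
```

Run over the workflow JSON just before POSTing it to ComfyUI's prompt endpoint, so two identical /genimg requests still produce distinct images.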

Technical:
- Frontend: vanilla JS, HTML5, CSS3
- Backend: Python 3.8+, FastAPI, requests
- CORS configured for local development
- Works with local servers (localhost:11434, :1234) and cloud APIs
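The local-development CORS setup can be sketched with FastAPI's `CORSMiddleware`; the allowed origin below assumes the frontend is served on port 8084 as described above:

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Let the browser-hosted frontend (port 8084) call this API cross-origin.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:8084"],  # dev-only; tighten for production
    allow_methods=["*"],
    allow_headers=["*"],
)
```

Without this, the browser blocks the frontend's fetch calls to the MCP server on port 8085 as cross-origin requests.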

Setup:
1. Start ComfyUI on port 8188
2. Run ./start.sh
3. Open http://localhost:8084
4. Connect to your LLM and start chatting

Docs: README.md, PROGRESS.md