- Workflow Engine: n8n backed by a PostgreSQL 16 database.
- Local LLM Host: Ollama configured with NVIDIA GPU passthrough for high-performance local inference.
- Web Interface: Open WebUI for interacting with Ollama models and managing chats.
- Security: Cloudflare Tunnel (cloudflared) to expose services safely without opening firewall ports.
- Maintenance: Watchtower to keep all images updated automatically.
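The stack above can be sketched as a single Docker Compose file. This is a minimal, illustrative layout, not a drop-in config: the service and volume names, the placeholder credentials, and the `.env`-supplied tunnel token are assumptions you would adapt to your own setup.

```yaml
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: n8n
      POSTGRES_PASSWORD: change-me        # assumption: replace with a real secret
      POSTGRES_DB: n8n
    volumes:
      - postgres_data:/var/lib/postgresql/data

  n8n:
    image: docker.n8n.io/n8nio/n8n
    environment:
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: postgres
      DB_POSTGRESDB_DATABASE: n8n
      DB_POSTGRESDB_USER: n8n
      DB_POSTGRESDB_PASSWORD: change-me   # must match the postgres service
    depends_on:
      - postgres
    volumes:
      - n8n_data:/home/node/.n8n

  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia              # GPU passthrough (requires NVIDIA Container Toolkit)
              count: all
              capabilities: [gpu]
    volumes:
      - ollama_data:/root/.ollama

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      OLLAMA_BASE_URL: http://ollama:11434  # reach Ollama over the Compose network
    depends_on:
      - ollama
    volumes:
      - open_webui_data:/app/backend/data

  cloudflared:
    image: cloudflare/cloudflared:latest
    command: tunnel run
    environment:
      TUNNEL_TOKEN: ${CLOUDFLARE_TUNNEL_TOKEN}  # assumption: token provided via .env

  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock  # lets Watchtower manage other containers

volumes:
  postgres_data:
  n8n_data:
  ollama_data:
  open_webui_data:
```

Note that no service publishes a host port: traffic enters only through the Cloudflare Tunnel, which is what keeps the firewall closed, while containers talk to each other by service name on the internal Compose network.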