Configure an AI-powered assistant that can chat with you and your guests.
The bot is pointed at your LLM provider's /chat/completions endpoint base URL.

How it works
When you (or a guest in autonomous mode) send a message to the AI bot, your server queues the request, calls the LLM provider, and delivers the response as a regular chat message. The bot appears as a contact in your chat list. All API calls happen server-side, so your API key is never exposed to clients.
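To make the flow concrete, here is a minimal server-side sketch. The endpoint base, model name, and the onMessageToBot/deliverChatMessage helpers are illustrative assumptions, not this project's actual API.

```ts
// Sketch of the flow described above: queue the incoming message, call the
// provider's /chat/completions endpoint server-side, deliver the reply as a
// normal chat message. Names and defaults here are assumptions.

type ChatMessage = { from: string; text: string };

const API_BASE = process.env.LLM_API_BASE ?? "https://api.openai.com/v1"; // assumed default base URL
const API_KEY = process.env.LLM_API_KEY ?? ""; // stays on the server; never sent to clients

// Simple in-memory queue so provider calls happen one at a time.
const queue: ChatMessage[] = [];
let draining = false;

export async function onMessageToBot(msg: ChatMessage): Promise<void> {
  queue.push(msg);
  if (!draining) void drainQueue();
}

async function drainQueue(): Promise<void> {
  draining = true;
  while (queue.length > 0) {
    const msg = queue.shift()!;
    const reply = await callProvider(msg.text);
    deliverChatMessage(msg.from, reply); // hand the reply back to the chat pipeline
  }
  draining = false;
}

// Call the provider's /chat/completions endpoint with the server-held key.
async function callProvider(userText: string): Promise<string> {
  const res = await fetch(`${API_BASE}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // assumed model name
      messages: [{ role: "user", content: userText }],
    }),
  });
  const data = await res.json();
  return data.choices?.[0]?.message?.content ?? "(no reply)";
}

// Placeholder: the real server would post this into the guest's chat.
function deliverChatMessage(to: string, text: string): void {
  console.log(`-> ${to}: ${text}`);
}
```

In this sketch the bot's reply is just logged; in practice deliverChatMessage would hand the text to whatever mechanism the server already uses to send regular messages, which is what makes the bot show up as an ordinary contact.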