## Prerequisites
This guide assumes you have completed the Quickstart and loaded the sample dataset, so the AI assistant has entities to work with. If you haven’t yet, follow the Sample Data guide first.
## Open the AI Panel
Click the sparkle icon in the top navigation bar to open the AI assistant panel, or use the keyboard shortcut:

- Mac: `Cmd + I`
- Windows / Linux: `Ctrl + I`
## Try These Conversations
Work through the examples below to see what the AI assistant can do.

### Ask about your offers
Type something like:

> How many offers do I have?

The AI calls the `listOffers` tool, returns the total count, and lists each offer by name with its category and status. This is a read operation, so it executes immediately with no confirmation needed.

### Check scoring models
Type something like:

> List my scoring models

The AI calls `listModels` and returns each model’s name, type (propensity, uplift, rule-based), and performance metrics such as AUC. Use this to quickly audit your model inventory.

### Detect policy conflicts
Type something like:

> Are there any conflicts in my contact policies?

The AI calls `analyzePolicyConflicts` to scan your contact policies for contradictions, for example one rule suppressing a channel while another requires it. The response highlights any conflicts with severity and suggested fixes.

### Create a new channel
Type something like:

> Create a new SMS channel

This is a mutation, so the AI shows a preview of the channel it will create and asks you to approve or reject it. Click **Approve** to proceed; the assistant then calls `createChannel` and confirms the result.

All write operations (create, update, delete) follow this guided autonomy pattern: you always see what will change before it happens.
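The guided autonomy pattern above can be pictured as a small dispatcher. This is an illustrative sketch, not the platform’s actual implementation: the `Tool` shape, the `readOnly` flag, and the approval callback are assumptions made for the example.

```typescript
// Illustrative sketch of guided autonomy. The Tool shape, readOnly flag,
// and approval callback are assumptions, not the platform's real API.

type Tool = {
  name: string;
  readOnly: boolean;
  run: (args: Record<string, unknown>) => string;
};

// The approval callback stands in for the Approve/Reject prompt in the UI.
type Approval = (preview: string) => boolean;

function dispatch(tool: Tool, args: Record<string, unknown>, approve: Approval): string {
  // Read operations execute immediately, with no confirmation needed.
  if (tool.readOnly) return tool.run(args);

  // Mutations show a preview and run only after the user approves.
  const preview = `About to call ${tool.name} with ${JSON.stringify(args)}`;
  if (!approve(preview)) return "Rejected: no changes were made.";
  return tool.run(args);
}
```

Under this model, a read tool like `listOffers` runs as soon as the assistant requests it, while a mutation like `createChannel` blocks on the approval prompt.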
## How It Works
The AI assistant is powered by a tool-use architecture that connects natural language to platform operations.

- **130+ MCP-aligned tools**, organized by category: studio, data, algorithms, intelligence, and docs. Every entity in the platform can be created, read, updated, or deleted through conversation.
- **Guided autonomy**: Read operations execute immediately and return results. Mutations (create, update, delete) show a preview with an approve/reject prompt so you stay in control.
- **Context-aware routing**: The AI understands your data model and routes queries to the right tool automatically. Ask “show me my pipelines” and it calls `listPipelines`; ask “explain this flow” and it calls `getDecisionFlow`.
- **Multi-provider support**: The assistant supports Google Gemini (default), Anthropic Claude, OpenAI, Amazon Bedrock, and Ollama for local models. Configure your preferred provider in Settings > AI Configuration.
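Context-aware routing can be pictured as matching the entity mentioned in a query against a registry of tools. The sketch below is hypothetical: only `listOffers`, `listPipelines`, and `getDecisionFlow` appear in this guide, and the other tool names are assumed for illustration.

```typescript
// Hypothetical routing sketch. listOffers, listPipelines, and getDecisionFlow
// are tool names from this guide; the other registry entries are assumed.

const toolRegistry: Record<string, { list: string; get: string }> = {
  offer: { list: "listOffers", get: "getOffer" },              // getOffer: assumed
  pipeline: { list: "listPipelines", get: "getPipeline" },     // getPipeline: assumed
  flow: { list: "listDecisionFlows", get: "getDecisionFlow" }, // listDecisionFlows: assumed
};

function routeQuery(query: string): string | undefined {
  const q = query.toLowerCase();
  for (const [entity, tools] of Object.entries(toolRegistry)) {
    if (!q.includes(entity)) continue;
    // Browsing phrases go to the list tool; anything else inspects one entity.
    return /\b(show|list|how many)\b/.test(q) ? tools.list : tools.get;
  }
  return undefined; // no recognized entity: the assistant would ask for clarification
}
```

With this registry, “show me my pipelines” routes to `listPipelines`, while “explain this flow” routes to `getDecisionFlow`, matching the behavior described above.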
## What’s Next
- **AI Assistant Docs**: Full reference for all assistant capabilities, tool categories, and conversation patterns.
- **AI Configuration**: Set your preferred AI provider, API keys, and model parameters.
- **MCP Integration**: Connect KaireonAI tools to external AI IDEs like Cursor, Windsurf, and Claude Code.