## Prerequisites
| Dependency | Version | Required |
|---|---|---|
| Node.js | 18+ | Yes |
| PostgreSQL | 14+ | Yes |
| Python | 3.11+ | No (for ML Worker) |
| Redis | 6+ | No (recommended) |
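A quick way to confirm the required versions before continuing (each tool must already be on your PATH):

```bash
node --version    # expect v18 or later
psql --version    # expect 14 or later
python3 --version # 3.11+, only needed if you plan to run the ML Worker
```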
## Install PostgreSQL

**macOS**

```bash
brew install postgresql@16
brew services start postgresql@16
createdb kaireon
```

**Ubuntu/Debian**

```bash
sudo apt install postgresql postgresql-contrib
sudo systemctl start postgresql
sudo -u postgres createdb kaireon
```

**Docker**

```bash
docker run -d --name kaireon-pg \
  -e POSTGRES_DB=kaireon \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=postgres \
  -p 5432:5432 \
  postgres:16
```
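Whichever option you used, a quick connection test confirms the database exists and is reachable (this assumes the default `postgres` user; a Homebrew install typically uses your macOS username instead, with no password):

```bash
psql -h localhost -U postgres -d kaireon -c 'SELECT 1;'
```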
## Install Redis (Optional)

KaireonAI uses Redis for enrichment data caching and API rate limiting. Without it, the platform runs fine but skips these features.

**macOS**

```bash
brew install redis
brew services start redis
```

**Ubuntu/Debian**

```bash
sudo apt install redis-server
sudo systemctl start redis-server
```

**Docker**

```bash
docker run -d --name kaireon-redis -p 6379:6379 redis:7-alpine
```
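A one-line check confirms Redis is answering:

```bash
redis-cli ping   # replies PONG when the server is up
```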
## Clone and install

```bash
git clone https://github.com/kaireonai/platform.git
cd platform
npm install
```
## Create `.env` file
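The docs don't show where `.env` comes from; if the platform repo ships a template the way the ML Worker does, copy it first (filename assumed):

```bash
cp .env.example .env
```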
Edit `.env`:

```
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/kaireon
REDIS_URL=redis://localhost:6379
NEXTAUTH_SECRET=local-dev-secret-change-in-production
```
## Initialize the database

```bash
npx prisma generate     # generate the Prisma client
npx prisma db push      # sync the schema to the database
npx tsx prisma/seed.ts  # seed default data
```
The seed script creates the default tenant and an admin user:

- Email: `admin@kaireonai.com`
- Password: `admin123`
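To browse the seeded tenant and user, Prisma's bundled data browser can help:

```bash
npx prisma studio   # opens a local GUI, by default at http://localhost:5555
```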
## Start the development server
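The command for this step isn't shown above; for a Next.js app the conventional script is:

```bash
npm run dev
```

If the repo uses a different script name, the `scripts` section of `package.json` will have the right one.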
## ML Worker Setup (Optional)

The ML Worker provides scikit-learn-based analysis for AI features. It is optional; without it, all AI features fall back to LLM-based analysis.
### Set up environment

```bash
cd ml-worker
cp .env.example .env
```

Edit `ml-worker/.env` to match your local database:

```
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/kaireon
```
### Install Python dependencies

```bash
pip install -r requirements.txt
```
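The install works as-is, but a virtual environment keeps the worker's dependencies out of the system Python (standard practice, not required by these docs):

```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```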
### Start the ML Worker

```bash
python -m uvicorn app.main:app --host 0.0.0.0 --port 8000
```
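Once uvicorn is up, the worker's `/health` endpoint (the same one Docker polls in the Compose setup) gives a quick liveness check; the response body isn't documented here, so just look for a 200 status:

```bash
curl -i http://localhost:8000/health
```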
### Connect the platform

Add to `platform/.env`:

```
ML_WORKER_URL=http://localhost:8000
```

Restart the Next.js dev server. The AI > Insights page should show “ML Worker Connected”.
## Docker Compose (Full Stack)

Run the entire stack with Docker Compose instead of installing each dependency:

```bash
# Platform only
POSTGRES_PASSWORD=secret NEXTAUTH_SECRET=dev-secret docker compose up -d

# Platform + ML Worker
POSTGRES_PASSWORD=secret NEXTAUTH_SECRET=dev-secret docker compose --profile ml up -d
```
This starts PostgreSQL, PgBouncer, Redis, the API, the background worker, and optionally the ML Worker.
The ML Worker exposes a `/health` endpoint that Docker polls as a container health check. You can check each service's health status with `docker compose ps`.
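To dig into why a worker went unhealthy, tail its logs (the Compose service name is assumed here; check the repo's `docker-compose.yml` for the actual one):

```bash
docker compose ps                  # health status per service
docker compose logs -f ml-worker   # 'ml-worker' is an assumed service name
```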
## Verify Installation

After signing in, check the home dashboard: you should see cards for Decision Flows, Offers, Channels, and more, all at zero counts.
To load demo content, go to Settings → Sample Data and load the Starbucks Offers dataset. This will populate the platform with schemas, offers, channels, models, and creatives.
## Running Tests

```bash
npm test                # run tests in watch mode
npm run test:coverage   # run with coverage report
npm run build           # type-check + production build
```
## Next Steps