
Prerequisites

Dependency    Version   Required
Node.js       18+       Yes
PostgreSQL    14+       Yes
Python        3.11+     No (for ML Worker)
Redis         6+        No (recommended)
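Before installing anything, you can check what's already on your machine. This is a quick sketch; any of these tools may legitimately be absent at this point:

```shell
# Print each prerequisite's version, or a note if it isn't installed yet
node --version 2>/dev/null || echo "Node.js not found"
psql --version 2>/dev/null || echo "PostgreSQL not found"
python3 --version 2>/dev/null || echo "Python not found (only needed for the ML Worker)"
redis-cli --version 2>/dev/null || echo "Redis not found (optional)"
```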

Install PostgreSQL

brew install postgresql@16
brew services start postgresql@16
createdb kaireon

Install Redis (Optional)

KaireonAI uses Redis for enrichment data caching and API rate limiting. Without it, the platform runs fine but skips these features.
brew install redis
brew services start redis
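To confirm Redis is actually reachable, ping it with the standard redis-cli check; PONG means it's up:

```shell
# PONG means Redis is running; otherwise the platform will just skip caching and rate limiting
redis-cli ping 2>/dev/null || echo "Redis not running (optional)"
```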

Platform Setup

1. Clone and install

git clone https://github.com/kaireonai/platform.git
cd platform
npm install

2. Create .env file

cp .env.example .env
Edit .env:
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/kaireon
REDIS_URL=redis://localhost:6379
NEXTAUTH_SECRET=local-dev-secret-change-in-production
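The placeholder NEXTAUTH_SECRET is fine for local development, but for anything shared you can generate a random value and paste it into .env (this assumes openssl is installed, which it is on macOS and most Linux distributions):

```shell
# Generate a random 32-byte secret, base64-encoded, for NEXTAUTH_SECRET
openssl rand -base64 32
```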

3. Initialize the database

npx prisma generate
npx prisma db push
npx tsx prisma/seed.ts
The seed script creates the default tenant and an admin user:
  • Email: admin@kaireonai.com
  • Password: admin123

4. Start the development server

npm run dev
Open http://localhost:3000 and sign in with the admin credentials.
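If the page doesn't load in your browser, a quick curl check tells you whether the dev server is actually listening on port 3000:

```shell
# Prints "Server up" if something is answering on port 3000
curl -fsS -o /dev/null http://localhost:3000 2>/dev/null && echo "Server up" || echo "Server not reachable"
```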

ML Worker Setup (Optional)

The ML Worker provides scikit-learn-based analysis for AI features. It’s optional — all AI features fall back to LLM-based analysis without it.

1. Set up environment

cd ml-worker
cp .env.example .env
Edit ml-worker/.env to match your local database:
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/kaireon

2. Install Python dependencies

pip install -r requirements.txt
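If you'd rather keep the worker's dependencies isolated from your system Python, create a virtual environment first. This is standard Python practice, not a KaireonAI requirement:

```shell
# Optional: isolate the ML Worker's dependencies in a virtual environment
python3 -m venv .venv 2>/dev/null || echo "python3 (3.11+) with the venv module is required"
# Then activate it and install:  . .venv/bin/activate && pip install -r requirements.txt
```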

3. Start the ML Worker

python -m uvicorn app.main:app --host 0.0.0.0 --port 8000

4. Connect the platform

Add to platform/.env:
ML_WORKER_URL=http://localhost:8000
Restart the Next.js dev server. The AI > Insights page should show “ML Worker Connected”.
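Before restarting, you can hit the worker's /health endpoint directly to confirm it's responding:

```shell
# Should print a success response from the ML Worker's /health endpoint
curl -fsS http://localhost:8000/health 2>/dev/null || echo "ML Worker not reachable"
```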

Docker Compose (Full Stack)

Run the entire stack with Docker Compose instead of installing each dependency:
# Platform only
POSTGRES_PASSWORD=secret NEXTAUTH_SECRET=dev-secret docker compose up -d

# Platform + ML Worker
POSTGRES_PASSWORD=secret NEXTAUTH_SECRET=dev-secret docker compose --profile ml up -d
This starts PostgreSQL, PgBouncer, Redis, the API, the background worker, and optionally the ML Worker.
The ML Worker includes a health check (/health endpoint) that Docker monitors automatically. If the worker becomes unhealthy, Docker will restart it. You can check its status with docker compose ps.
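For reference, a Compose healthcheck of this kind typically looks like the sketch below. This is a hypothetical fragment, not the project's actual file; the interval, timeout, and retry values shown here are assumptions, and the repo's docker-compose.yml is authoritative:

```yaml
# Hypothetical sketch of an ML Worker service with a /health healthcheck
ml-worker:
  build: ./ml-worker
  healthcheck:
    test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
    interval: 30s   # assumed value
    timeout: 5s     # assumed value
    retries: 3      # assumed value
```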

Verify Installation

After signing in, check the home dashboard: you should see cards for Decision Flows, Offers, Channels, and other entities, all at zero counts. To load demo content, go to Settings → Sample Data and load the Starbucks Offers dataset, which populates the platform with schemas, offers, channels, models, and creatives.

Running Tests

npm test              # Run tests in watch mode
npm run test:coverage # Run with coverage report
npm run build         # Type-check + production build

Next Steps