
Connectors API

Connector status: 80 connector types are registered in the UI. Two — amazon_kinesis and braze — ship as coming-soon: they expose create/edit forms (and a working Test Connection probe for Amazon Kinesis), but pipeline runs that source from them no-op (the executor logs a message and returns zero rows). The 26 W16 expansion entries documented on Connectors Expanded are also coming-soon. Every other registered type is production-ready.

GET /api/v1/connectors

List all connectors for the current tenant. Supports cursor-based pagination.

Query Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| limit | integer | 20 | Max results per page (max 100) |
| cursor | string | (none) | Cursor for pagination (ID of the last item from the previous page) |

Response

{
  "data": [
    {
      "id": "conn_001",
      "name": "Production Snowflake",
      "type": "snowflake",
      "description": "Main data warehouse",
      "config": { "account": "xy12345.us-east-1", "warehouse": "COMPUTE_WH", "database": "ANALYTICS" },
      "authMethod": "access_key",
      "status": "active",
      "lastTestedAt": "2026-03-15T10:30:00.000Z",
      "lastError": null,
      "createdAt": "2026-01-10T08:00:00.000Z",
      "updatedAt": "2026-03-15T10:30:00.000Z"
    }
  ],
  "pagination": {
    "total": 12,
    "hasMore": false,
    "limit": 20,
    "cursor": null
  }
}
The authConfig field is never returned in list or detail responses to prevent secret leakage.
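The cursor loop above can be sketched client-side. This is a minimal sketch, assuming a fetch_page callable that wraps your HTTP client and returns the parsed JSON shown above; fetch_page and list_all_connectors are illustrative names, not part of the API.

```python
def list_all_connectors(fetch_page, limit=20):
    """Follow the pagination cursor until hasMore is false.

    fetch_page(limit=..., cursor=...) is a stand-in for an HTTP call to
    GET /api/v1/connectors that returns the parsed JSON response body.
    """
    connectors, cursor = [], None
    while True:
        page = fetch_page(limit=limit, cursor=cursor)
        connectors.extend(page["data"])
        if not page["pagination"]["hasMore"]:
            return connectors
        # The next cursor is the ID of the last item on the previous page.
        cursor = page["pagination"]["cursor"]
```

Passing cursor=None on the first call simply omits the parameter, so the loop works for single-page and multi-page tenants alike.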

POST /api/v1/connectors

Create a new connector.

Request Body

| Field | Type | Required | Description |
|---|---|---|---|
| name | string | Yes | Unique connector name |
| type | string | Yes | Connector type (see supported types below) |
| description | string | No | Human-readable description |
| config | object | No | Type-specific configuration (host, port, bucket, etc.). Also accepts connectionConfig as an alias. |
| authMethod | string | No | Authentication method. Default: "access_key" |
| authConfig | object | No | Credentials (encrypted at rest, never returned in responses) |
The config field contains type-specific settings, such as bucket, region, and prefix for S3, or account, warehouse, and database for Snowflake. You can also send connectionConfig as an alias; the API accepts both names and merges them.
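The alias merge can be pictured with a small sketch. The docs say both names are accepted and merged but do not state which wins on a key conflict, so the precedence below (config over connectionConfig) is an assumption, not documented behavior.

```python
def merge_config(body):
    """Merge the connectionConfig alias and config into one settings dict.

    Assumption: on a key conflict, config takes precedence over the
    connectionConfig alias. Verify against the live API before relying on it.
    """
    merged = dict(body.get("connectionConfig") or {})
    merged.update(body.get("config") or {})
    return merged
```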

Supported Connector Types

| Type | Status | Required Config Fields |
|---|---|---|
| postgresql, mysql, redshift | Ready | host, port, database |
| snowflake | Ready | account, warehouse, database, sourceTable; rowLimit (optional) |
| bigquery | Ready | project, dataset, sourceTable; rowLimit (optional) |
| mongodb | Ready | host or connectionString |
| kafka, confluent_kafka | Ready (batch polling) | bootstrapServers |
| aws_s3 | Ready | bucket, region |
| gcs | Ready | bucket, project |
| azure_blob | Ready | container, storageAccount |
| sftp | Ready | host, port |
| rest_api, webhook | Ready | url |
| salesforce, hubspot | Ready | instanceUrl or apiKey |
| databricks | Ready | host, httpPath |
| segment, shopify, stripe, mailchimp | Ready | See /platform/data for fields |
| amazon_kinesis | Coming soon (connection test works) | streamName, region |
| braze | Coming soon | See /platform/data for fields |
Kafka is batch polling, not true streaming. Each pipeline run opens a consumer, reads up to maxMessages records (default 1000) with a configurable wait timeout (default 15 seconds), commits offsets, and closes. True long-lived streaming requires a persistent worker that is not yet implemented.
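The per-run lifecycle described above (open, poll up to maxMessages within the wait timeout, commit, close) can be sketched as follows. The consumer object here is a stand-in with poll/commit/close methods, not a real Kafka client, and run_kafka_batch is an illustrative name.

```python
import time

def run_kafka_batch(consumer, max_messages=1000, wait_timeout_s=15):
    """One pipeline run against a Kafka source: bounded batch poll.

    `consumer` is assumed to expose poll(timeout_s) -> list of records,
    commit(), and close(); it is a placeholder, not a real client API.
    """
    records, deadline = [], time.monotonic() + wait_timeout_s
    while len(records) < max_messages and time.monotonic() < deadline:
        batch = consumer.poll(timeout_s=1)
        if not batch:
            continue
        # Never exceed the per-run cap, even mid-batch.
        records.extend(batch[: max_messages - len(records)])
    consumer.commit()  # persist offsets so the next run resumes here
    consumer.close()
    return records
```

Because offsets are committed at the end of each run, consecutive runs walk forward through the topic rather than re-reading the same records.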
Snowflake and BigQuery row limits. Both connectors accept a sourceTable (required) and rowLimit (optional). The executor issues SELECT * FROM <sourceTable> LIMIT <rowLimit>. The default rowLimit is 100,000 rows (demo-safe). Set to 0 to remove the cap — only do this once you have sized the target database and pipeline run budget to handle full-table reads.
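The query the executor issues can be sketched directly from the rules above. build_source_query is an illustrative helper name; the 100,000-row default and the 0-means-uncapped convention are taken from the paragraph above.

```python
DEFAULT_ROW_LIMIT = 100_000  # demo-safe default per the docs

def build_source_query(source_table, row_limit=None):
    """Build the read query for a Snowflake or BigQuery source.

    row_limit=None -> apply the default cap; row_limit=0 -> no LIMIT clause
    (full-table read). source_table is required.
    """
    if row_limit is None:
        row_limit = DEFAULT_ROW_LIMIT
    if row_limit == 0:
        return f"SELECT * FROM {source_table}"
    return f"SELECT * FROM {source_table} LIMIT {row_limit}"
```

Note that with row_limit=0 the LIMIT clause is dropped entirely, which is why the docs warn about sizing the target before removing the cap.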

Example

curl -X POST https://playground.kaireonai.com/api/v1/connectors \
  -H "Content-Type: application/json" \
  -H "X-Tenant-Id: my-tenant" \
  -d '{
    "name": "Customer Data Warehouse",
    "type": "snowflake",
    "config": {
      "account": "xy12345.us-east-1",
      "warehouse": "COMPUTE_WH",
      "database": "ANALYTICS"
    },
    "authMethod": "access_key",
    "authConfig": {
      "username": "svc_kaireon",
      "password": "secret"
    }
  }'
Response: 201 Created with the connector object (excluding authConfig).

PUT /api/v1/connectors

Update an existing connector. Only provided fields are updated.

Request Body

| Field | Type | Required | Description |
|---|---|---|---|
| id | string | Yes | Connector ID |
| name | string | No | Updated name |
| type | string | No | Updated type |
| description | string | No | Updated description |
| config | object | No | Updated configuration |
| authMethod | string | No | Updated auth method |
| authConfig | object | No | Updated credentials (re-encrypted) |
| status | string | No | Updated status |
Response: 200 OK with the updated connector (excluding authConfig).
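Since only provided fields are updated, a client should build the PUT body from just the fields that changed. A minimal sketch, assuming the field list above; build_update_body is an illustrative helper, not part of any SDK.

```python
def build_update_body(connector_id, **changes):
    """Build a PUT /api/v1/connectors body: id plus only the changed fields.

    Rejects field names not listed in the request-body table, so typos
    fail loudly client-side instead of being silently ignored.
    """
    allowed = {"name", "type", "description", "config",
               "authMethod", "authConfig", "status"}
    unknown = set(changes) - allowed
    if unknown:
        raise ValueError(f"unknown fields: {sorted(unknown)}")
    return {"id": connector_id, **changes}
```

Omitting a field leaves it untouched on the server, so there is no need to re-send the full connector object on every update.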

DELETE /api/v1/connectors

Delete a connector by ID.

Query Parameters

| Parameter | Type | Required | Description |
|---|---|---|---|
| id | string | Yes | Connector ID to delete |
Response: 204 No Content

POST /api/v1/connectors/test

Test a connector’s connection by performing a real probe (TCP, HTTP, or SDK-specific check). Rate limited to 100 requests per 60 seconds.

Request Body

| Field | Type | Required | Description |
|---|---|---|---|
| id | string | Yes | ID of the connector to test |

Response

{
  "id": "conn_001",
  "name": "Production Snowflake",
  "type": "snowflake",
  "status": "active",
  "lastTestedAt": "2026-03-16T14:30:00.000Z",
  "lastError": null,
  "authConfig": { "username": "sv***vc", "password": "****" }
}
The test endpoint:
  • Validates required configuration fields for the connector type
  • Performs a real connection probe (TCP for databases, HTTP HEAD for REST/webhook, bucket checks for cloud storage, SDK-specific probes for Databricks and Amazon Kinesis)
  • Updates the connector’s status to "active" (success) or "error" (failure)
  • Uses a circuit breaker to prevent hammering failed connectors
  • Includes SSRF protection (blocks private IPs, validates DNS resolution)
  • Returns masked authConfig values (first 2 + last 2 characters visible)
Connection testing for the coming-soon braze connector reports a generic failure until a dedicated probe is implemented. Amazon Kinesis already has a working test probe.
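The masking rule (first 2 + last 2 characters visible) can be sketched as below. The cutoff for fully masking short values is an assumption on my part, as is the fixed "****" the example shows for passwords; this sketch applies the stated first-2/last-2 rule uniformly.

```python
def mask_secret(value, keep=2):
    """Mask a credential, leaving the first and last `keep` chars visible.

    Assumption: values too short to mask meaningfully (len <= 2*keep)
    are replaced entirely with asterisks; the docs do not specify this.
    """
    if len(value) <= 2 * keep:
        return "*" * len(value)
    return value[:keep] + "*" * (len(value) - 2 * keep) + value[-keep:]
```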

Error Responses

| Status | Cause |
|---|---|
| 400 | Missing id or invalid JSON |
| 404 | Connector not found |
| 429 | Rate limit exceeded |
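Because the test endpoint is rate limited (100 requests per 60 seconds), clients hitting 429 should back off and retry rather than fail outright. A minimal sketch, assuming do_request is your own callable returning (status, body); the backoff schedule is illustrative, and the API is not documented to send a Retry-After header.

```python
import time

def call_with_retry(do_request, max_retries=3, backoff_s=1.0, sleep=time.sleep):
    """Retry a request on 429 with exponential backoff.

    do_request() -> (status_code, body) is a stand-in for the HTTP call.
    Non-429 responses (including other errors) are returned immediately.
    """
    for attempt in range(max_retries + 1):
        status, body = do_request()
        if status != 429 or attempt == max_retries:
            return status, body
        sleep(backoff_s * (2 ** attempt))  # 1s, 2s, 4s, ...
```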

Roles

| Endpoint | Allowed Roles |
|---|---|
| GET /connectors | admin, editor, viewer |
| POST /connectors | admin, editor |
| PUT /connectors | admin, editor |
| DELETE /connectors | admin, editor |
| POST /connectors/test | admin, editor |
See also: Data Platform