Version: 0.3

OpenAI API

Continuum seamlessly integrates with a subset of the OpenAI Chat API, enabling secure, encrypted interactions with AI models.

Chat

Completions

POST /v1/chat/completions

This endpoint generates a response to an encrypted prompt.

Request body

  • messages string: The encrypted prompt for which a response is generated.
  • Additional parameters: These mirror the OpenAI API and are supported based on the model server's capabilities. However, options requiring internet access, such as image_url, aren't supported due to the sandboxed environment.
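
For illustration, a request body combining the encrypted payload with standard OpenAI sampling parameters could look like the sketch below. The temperature and max_tokens fields are assumptions for illustration only; whether they're honored depends on the deployed model server, and this sketch assumes they're sent in plaintext alongside the encrypted messages field.

{
  "model": "/model",
  "messages": "<encrypted-payload>",
  "temperature": 0.7,
  "max_tokens": 256
}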

Returns

The endpoint returns a chat completion or chat completion chunk object with the following modification:

  • choices: The choices field is returned as an encrypted payload instead of a plaintext array of completion choices.

Other parameters remain consistent with the OpenAI API specifications.

Example request

curl https://api.ai.confidential.cloud/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "/model",
    "messages": "<encrypted-payload>"
  }'

Example response

{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "/model",
  "system_fingerprint": "fp_44709d6fcb",
  "choices": "<encrypted-response>",
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}

Models

List models

GET /v1/models

This endpoint lists all currently available models.

Returns

The response is a list of model objects. Since the information retrieved from this endpoint isn't confidential, no encryption is applied to the response.
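
Assuming the same base URL as in the chat completions example, a request to this endpoint could look like the sketch below. The response follows the OpenAI list format; the field values shown are illustrative only.

Example request

curl https://api.ai.confidential.cloud/v1/models

Example response

{
  "object": "list",
  "data": [
    {
      "id": "/model",
      "object": "model",
      "created": 1677652288,
      "owned_by": "system"
    }
  ]
}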

For detailed information, refer to the OpenAI API documentation.