Encrypted OpenAI API
Continuum offers a subset of the OpenAI Chat API, enabling secure, encrypted interactions with AI models. Always verify the deployment before sending your encrypted prompts to the API. Alternatively, you can rely on the continuum-proxy to perform this verification for you.
Below you will find details on how to encrypt your data before sending it to the Continuum API.
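For example, if you run the continuum-proxy locally, it verifies the deployment and encrypts your prompts transparently, so you can send regular plaintext OpenAI-style requests to it. A minimal sketch, assuming the proxy listens on localhost port 8080 (adjust the address and port to your proxy configuration):
# The prompt is sent in plaintext to the local proxy, which encrypts it before forwarding.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "<model-name>",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'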
Chat
Completions
POST /v1/chat/completions
This endpoint generates a response to an encrypted prompt.
Request body
- model (string): The offered LLM, as listed by the models endpoint below.
- messages (string): The encrypted prompt for which a response is generated.
- Additional parameters: These mirror the OpenAI API and are supported based on the model server's capabilities. However, options requiring internet access, such as image_url, aren't supported due to the sandboxed environment.
Returns
The response is a chat completion or chat completion chunk object with the following modification:

- choices (string): The encrypted response generated by the model.

All other fields remain consistent with the OpenAI API specification.
Default
Example request
curl https://api.ai.confidential.cloud/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "<model-name>",
    "messages": "<encrypted-payload>"
  }'
Example response
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "/model",
  "system_fingerprint": "fp_44709d6fcb",
  "choices": "<encrypted-response>",
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}
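When calling the API directly (without the continuum-proxy), the choices field arrives as an encrypted string that you must decrypt on the client side. A minimal sketch for extracting the field from the response, assuming jq is available:
curl -s https://api.ai.confidential.cloud/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "<model-name>", "messages": "<encrypted-payload>"}' \
  | jq -r '.choices'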
Streaming

Example request
curl https://api.ai.confidential.cloud/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "<model-name>",
    "messages": "<encrypted-payload>",
    "stream": true
  }'
Example response
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1694268190,"model":"<model-name>", "system_fingerprint": "fp_44709d6fcb", "choices":"<encrypted-response-0>"}
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1694268190,"model":"<model-name>", "system_fingerprint": "fp_44709d6fcb", "choices":"<encrypted-response-1>"}
....
{"id":"chatcmpl-123","object":"chat.completion.chunk","created":1694268190,"model":"<model-name>", "system_fingerprint": "fp_44709d6fcb", "choices":"<encrypted-response-2>"}
Models
List models
GET /v1/models
This endpoint lists all currently available models.
Returns
The response is a list of model objects. Since the information retrieved from this endpoint isn't confidential, no encryption is applied to the response.
For detailed information, refer to the OpenAI API documentation.
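For example, the following request retrieves the model list; the jq filter assumes the response follows the OpenAI list format and is only there to print the model IDs:
curl -s https://api.ai.confidential.cloud/v1/models | jq -r '.data[].id'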