
Getting started

When you consume Continuum via the public API, you have to verify the deployment and take care of prompt encryption. The easiest way to handle these tasks is to use the continuum-proxy.

If you want to interact with the API directly, see the OpenAI API section for more information.

Authentication

To interact with Continuum's SaaS API, you need to authenticate using an API key. Your requests must include your API key in the Authorization HTTP header:

Authorization: Bearer <CONTINUUM_API_KEY>

Alternatively, you can configure the continuum-proxy to add the Authorization header to your requests.
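
As a minimal illustration, the following Python sketch (using the requests library) attaches the API key to every request via a session. The environment variable name is only an example and not part of Continuum's API:

import os
import requests

# Read the API key from the environment (the variable name is an example).
api_key = os.environ["CONTINUUM_API_KEY"]

# Attach the Authorization header to every request made with this session.
session = requests.Session()
session.headers.update({"Authorization": f"Bearer {api_key}"})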

If you want a free API key, send an email to continuum-preview@edgeless.systems stating that you want to test the API.

Offered model

In the request body, set the model parameter to the offered model (see the request body description in the OpenAI API section). For example, the model property of a request may be set like this: model: "hugging-quants/Meta-Llama-3.1-70B-Instruct-AWQ-INT4".
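
For illustration, such a request body could look like this in Python, assuming the OpenAI-compatible chat completions format described in the OpenAI API section; the user message is arbitrary sample content:

# Example request body with the model parameter set to the offered model.
body = {
    "model": "hugging-quants/Meta-Llama-3.1-70B-Instruct-AWQ-INT4",
    "messages": [
        {"role": "user", "content": "Summarize the benefits of confidential computing."},
    ],
}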

System prompts

The offered model supports setting a system prompt as part of the request's messages field (see the example below). This can be used to tailor the model's behavior to your specific needs.

Improving language accuracy

The model may occasionally make minor language mistakes, especially in languages other than English. To improve language accuracy, you can set a system prompt. The following example significantly improves accuracy for German:

{
  "role": "system",
  "content": "Ensure every response is free from grammar and spelling errors. Use only valid words. Apply correct article usage, especially for languages with gender-specific articles like German. Follow standard grammar and syntax rules, and check spelling against standard dictionaries. Maintain consistency in style and terminology throughout."
}
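
Putting it together, the following Python sketch sends a request that includes the system prompt above. The base URL is a placeholder (point it at your Continuum endpoint or your local continuum-proxy), and the response parsing assumes the OpenAI-compatible chat completions format:

import os
import requests

# Placeholder: replace with your Continuum API endpoint or the address of
# your local continuum-proxy.
BASE_URL = "https://<continuum-endpoint>/v1"

body = {
    "model": "hugging-quants/Meta-Llama-3.1-70B-Instruct-AWQ-INT4",
    "messages": [
        {
            "role": "system",
            "content": (
                "Ensure every response is free from grammar and spelling errors. "
                "Use only valid words. Apply correct article usage, especially for "
                "languages with gender-specific articles like German. Follow standard "
                "grammar and syntax rules, and check spelling against standard "
                "dictionaries. Maintain consistency in style and terminology throughout."
            ),
        },
        # Example German user prompt ("Explain the difference between 'seid' and 'seit'.").
        {"role": "user", "content": "Erkläre den Unterschied zwischen 'seid' und 'seit'."},
    ],
}

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['CONTINUUM_API_KEY']}"},
    json=body,
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])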