
Integration with a Chat UI

Out of the box, the Continuum API is just that: an API. To enable end users to interact with it, a graphical user interface is needed. To get you up and running with a quick example, this tutorial shows how to integrate the popular chat-ui interface from HuggingFace.

1. Store configuration

Copy the configuration below into a file named .env.local on your computer. In .env.local, replace both occurrences of <continuum_apikey> with your Continuum API key. Save the file.

# Allow the chat-ui to connect to the continuum-proxy via HTTP.
REJECT_UNAUTHORIZED=false

# Supply the Continuum apikey here so requests are properly authorized.
OPENAI_API_KEY=<continuum_apikey>

# Configure chat-ui to talk to Continuum.
MODELS=`[{
  "id": "ibnzterrell/Meta-Llama-3.3-70B-Instruct-AWQ-INT4",
  "name": "ibnzterrell/Meta-Llama-3.3-70B-Instruct-AWQ-INT4",
  "displayName": "ibnzterrell/Meta-Llama-3.3-70B-Instruct-AWQ-INT4",
  "parameters": {
    "temperature": 0.5,
    "max_new_tokens": 4096
  },
  "endpoints": [
    {
      "type": "openai",
      "baseURL": "http://localhost:8080/v1",
      "defaultQuery": {
        "api-version": "2023-05-15"
      },
      "authorization": "Bearer <continuum_apikey>"
    }
  ]
}]`
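
If you prefer not to edit the file by hand, you can fill in the placeholder from an environment variable. The snippet below is a minimal sketch; it assumes your key is exported as CONTINUUM_API_KEY and uses GNU sed (on macOS, use sed -i '' instead):

# Replace both <continuum_apikey> placeholders with the key stored in $CONTINUUM_API_KEY.
sed -i "s|<continuum_apikey>|${CONTINUUM_API_KEY}|g" .env.local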

2. Run continuum-proxy

Start the continuum-proxy as described in the quickstart guide:

docker run -p 8080:8080 ghcr.io/edgelesssys/continuum/continuum-proxy:latest

The command exposes the proxy on your host port 8080. If that port is in use, change the host port to a free port, for example 9090:

docker run -p 9090:8080 ghcr.io/edgelesssys/continuum/continuum-proxy:latest
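
Before wiring up the UI, you can check that the proxy is accepting connections. The probe below is a simple sketch: it sends a plain GET to the endpoint chat-ui will use, so any HTTP status code in the output (rather than a connection error) means the proxy is listening. Adjust the port if you changed it above.

# Any HTTP status code means the proxy is reachable; a connection error means it isn't listening yet.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/v1/chat/completions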

3. Run HuggingFace chat-ui

Start the chat-ui Docker container and mount the configuration file created in step 1 into the container:

docker run --net=host -v "$(realpath .env.local):/app/.env.local" -v db:/data ghcr.io/huggingface/chat-ui-db:latest

If you changed the port in step 2, remember to update the baseURL property inside .env.local accordingly. Note that the Docker command exposes all services inside the container on your host's network namespace. This means that the chat-ui service and the MongoDB instance are reachable from other machines in your network.
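
Once the container is running, you should be able to open chat-ui in your browser. The quick check below assumes chat-ui's default port of 3000; a successful HTTP response means the UI is being served.

# Probe the chat-ui frontend; replace 3000 if your chat-ui uses a different port.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000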

4. Experiment

Now you have a starting point to experiment with the Continuum API. You can also start modifying chat-ui to fit your needs. For example, you could change chat-ui's theme as described in the docs.
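
Since chat-ui talks to the proxy through an OpenAI-compatible endpoint, you can also call that endpoint directly from the command line. The request below is an illustrative sketch that mirrors the values from .env.local; replace <continuum_apikey> with your API key and adjust the port if you changed it in step 2.

curl "http://localhost:8080/v1/chat/completions?api-version=2023-05-15" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <continuum_apikey>" \
  -d '{
    "model": "ibnzterrell/Meta-Llama-3.3-70B-Instruct-AWQ-INT4",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.5
  }'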