OpenAI Compatibility

Many large language models can be called through the OpenAI library. After installing Python 3.7.1 or later and setting up a virtual environment, install the OpenAI Python library by running the following command in your terminal:

pip install --upgrade openai

Si's API endpoints for chat, language and code, images, and embeddings are fully compatible with OpenAI's API.

If your application uses OpenAI's client libraries, you can point it at Si's API servers with a small configuration change. This lets you run your existing applications against our open-source models without modification.

Configuring OpenAI to use Si API

To start using Si with OpenAI's client libraries, pass your Si API key to the api_key option and change the base_url to Si's API endpoint:

import openai

client = openai.OpenAI(
  api_key="YOUR_API_KEY",
  base_url="https://api.TBA/v1",
)

You can find your API key on your settings page.

Querying an inference model

Now that your OpenAI client is configured to point to Si, you can start using one of our open-source models for your inference queries.

For example, you can query one of our chat models, like Meta Llama 3:
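A minimal sketch of a chat query using the client configured above. The API key and base URL are the placeholders from the configuration snippet, and the model ID shown is illustrative; check the provider's model list for the exact name:

```python
import openai

client = openai.OpenAI(
  api_key="YOUR_API_KEY",
  base_url="https://api.TBA/v1",
)

# Send a chat request; the model ID below is an assumed example.
response = client.chat.completions.create(
  model="meta-llama/Llama-3-8b-chat-hf",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What are the top 3 things to do in New York?"},
  ],
)

# The assistant's reply is in the first choice of the response.
print(response.choices[0].message.content)
```

The request and response objects have the same shape as OpenAI's, so existing response-handling code carries over unchanged.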

Or you can use a language model to generate a code completion:
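A sketch of a plain completion request against the legacy completions endpoint, again with placeholder credentials and an assumed code-model ID:

```python
import openai

client = openai.OpenAI(
  api_key="YOUR_API_KEY",
  base_url="https://api.TBA/v1",
)

# Ask a code model to continue a function definition.
# The model ID below is an assumed example.
response = client.completions.create(
  model="codellama/CodeLlama-34b-Python-hf",
  prompt="def fibonacci(n):",
  max_tokens=128,
)

# The generated continuation is in the first choice's text field.
print(response.choices[0].text)
```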

Streaming with OpenAI

You can also use OpenAI's streaming capabilities to stream back your response:
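Streaming works by passing stream=True and iterating over the returned chunks. A sketch with the same placeholder credentials and an assumed model ID:

```python
import openai

client = openai.OpenAI(
  api_key="YOUR_API_KEY",
  base_url="https://api.TBA/v1",
)

# stream=True returns an iterator of incremental chunks
# instead of a single response object.
stream = client.chat.completions.create(
  model="meta-llama/Llama-3-8b-chat-hf",  # assumed example model ID
  messages=[{"role": "user", "content": "Tell me a fun fact about space."}],
  stream=True,
)

for chunk in stream:
  # Each chunk carries an incremental piece of the reply in delta.content,
  # which may be None on the final chunk.
  delta = chunk.choices[0].delta.content
  if delta:
    print(delta, end="", flush=True)
```

Printing with end="" and flush=True renders the tokens as they arrive, rather than buffering whole lines.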

Community libraries

The Si API is also supported by most OpenAI libraries built by the community.

Feel free to reach out to support if you come across any unexpected behavior when using our API.
