
Embedding Models #

Get a vector representation of a given input that can be easily consumed by machine learning models and algorithms.

You can use the embedding models directly by sending requests to the HTTP API endpoint with your favorite tool or programming language, or through the client libraries for the OpenAI API.

The endpoint for embedding models accepts HTTP POST requests at: https://api.ektos.ai/v1/embeddings

When making a request to the endpoint, select the model you want to use with the model body parameter.

Available models and corresponding API strings are listed here (type: TEBD for text embedding models).

A detailed description of the available parameters can be found in the API Reference specification.

Below are basic examples using cURL and the official OpenAI client libraries for Python and Node.js.

Examples in other programming languages are available in the API Reference specification (use the dropdown menu on the top right of the code block example): Link.

cURL #

curl --request POST \
  --url https://api.ektos.ai/v1/embeddings \
  --header "Authorization: Bearer $EKTOS_API_KEY" \
  --header "Content-Type: application/json" \
  --data '{
    "input": "The food was delicious and the waiter...",
    "model": "gte-multilingual-base",
    "encoding_format": "float"
  }'
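
Since the endpoint works with the official OpenAI client libraries, the response can be expected to follow the OpenAI embeddings response shape. The sketch below is illustrative only, not captured output: all field values are placeholders and the embedding array is truncated.

{
  "object": "list",
  "data": [
    {
      "object": "embedding",
      "index": 0,
      "embedding": [0.0023, -0.0091, ...]
    }
  ],
  "model": "gte-multilingual-base",
  "usage": {
    "prompt_tokens": 8,
    "total_tokens": 8
  }
}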

Python: OpenAI client library #

import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.ektos.ai/v1/",
    api_key="YOUR_EKTOS_API_KEY_HERE",
    # api_key=os.environ.get("EKTOS_API_KEY")
)

embeddings = client.embeddings.create(
    model="gte-multilingual-base",
    input="The food was delicious and the waiter...",
    encoding_format="float"
)
print(embeddings.data)
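
The returned vectors can be consumed directly by downstream code. The sketch below is a minimal example, not part of the official reference: it assumes numpy is installed and that the endpoint accepts a list of inputs, as the OpenAI embeddings API does. It embeds two sentences in one call and compares them with cosine similarity.

import os

import numpy as np
from openai import OpenAI

client = OpenAI(
    base_url="https://api.ektos.ai/v1/",
    api_key=os.environ.get("EKTOS_API_KEY"),
)

# Embed two sentences in a single request
# (assumes the input field also accepts a list of strings, as in the OpenAI API).
response = client.embeddings.create(
    model="gte-multilingual-base",
    input=[
        "The food was delicious and the waiter...",
        "The meal was tasty and the service was friendly.",
    ],
    encoding_format="float",
)

a = np.array(response.data[0].embedding)
b = np.array(response.data[1].embedding)

# Cosine similarity: values closer to 1.0 mean the texts are more semantically similar.
similarity = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
print(similarity)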

JavaScript: Node.js OpenAI client library #

import OpenAI from "openai";

const openai = new OpenAI({
  // Reads your API key from the EKTOS_API_KEY environment variable.
  apiKey: process.env.EKTOS_API_KEY,
  baseURL: "https://api.ektos.ai/v1/",
});

async function main() {
  const embedding = await openai.embeddings.create({
    model: "gte-multilingual-base",
    input: "The quick brown fox jumped over the lazy dog",
    encoding_format: "float",
  });

  console.log(embedding);
}

main();