Perform text embedding inference on the service (Generally available; Added in 8.11.0)

POST /_inference/text_embedding/{inference_id}

Path parameters

  • inference_id string Required

    The unique identifier for the inference endpoint.

Query parameters

  • timeout string

    The amount of time to wait for the inference request to complete. Defaults to `30s`.
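
    For example, the timeout can be supplied as a query parameter on the request URL. A minimal sketch, reusing the example endpoint from further down this page:

    POST _inference/text_embedding/my-cohere-endpoint?timeout=30s
    {
      "input": "The sky above the port was the color of television tuned to a dead channel."
    }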

Body application/json

  • input string | array[string] Required

    The text on which you want to perform the inference task. It can be a single string or an array of strings.

  • task_settings object

    Task settings for the individual inference request. These settings are specific to the task type and override the task settings specified when the inference endpoint was created.
Responses

  • 200 application/json

    One of the following result objects is returned, depending on the embedding type produced by the endpoint:

    • text_embedding_bytes array[object]

      The text embedding result object for byte representation.

      • embedding array[number] Required

        Text embedding results containing bytes are represented as dense vectors of bytes.

    • text_embedding_bits array[object]

      The text embedding result object for bit representation.

      • embedding array[number] Required

        Text embedding results containing bits are represented as dense vectors of bytes.

    • text_embedding array[object]

      The text embedding result object.

      • embedding array[number] Required

        Text embedding results are represented as dense vectors of floats.
Console

POST _inference/text_embedding/my-cohere-endpoint
{
  "input": "The sky above the port was the color of television tuned to a dead channel.",
  "task_settings": {
    "input_type": "ingest"
  }
}

Python

resp = client.inference.text_embedding(
    inference_id="my-cohere-endpoint",
    input="The sky above the port was the color of television tuned to a dead channel.",
    task_settings={
        "input_type": "ingest"
    },
)

JavaScript

const response = await client.inference.textEmbedding({
  inference_id: "my-cohere-endpoint",
  input:
    "The sky above the port was the color of television tuned to a dead channel.",
  task_settings: {
    input_type: "ingest",
  },
});

Ruby

response = client.inference.text_embedding(
  inference_id: "my-cohere-endpoint",
  body: {
    "input": "The sky above the port was the color of television tuned to a dead channel.",
    "task_settings": {
      "input_type": "ingest"
    }
  }
)

PHP

$resp = $client->inference()->textEmbedding([
    "inference_id" => "my-cohere-endpoint",
    "body" => [
        "input" => "The sky above the port was the color of television tuned to a dead channel.",
        "task_settings" => [
            "input_type" => "ingest",
        ],
    ],
]);

curl

curl -X POST -H "Authorization: ApiKey $ELASTIC_API_KEY" -H "Content-Type: application/json" -d '{"input":"The sky above the port was the color of television tuned to a dead channel.","task_settings":{"input_type":"ingest"}}' "$ELASTICSEARCH_URL/_inference/text_embedding/my-cohere-endpoint"
Request example
Run `POST _inference/text_embedding/my-cohere-endpoint` to perform text embedding on the example sentence using the Cohere integration:
{
  "input": "The sky above the port was the color of television tuned to a dead channel.",
  "task_settings": {
    "input_type": "ingest"
  }
}
Response examples (200)
An abbreviated response from `POST _inference/text_embedding/my-cohere-endpoint`.
{
  "text_embedding": [
    {
      "embedding": [
        {
          0.018569946,
          -0.036895752,
          0.01486969,
          -0.0045204163,
          -0.04385376,
          0.0075950623,
          0.04260254,
          -0.004005432,
          0.007865906,
          0.030792236,
          -0.050476074,
          0.011795044,
          -0.011642456,
          -0.010070801
        }
      ]
    }
  ]
}
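
As a usage sketch (not part of the API reference), the returned vector can be stored in a `dense_vector` field for later similarity search. The index name, field names, and dimension count below are illustrative assumptions and must match your model:

from elasticsearch import Elasticsearch

client = Elasticsearch("http://localhost:9200", api_key="...")  # placeholder connection details

# Hypothetical index with a dense_vector field sized to the model's output.
client.indices.create(
    index="my-embeddings",
    mappings={
        "properties": {
            "content": {"type": "text"},
            "content_vector": {"type": "dense_vector", "dims": 1024},  # must match the model's dims
        }
    },
)

text = "The sky above the port was the color of television tuned to a dead channel."
resp = client.inference.text_embedding(inference_id="my-cohere-endpoint", input=text)

client.index(
    index="my-embeddings",
    document={
        "content": text,
        "content_vector": resp["text_embedding"][0]["embedding"],
    },
)

A later knn search against `content_vector` can then use a query vector produced by the same inference endpoint.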