[DOCS] Changes model_id path param to inference_id.
szabosteve committed Mar 25, 2024
1 parent 8010b4e commit dd4b765
Showing 8 changed files with 37 additions and 35 deletions.
8 changes: 4 additions & 4 deletions docs/reference/inference/delete-inference.asciidoc
@@ -16,9 +16,9 @@ own model, use the <<ml-df-trained-models-apis>>.
[[delete-inference-api-request]]
==== {api-request-title}

-`DELETE /_inference/<model_id>`
+`DELETE /_inference/<inference_id>`

-`DELETE /_inference/<task_type>/<model_id>`
+`DELETE /_inference/<task_type>/<inference_id>`

[discrete]
[[delete-inference-api-prereqs]]
@@ -32,9 +32,9 @@ own model, use the <<ml-df-trained-models-apis>>.
[[delete-inference-api-path-params]]
==== {api-path-parms-title}

-<model_id>::
+<inference_id>::
(Required, string)
-The unique identifier of the {infer} model to delete.
+The unique {infer} identifier to delete.

<task_type>::
(Optional, string)
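As a concrete illustration of the renamed path parameter (endpoint name assumed, borrowed from the `my-elser-model` example used later in this commit), a delete request combining task type and inference ID would look like:

[source,console]
------------------------------------------------------------
DELETE /_inference/sparse_embedding/my-elser-model
------------------------------------------------------------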
10 changes: 5 additions & 5 deletions docs/reference/inference/get-inference.asciidoc
@@ -18,11 +18,11 @@ own model, use the <<ml-df-trained-models-apis>>.

`GET /_inference/_all`

-`GET /_inference/<model_id>`
+`GET /_inference/<inference_id>`

`GET /_inference/<task_type>/_all`

-`GET /_inference/<task_type>/<model_id>`
+`GET /_inference/<task_type>/<inference_id>`

[discrete]
[[get-inference-api-prereqs]]
@@ -47,9 +47,9 @@ and a wildcard expression,
[[get-inference-api-path-params]]
==== {api-path-parms-title}

-`<model_id>`::
+`<inference_id>`::
(Optional, string)
-The unique identifier of the {infer} model.
+The unique {infer} identifier.


`<task_type>`::
@@ -77,7 +77,7 @@ The API returns the following response:
[source,console-result]
------------------------------------------------------------
{
"model_id": "my-elser-model",
"inference_id": "my-elser-model",
"task_type": "sparse_embedding",
"service": "elser",
"service_settings": {
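For reference, a request that would produce the response snippet above might look like this (the task type and identifier are assumed from that example):

[source,console]
------------------------------------------------------------
GET /_inference/sparse_embedding/my-elser-model
------------------------------------------------------------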
12 changes: 6 additions & 6 deletions docs/reference/inference/post-inference.asciidoc
@@ -16,9 +16,9 @@ own model, use the <<ml-df-trained-models-apis>>.
[[post-inference-api-request]]
==== {api-request-title}

-`POST /_inference/<model_id>`
+`POST /_inference/<inference_id>`

-`POST /_inference/<task_type>/<model_id>`
+`POST /_inference/<task_type>/<inference_id>`


[discrete]
@@ -32,8 +32,8 @@ own model, use the <<ml-df-trained-models-apis>>.
[[post-inference-api-desc]]
==== {api-description-title}

-The perform {infer} API enables you to use {infer} models to perform specific
-tasks on data that you provide as an input. The API returns a response with the
+The perform {infer} API enables you to use {ml} models to perform specific tasks
+on data that you provide as an input. The API returns a response with the
results of the tasks. The {infer} model you use can perform one specific task
that has been defined when the model was created with the <<put-inference-api>>.

@@ -42,9 +42,9 @@ that has been defined when the model was created with the <<put-inference-api>>.
[[post-inference-api-path-params]]
==== {api-path-parms-title}

-`<model_id>`::
+`<inference_id>`::
(Required, string)
-The unique identifier of the {infer} model.
+The unique {infer} identifier.


`<task_type>`::
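A sketch of a perform-inference call for a hypothetical sparse-embedding endpoint (the identifier and input text are illustrative, not part of this diff):

[source,console]
------------------------------------------------------------
POST /_inference/sparse_embedding/my-elser-model
{
  "input": "The capital of France is Paris."
}
------------------------------------------------------------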
20 changes: 10 additions & 10 deletions docs/reference/inference/put-inference.asciidoc
@@ -33,7 +33,7 @@ or if you want to use non-NLP models, use the <<ml-df-trained-models-apis>>.
[[put-inference-api-desc]]
==== {api-description-title}

-The create {infer} API enables you to create and configure an {infer} model to
+The create {infer} API enables you to create and configure a {ml} model to
perform a specific {infer} task.

The following services are available through the {infer} API:
@@ -50,9 +50,9 @@ The following services are available through the {infer} API:
==== {api-path-parms-title}


-`<model_id>`::
+`<inference_id>`::
(Required, string)
-The unique identifier of the model.
+The unique {infer} identifier.

`<task_type>`::
(Required, string)
@@ -246,7 +246,7 @@ This section contains example API calls for every service type.
[[inference-example-cohere]]
===== Cohere service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} entity called
`cohere_embeddings` to perform a `text_embedding` task type.

[source,console]
@@ -268,7 +268,7 @@ PUT _inference/text_embedding/cohere-embeddings
[[inference-example-e5]]
===== E5 via the elasticsearch service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} entity called
`my-e5-model` to perform a `text_embedding` task type.

[source,console]
@@ -293,7 +293,7 @@ further details, refer to the {ml-docs}/ml-nlp-e5.html[E5 model documentation].
[[inference-example-elser]]
===== ELSER service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} entity called
`my-elser-model` to perform a `sparse_embedding` task type.
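The request body itself is collapsed in this diff; a minimal sketch of such a create request (the allocation settings are illustrative assumptions, not part of this commit) could be:

[source,console]
------------------------------------------------------------
PUT _inference/sparse_embedding/my-elser-model
{
  "service": "elser",
  "service_settings": {
    "num_allocations": 1,
    "num_threads": 1
  }
}
------------------------------------------------------------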

[source,console]
@@ -315,7 +315,7 @@ Example response:
[source,console-result]
------------------------------------------------------------
{
"model_id": "my-elser-model",
"inference_id": "my-elser-model",
"task_type": "sparse_embedding",
"service": "elser",
"service_settings": {
@@ -332,7 +332,7 @@ Example response:
[[inference-example-hugging-face]]
===== Hugging Face service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} entity called
`hugging-face_embeddings` to perform a `text_embedding` task type.

[source,console]
@@ -362,7 +362,7 @@ after the endpoint initialization has been finished.
[[inference-example-eland]]
===== Models uploaded by Eland via the elasticsearch service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} entity called
`my-msmarco-minilm-model` to perform a `text_embedding` task type.

[source,console]
@@ -387,7 +387,7 @@ been
[[inference-example-openai]]
===== OpenAI service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} entity called
`openai_embeddings` to perform a `text_embedding` task type.

[source,console]
@@ -23,9 +23,9 @@ include::{es-repo-dir}/tab-widgets/inference-api/infer-api-requirements-widget.a

[discrete]
[[infer-text-embedding-task]]
-==== Create the inference task
+==== Create an inference task entity

-Create the {infer} task by using the <<put-inference-api>>:
+Create an entity of an {infer} task by using the <<put-inference-api>>:

include::{es-repo-dir}/tab-widgets/inference-api/infer-api-task-widget.asciidoc[]

@@ -18,7 +18,7 @@ PUT _ingest/pipeline/cohere_embeddings
"processors": [
{
"inference": {
"model_id": "cohere_embeddings", <1>
"inference_id": "cohere_embeddings", <1>
"input_output": { <2>
"input_field": "content",
"output_field": "content_embedding"
@@ -28,7 +28,7 @@
]
}
--------------------------------------------------
-<1> The name of the inference configuration you created by using the
+<1> The name of the inference entity you created by using the
<<put-inference-api>>.
<2> Configuration object that defines the `input_field` for the {infer} process
and the `output_field` that will contain the {infer} results.
@@ -45,7 +45,7 @@ PUT _ingest/pipeline/openai_embeddings
"processors": [
{
"inference": {
"model_id": "openai_embeddings", <1>
"inference_id": "openai_embeddings", <1>
"input_output": { <2>
"input_field": "content",
"output_field": "content_embedding"
@@ -55,7 +55,7 @@
]
}
--------------------------------------------------
-<1> The name of the inference configuration you created by using the
+<1> The name of the inference entity you created by using the
<<put-inference-api>>.
<2> Configuration object that defines the `input_field` for the {infer} process
and the `output_field` that will contain the {infer} results.
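Assembled from the fragments above, the full Cohere pipeline definition after this change would read roughly as follows (reconstructed for illustration, not part of the diff itself):

[source,console]
------------------------------------------------------------
PUT _ingest/pipeline/cohere_embeddings
{
  "processors": [
    {
      "inference": {
        "inference_id": "cohere_embeddings",
        "input_output": {
          "input_field": "content",
          "output_field": "content_embedding"
        }
      }
    }
  ]
}
------------------------------------------------------------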
@@ -8,7 +8,7 @@ GET cohere-embeddings/_search
"field": "content_embedding",
"query_vector_builder": {
"text_embedding": {
"model_id": "cohere_embeddings",
"inference_id": "cohere_embeddings",
"model_text": "Muscles in human body"
}
},
@@ -83,7 +83,7 @@ GET openai-embeddings/_search
"field": "content_embedding",
"query_vector_builder": {
"text_embedding": {
"model_id": "openai_embeddings",
"inference_id": "openai_embeddings",
"model_text": "Calculate fuel cost"
}
},
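Put together, the updated kNN query sketched in this file would look roughly like this (the `k` and `num_candidates` values are illustrative, as those lines are collapsed in the diff):

[source,console]
------------------------------------------------------------
GET cohere-embeddings/_search
{
  "knn": {
    "field": "content_embedding",
    "query_vector_builder": {
      "text_embedding": {
        "inference_id": "cohere_embeddings",
        "model_text": "Muscles in human body"
      }
    },
    "k": 10,
    "num_candidates": 100
  }
}
------------------------------------------------------------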
@@ -13,7 +13,8 @@ PUT _inference/text_embedding/cohere_embeddings <1>
}
------------------------------------------------------------
// TEST[skip:TBD]
-<1> The task type is `text_embedding` in the path.
+<1> The task type is `text_embedding` in the path and the `inference_id` which
+is the unique identifier of the {infer} entity is `cohere_embeddings`.
<2> The API key of your Cohere account. You can find your API keys in your
Cohere dashboard under the
https://dashboard.cohere.com/api-keys[API keys section]. You need to provide
@@ -46,7 +47,8 @@ PUT _inference/text_embedding/openai_embeddings <1>
}
------------------------------------------------------------
// TEST[skip:TBD]
-<1> The task type is `text_embedding` in the path.
+<1> The task type is `text_embedding` in the path and the `inference_id` which
+is the unique identifier of the {infer} entity is `openai_embeddings`.
<2> The API key of your OpenAI account. You can find your OpenAI API keys in
your OpenAI account under the
https://platform.openai.com/api-keys[API keys section]. You need to provide
