[DOCS] Resolves conflicts.
szabosteve committed Mar 26, 2024
1 parent 4e6e1e5 commit db8737e
Showing 8 changed files with 39 additions and 35 deletions.
9 changes: 5 additions & 4 deletions docs/reference/inference/delete-inference.asciidoc
@@ -16,8 +16,9 @@ own model, use the <<ml-df-trained-models-apis>>.
 [[delete-inference-api-request]]
 ==== {api-request-title}

-`DELETE /_inference/<model_id>`
-`DELETE /_inference/<task_type>/<model_id>`
+`DELETE /_inference/<inference_id>`
+
+`DELETE /_inference/<task_type>/<inference_id>`

 [discrete]
 [[delete-inference-api-prereqs]]
@@ -30,9 +31,9 @@ own model, use the <<ml-df-trained-models-apis>>.
 [[delete-inference-api-path-params]]
 ==== {api-path-parms-title}

-<model_id>::
+<inference_id>::
 (Required, string)
-The unique identifier of the {infer} model to delete.
+The unique identifier of the {infer} endpoint to delete.

 <task_type>::
 (Optional, string)
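The renamed path parameter above means the same endpoint identifier is addressed through two URL shapes. A minimal sketch of how a client might compose the DELETE path (the helper name is illustrative, not part of any Elasticsearch client library):

```python
# Illustrative helper: compose the path for
# DELETE /_inference/<inference_id> or
# DELETE /_inference/<task_type>/<inference_id>.
def delete_inference_path(inference_id, task_type=None):
    if task_type is not None:
        return f"/_inference/{task_type}/{inference_id}"
    return f"/_inference/{inference_id}"

print(delete_inference_path("my-elser-model"))
print(delete_inference_path("my-elser-model", task_type="sparse_embedding"))
```

Since `<task_type>` is optional here, a client only needs it when it wants the server to validate that the endpoint performs the expected task.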
10 changes: 5 additions & 5 deletions docs/reference/inference/get-inference.asciidoc
@@ -18,11 +18,11 @@ own model, use the <<ml-df-trained-models-apis>>.

 `GET /_inference/_all`

-`GET /_inference/<model_id>`
+`GET /_inference/<inference_id>`

 `GET /_inference/<task_type>/_all`

-`GET /_inference/<task_type>/<model_id>`
+`GET /_inference/<task_type>/<inference_id>`

 [discrete]
 [[get-inference-api-prereqs]]
@@ -46,9 +46,9 @@ and a wildcard expression,
 [[get-inference-api-path-params]]
 ==== {api-path-parms-title}

-`<model_id>`::
+`<inference_id>`::
 (Optional, string)
-The unique identifier of the {infer} model.
+The unique identifier of the {infer} endpoint.


 `<task_type>`::
@@ -76,7 +76,7 @@ The API returns the following response:
 [source,console-result]
 ------------------------------------------------------------
 {
-  "model_id": "my-elser-model",
+  "inference_id": "my-elser-model",
   "task_type": "sparse_embedding",
   "service": "elser",
   "service_settings": {
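The GET variants above differ only in how much of the path is specified: `_all`, a single ID, or an ID with a wildcard expression. A sketch of the expansion, using a hypothetical local inventory of endpoint IDs (`fnmatch` here approximates the server-side wildcard matching):

```python
from fnmatch import fnmatch

# Hypothetical inventory; real endpoint IDs live on the server.
endpoints = ["my-elser-model", "cohere_embeddings", "openai_embeddings"]

def resolve(expression):
    # `_all` (or a bare `*`) matches every inference endpoint.
    if expression in ("_all", "*"):
        return list(endpoints)
    return [e for e in endpoints if fnmatch(e, expression)]

print(resolve("_all"))
print(resolve("*_embeddings"))
```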
13 changes: 7 additions & 6 deletions docs/reference/inference/post-inference.asciidoc
@@ -16,8 +16,9 @@ own model, use the <<ml-df-trained-models-apis>>.
 [[post-inference-api-request]]
 ==== {api-request-title}

-`POST /_inference/<model_id>`
-`POST /_inference/<task_type>/<model_id>`
+`POST /_inference/<inference_id>`
+
+`POST /_inference/<task_type>/<inference_id>`


 [discrete]
@@ -31,8 +32,8 @@ own model, use the <<ml-df-trained-models-apis>>.
 [[post-inference-api-desc]]
 ==== {api-description-title}

-The perform {infer} API enables you to use {infer} models to perform specific
-tasks on data that you provide as an input. The API returns a response with the
+The perform {infer} API enables you to use {ml} models to perform specific tasks
+on data that you provide as an input. The API returns a response with the
 results of the tasks. The {infer} model you use can perform one specific task
 that has been defined when the model was created with the <<put-inference-api>>.

@@ -41,9 +42,9 @@ that has been defined when the model was created with the <<put-inference-api>>.
 [[post-inference-api-path-params]]
 ==== {api-path-parms-title}

-`<model_id>`::
+`<inference_id>`::
 (Required, string)
-The unique identifier of the {infer} model.
+The unique identifier of the {infer} endpoint.


 `<task_type>`::
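As with the DELETE and GET requests above, the perform {infer} request addresses the endpoint either directly or scoped by task type. A sketch of assembling such a request; the helper name is illustrative, and the `input` body field is assumed from the documented request shape:

```python
import json

# Illustrative assembly of POST /_inference/[<task_type>/]<inference_id>;
# the "input" field carries the text the endpoint should run inference on.
def build_infer_request(inference_id, text, task_type=None):
    if task_type is not None:
        path = f"/_inference/{task_type}/{inference_id}"
    else:
        path = f"/_inference/{inference_id}"
    body = json.dumps({"input": text})
    return "POST", path, body

method, path, body = build_infer_request(
    "my-elser-model", "rainy day", task_type="sparse_embedding"
)
print(method, path, body)
```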
20 changes: 10 additions & 10 deletions docs/reference/inference/put-inference.asciidoc
@@ -32,7 +32,7 @@ or if you want to use non-NLP models, use the <<ml-df-trained-models-apis>>.
 [[put-inference-api-desc]]
 ==== {api-description-title}

-The create {infer} API enables you to create and configure an {infer} model to
+The create {infer} API enables you to create and configure a {ml} model to
 perform a specific {infer} task.

 The following services are available through the {infer} API:
@@ -49,9 +49,9 @@ The following services are available through the {infer} API:
 ==== {api-path-parms-title}


-`<model_id>`::
+`<inference_id>`::
 (Required, string)
-The unique identifier of the model.
+The unique identifier of the {infer} endpoint.

 `<task_type>`::
 (Required, string)
@@ -245,7 +245,7 @@ This section contains example API calls for every service type.
 [[inference-example-cohere]]
 ===== Cohere service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `cohere_embeddings` to perform a `text_embedding` task type.

 [source,console]
@@ -267,7 +267,7 @@ PUT _inference/text_embedding/cohere-embeddings
 [[inference-example-e5]]
 ===== E5 via the elasticsearch service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `my-e5-model` to perform a `text_embedding` task type.

 [source,console]
@@ -292,7 +292,7 @@ further details, refer to the {ml-docs}/ml-nlp-e5.html[E5 model documentation].
 [[inference-example-elser]]
 ===== ELSER service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `my-elser-model` to perform a `sparse_embedding` task type.

 [source,console]
@@ -314,7 +314,7 @@ Example response:
 [source,console-result]
 ------------------------------------------------------------
 {
-  "model_id": "my-elser-model",
+  "inference_id": "my-elser-model",
   "task_type": "sparse_embedding",
   "service": "elser",
   "service_settings": {
@@ -331,7 +331,7 @@ Example response:
 [[inference-example-hugging-face]]
 ===== Hugging Face service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `hugging-face_embeddings` to perform a `text_embedding` task type.

 [source,console]
@@ -361,7 +361,7 @@ after the endpoint initialization has been finished.
 [[inference-example-eland]]
 ===== Models uploaded by Eland via the elasticsearch service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `my-msmarco-minilm-model` to perform a `text_embedding` task type.

 [source,console]
@@ -386,7 +386,7 @@ been
 [[inference-example-openai]]
 ===== OpenAI service

-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `openai_embeddings` to perform a `text_embedding` task type.

 [source,console]
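The create response shown in the ELSER example above now reports the identifier as `inference_id` rather than `model_id`. A short sketch of reading the renamed field from such a response, with a fallback for older builds; the response here is reduced to the fields visible in the diff (`service_settings` is elided, so a placeholder is used):

```python
import json

# Response body shaped like the ELSER example above; service_settings
# is truncated in the diff, so only a placeholder object appears here.
raw = json.dumps({
    "inference_id": "my-elser-model",
    "task_type": "sparse_embedding",
    "service": "elser",
    "service_settings": {},
})

resp = json.loads(raw)
# Clients migrating across this rename can read inference_id first and
# fall back to the pre-rename model_id key.
endpoint_id = resp.get("inference_id") or resp.get("model_id")
print(endpoint_id, resp["task_type"])
```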
@@ -23,9 +23,9 @@ include::{es-repo-dir}/tab-widgets/inference-api/infer-api-requirements-widget.a

 [discrete]
 [[infer-text-embedding-task]]
-==== Create the inference task
+==== Create an inference endpoint

-Create the {infer} task by using the <<put-inference-api>>:
+Create an {infer} endpoint by using the <<put-inference-api>>:

 include::{es-repo-dir}/tab-widgets/inference-api/infer-api-task-widget.asciidoc[]

@@ -28,8 +28,8 @@ PUT _ingest/pipeline/cohere_embeddings
     ]
 }
 --------------------------------------------------
-<1> The name of the inference configuration you created by using the
-<<put-inference-api>>.
+<1> The name of the inference endpoint you created by using the
+<<put-inference-api>>; it is referred to as the `inference_id` in that step.
 <2> Configuration object that defines the `input_field` for the {infer} process
 and the `output_field` that will contain the {infer} results.

@@ -55,8 +55,8 @@ PUT _ingest/pipeline/openai_embeddings
     ]
 }
 --------------------------------------------------
-<1> The name of the inference configuration you created by using the
-<<put-inference-api>>.
+<1> The name of the inference endpoint you created by using the
+<<put-inference-api>>; it is referred to as the `inference_id` in that step.
 <2> Configuration object that defines the `input_field` for the {infer} process
 and the `output_field` that will contain the {infer} results.

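Callout <1> above points at the processor field that carries the `inference_id` chosen when the endpoint was created. A sketch of the processor layout as a plain dict; apart from the endpoint name `cohere_embeddings`, the field names below are assumptions (the full pipeline bodies are collapsed in this diff), so treat them as illustrative rather than the exact processor schema:

```python
# Illustrative shape of an inference processor as referenced by
# callouts <1> and <2> above; field names other than the endpoint
# name are assumptions, since the pipeline body is collapsed here.
pipeline = {
    "processors": [
        {
            "inference": {
                # <1> the inference_id chosen at endpoint creation
                "model_id": "cohere_embeddings",
                # <2> where the processor reads input and writes results
                "input_output": [
                    {"input_field": "content",
                     "output_field": "content_embedding"}
                ],
            }
        }
    ]
}

processor = pipeline["processors"][0]["inference"]
print(processor["model_id"])
```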
@@ -8,7 +8,7 @@ GET cohere-embeddings/_search
     "field": "content_embedding",
     "query_vector_builder": {
       "text_embedding": {
-        "model_id": "cohere_embeddings",
+        "inference_id": "cohere_embeddings",
         "model_text": "Muscles in human body"
       }
     },
@@ -83,7 +83,7 @@ GET openai-embeddings/_search
     "field": "content_embedding",
     "query_vector_builder": {
       "text_embedding": {
-        "model_id": "openai_embeddings",
+        "inference_id": "openai_embeddings",
         "model_text": "Calculate fuel cost"
       }
     },
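The rename inside `query_vector_builder` can be sketched as a plain dict mirroring the Cohere search body above (index, field, and query text taken from that example):

```python
# knn clause matching the cohere-embeddings search above; the
# builder now takes inference_id where it previously took model_id.
knn = {
    "field": "content_embedding",
    "query_vector_builder": {
        "text_embedding": {
            "inference_id": "cohere_embeddings",  # was model_id before this commit
            "model_text": "Muscles in human body",
        }
    },
}

builder = knn["query_vector_builder"]["text_embedding"]
print(builder["inference_id"])
```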
@@ -13,7 +13,8 @@ PUT _inference/text_embedding/cohere_embeddings <1>
 }
 ------------------------------------------------------------
 // TEST[skip:TBD]
-<1> The task type is `text_embedding` in the path.
+<1> The task type is `text_embedding` in the path and the `inference_id`, the
+unique identifier of the {infer} endpoint, is `cohere_embeddings`.
 <2> The API key of your Cohere account. You can find your API keys in your
 Cohere dashboard under the
 https://dashboard.cohere.com/api-keys[API keys section]. You need to provide
@@ -46,7 +47,8 @@ PUT _inference/text_embedding/openai_embeddings <1>
 }
 ------------------------------------------------------------
 // TEST[skip:TBD]
-<1> The task type is `text_embedding` in the path.
+<1> The task type is `text_embedding` in the path and the `inference_id`, the
+unique identifier of the {infer} endpoint, is `openai_embeddings`.
 <2> The API key of your OpenAI account. You can find your OpenAI API keys in
 your OpenAI account under the
 https://platform.openai.com/api-keys[API keys section]. You need to provide
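The updated callouts read both values out of the request line itself. A sketch of splitting a `PUT _inference/...` path into its two parameters (the helper name is illustrative):

```python
# Illustrative parser for PUT _inference/<task_type>/<inference_id>.
def parse_put_inference_path(path):
    parts = path.strip("/").split("/")
    if parts[0] != "_inference" or len(parts) != 3:
        raise ValueError(f"not a create-inference path: {path}")
    task_type, inference_id = parts[1], parts[2]
    return task_type, inference_id

print(parse_put_inference_path("_inference/text_embedding/cohere_embeddings"))
```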
