This sample demonstrates how to use Azure OpenAI text completions with Azure Functions and the Azure OpenAI extension.
This application is made from multiple components:
- A serverless API built with Azure Functions, using the Azure Functions bindings for OpenAI.
- Hosted AI models with Azure OpenAI.
- Node.js LTS
- Azure Developer CLI
- Git
- Azure account. If you're new to Azure, you can create an Azure account for free and receive free Azure credits to get started. If you're a student, you can also get free credits with Azure for Students.
- Azure account permissions:
  - Your Azure account must have `Microsoft.Authorization/roleAssignments/write` permissions, such as Role Based Access Control Administrator, User Access Administrator, or Owner. If you don't have subscription-level permissions, you must be granted RBAC for an existing resource group and deploy to that existing group.
  - Your Azure account also needs `Microsoft.Resources/deployments/write` permissions on the subscription level.
Pricing varies per region and usage, so it isn't possible to predict exact costs for your usage. However, you can use the Azure pricing calculator for the resources below to get an estimate.
- Azure Functions: Flex Consumption plan, Free for the first 250K executions. Pricing per execution and memory used. Pricing
- Azure OpenAI: Standard tier, chat model. Pricing per 1K tokens used, and at least 1K tokens are used per question. Pricing
- Azure Blob Storage: Standard tier with LRS. Pricing per GB stored and data transfer. Pricing
To avoid unnecessary costs, you can remove all the Azure resources when you're done with the sample by running `azd down --purge`.
You can run this project directly in your browser by using GitHub Codespaces, which will open a web-based VS Code.
- Fork the project to create your own copy of this repository.
- On your forked repository, select the Code button, then the Codespaces tab, and click on the Create codespace on main button.
- Wait for the Codespace to be created; this should take a few minutes.
If you prefer to run the project locally, follow these instructions.
Open a terminal in the project root and follow these steps to deploy the Azure resources needed:
```sh
# Open the sample directory
cd samples/openai-extension-textcompletion

# Install dependencies
npm install

# Deploy the sample to Azure
azd auth login
azd up
```
You will be prompted to select a base location for the resources. If you're unsure which location to choose, select `eastus2`.
The deployment process will take a few minutes.
Once the resources are deployed, you can run the following command to run the application locally:
```sh
npm start
```
This command will start the Azure Functions application locally. You can test the application by sending a GET request to the `/whois` endpoint:
```sh
curl http://localhost:7071/api/whois/Albert%20Einstein
```
You should receive a response with information about Albert Einstein.
You can also try sending a POST request to the `/completions` endpoint:
```sh
curl http://localhost:7071/api/completions -H "Content-Type: application/json" -d '{"prompt": "Capital of France?"}'
```
You should receive a response with the completion of the prompt.
Alternatively, you can open the file `api.http` and click on Send Request to test the endpoints.
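If you're using VS Code with the REST Client extension, the `api.http` file contains requests equivalent to the `curl` commands above. As a sketch (the actual file contents may differ), it looks something like this:

```http
GET http://localhost:7071/api/whois/Albert%20Einstein

###

POST http://localhost:7071/api/completions
Content-Type: application/json

{"prompt": "Capital of France?"}
```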
To clean up all the Azure resources created by this sample:
- Run `azd down --purge`
- When asked if you are sure you want to continue, enter `y`
The resource group and all the resources will be deleted.
Querying an LLM (Large Language Model) allows you to perform a wide range of tasks, such as generating completions, answering questions, summarizing text, and more. Here we either pass a prompt to the LLM directly, or use a prompt template with parameters from the query to generate an answer.
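Conceptually, resolving a prompt template means substituting request parameters into placeholders before the prompt is sent to the LLM. The following is a simplified sketch of that idea (a hypothetical helper for illustration, not the extension's actual implementation):

```javascript
// Simplified sketch of prompt template resolution. This is NOT the
// OpenAI extension's real code; it only illustrates the concept.
function resolvePrompt(template, params) {
  // Replace each {key} placeholder with the matching parameter value,
  // leaving unknown placeholders untouched.
  return template.replace(/\{(\w+)\}/g, (match, key) => params[key] ?? match);
}

console.log(resolvePrompt('Who is {name}?', { name: 'Albert Einstein' }));
// → Who is Albert Einstein?
```

In the real sample, this substitution is handled for you by the OpenAI text completion input binding, which fills template parameters from the route.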
Open the `src/functions` folder to see the code for the Azure Functions. Our API is composed of two endpoints:
- `POST /completions`: This endpoint takes a JSON object with a `prompt` property and returns a completion generated by the LLM. It uses the OpenAI text completion input binding to generate the completion.
- `GET /whois/<name>`: This endpoint takes a name as a route parameter and returns information about the person, using a prompt template to query the LLM through the OpenAI text completion input binding.
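Under the hood, the text completion input binding is driven by a declarative configuration. The object below sketches the shape of that configuration for the `whois` endpoint; the property values here are assumptions for illustration, and the actual settings live in the code under `src/functions`:

```javascript
// Sketch of the textCompletion input binding configuration consumed by
// the Azure Functions OpenAI extension. Values are assumptions; check
// src/functions for the real settings used by this sample.
const textCompletionInput = {
  type: 'textCompletion',                 // binding type provided by the extension
  direction: 'in',                        // input binding: result is read in the handler
  name: 'response',                       // name used to access the completion result
  prompt: 'Who is {name}?',               // {name} is filled from the route parameter
  model: '%CHAT_MODEL_DEPLOYMENT_NAME%'   // app setting with the model deployment name
};

console.log(JSON.stringify(textCompletionInput, null, 2));
```

At runtime, the Functions host resolves the `{name}` template parameter from the route, calls Azure OpenAI with the resulting prompt, and hands the completion to your handler under the binding's `name`.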
If you have any issues when running or deploying this sample, please check the troubleshooting guide. If you can't find a solution to your problem, please open an issue.
Here are some resources to learn more about the technologies used in this sample:
- Azure OpenAI text completion input binding for Azure Functions (Microsoft Learn)
- Azure Functions bindings for OpenAI (GitHub)
- Azure OpenAI Service (Microsoft Learn)
- Generative AI with JavaScript (GitHub)