This sample shows how to take a human prompt as HTTP GET or POST input and calculate completions using chains of human input and templates. It is a starting point that can be used for more sophisticated chains.
- Python 3.8+
- Azure Functions Core Tools
- Azure OpenAI API key, endpoint, and deployment
- Add this `local.settings.json` file to this folder to simplify local development, and include the key from step 3

  `./local.settings.json`

  ```json
  {
    "IsEncrypted": false,
    "Values": {
      "FUNCTIONS_WORKER_RUNTIME": "python",
      "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
      "AzureWebJobsStorage": "",
      "AZURE_OPENAI_KEY": "...",
      "AZURE_OPENAI_ENDPOINT": "https://<service_name>.openai.azure.com/",
      "AZURE_OPENAI_SERVICE": "...",
      "AZURE_OPENAI_CHATGPT_DEPLOYMENT": "...",
      "OPENAI_API_VERSION": "2023-05-15",
      "USE_LANGCHAIN": "True"
    }
  }
  ```
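When running locally with `func start`, the entries under `Values` are exposed to the function as environment variables. A minimal sketch of how the function code might read them (the helper name `load_openai_settings` is illustrative and not part of the sample):

```python
import os


def load_openai_settings(env=os.environ) -> dict:
    """Read the Azure OpenAI settings that local.settings.json
    exposes as environment variables under func start."""
    return {
        "key": env["AZURE_OPENAI_KEY"],
        "endpoint": env["AZURE_OPENAI_ENDPOINT"],
        "deployment": env["AZURE_OPENAI_CHATGPT_DEPLOYMENT"],
        # Fall back to the API version pinned in local.settings.json
        "api_version": env.get("OPENAI_API_VERSION", "2023-05-15"),
        # Settings values are strings, so parse the boolean flag explicitly
        "use_langchain": env.get("USE_LANGCHAIN", "False").lower() == "true",
    }
```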
- Open a new terminal and run:

  ```shell
  pip3 install -r requirements.txt
  func start
  ```
- Using your favorite REST client (e.g. the REST Client extension in VS Code, Postman, or curl), make a POST request. A `test.http` file has been provided to run this quickly.
  Terminal:

  ```shell
  curl -i -X POST http://localhost:7071/api/ask/ \
    -H "Content-Type: application/json" \
    --data-binary "@testdata.json"
  ```
  `testdata.json`

  ```json
  {
    "prompt": "What is a good feature of Azure Functions?"
  }
  ```
  `test.http`

  ```http
  POST http://localhost:7071/api/ask HTTP/1.1
  content-type: application/json

  {
    "prompt": "What is a good feature of Azure Functions?"
  }
  ```
- Open this repo in VS Code:

  ```shell
  code .
  ```
- Follow the prompts to load the Function. It is recommended to initialize the Functions project for VS Code, and to enable a virtual environment for your chosen version of Python.
- Run and Debug (`F5`) the app
- Test using the same REST client steps above
The key code that makes this work is in `function_app.py`, shown below. You can customize it, or learn more, using the LangChain Quickstart Guide.
```python
from langchain.chains import LLMChain
from langchain.llms import AzureOpenAI
from langchain.prompts import PromptTemplate

# Wrap the Azure OpenAI deployment as a LangChain LLM
llm = AzureOpenAI(
    deployment_name=AZURE_OPENAI_CHATGPT_DEPLOYMENT,
    temperature=0.3,
    openai_api_key=AZURE_OPENAI_KEY,
)

# Template with a single input variable filled from the request
llm_prompt = PromptTemplate(
    input_variables=["human_prompt"],
    template="The following is a conversation with an AI assistant. The assistant is helpful.\n\nAI: I am an AI created by OpenAI. How can I help you today?\nHuman: {human_prompt}?",
)

chain = LLMChain(llm=llm, prompt=llm_prompt)
return chain.run(prompt)  # prompt is the human input from the request body
```
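To see what the chain actually sends to the model, the template substitution can be reproduced in plain Python; `PromptTemplate.format` performs essentially this single-variable substitution (the `render_prompt` helper below is illustrative only):

```python
# The same template string used by the PromptTemplate above
TEMPLATE = (
    "The following is a conversation with an AI assistant. "
    "The assistant is helpful.\n\n"
    "AI: I am an AI created by OpenAI. How can I help you today?\n"
    "Human: {human_prompt}?"
)


def render_prompt(human_prompt: str) -> str:
    """Substitute the single input variable, as PromptTemplate.format does."""
    return TEMPLATE.format(human_prompt=human_prompt)
```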
The easiest way to deploy this app is using the Azure Developer CLI (`azd`). If you open this repo in GitHub Codespaces, the `azd` tooling is already preinstalled.

To provision and deploy:

```shell
azd up
```