A basic standard flow that calls Azure OpenAI with the built-in `llm` tool.
Tools used in this flow:
- `prompt` tool
- built-in `llm` tool

Connections used in this flow:
- `azure_open_ai` connection
Install the promptflow SDK and other dependencies:

```bash
pip install -r requirements.txt
```
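If you want to confirm the install succeeded before continuing, a quick check from Python (standard library only) looks like this:

```python
# Sanity check: confirm the promptflow package is importable and report its version.
from importlib.metadata import version

import promptflow  # raises ImportError if the install failed

print("promptflow", version("promptflow"))
```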
Prepare your Azure OpenAI resource by following this instruction and get your `api_key` if you don't have one.
Note: this example uses the chat API, so please use a `gpt-35-turbo` or `gpt-4` model deployment.
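If you want to verify the deployment outside promptflow first, here is a minimal sketch using the `openai` package (v1+, not part of this flow's requirements); the endpoint, key, `api_version`, and deployment name are placeholders you must replace:

```python
# Minimal sanity check of a chat-capable Azure OpenAI deployment (assumes openai>=1.0).
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="<your_api_base>",  # e.g. https://<resource>.openai.azure.com/
    api_key="<your_api_key>",
    api_version="2024-02-01",          # assumption: use a version your resource supports
)

response = client.chat.completions.create(
    model="<your_deployment_name>",    # e.g. a gpt-35-turbo or gpt-4 deployment
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```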
Ensure you have created the `open_ai_connection` connection before running the flow:

```bash
pf connection show -n open_ai_connection
```
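The same check can be done from Python; this is a sketch assuming the promptflow SDK's `PFClient`:

```python
# Look up the connection with the promptflow SDK (raises if it does not exist).
from promptflow import PFClient

pf = PFClient()
connection = pf.connections.get(name="open_ai_connection")
print(connection)
```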
Create the connection if you haven't done so yet. Ensure you have put your Azure OpenAI endpoint and API key in the `azure_openai.yml` file.
```bash
# Override keys with --set to avoid yaml file changes
pf connection create -f ../../../connections/azure_openai.yml --name open_ai_connection --set api_key=<your_api_key> api_base=<your_api_base>
```
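Alternatively, the connection can be created from Python. This is a minimal sketch assuming the promptflow SDK's `AzureOpenAIConnection` entity, with placeholder key and endpoint values:

```python
# Create (or update) the Azure OpenAI connection via the promptflow SDK.
from promptflow import PFClient
from promptflow.entities import AzureOpenAIConnection

connection = AzureOpenAIConnection(
    name="open_ai_connection",
    api_key="<your_api_key>",
    api_base="<your_api_base>",
    api_type="azure",
)

pf = PFClient()
pf.connections.create_or_update(connection)
```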
```bash
# test with default input value in flow.dag.yaml
pf flow test --flow .

# test with inputs
pf flow test --flow . --inputs text="Python Hello World!"
```
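The equivalent flow test from Python, sketched with the promptflow SDK's `PFClient.test` and assuming it is run from this flow's directory:

```python
# Test the flow once with a single input value.
from promptflow import PFClient

pf = PFClient()
result = pf.test(flow=".", inputs={"text": "Python Hello World!"})
print(result)
```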
- Create a run:

```bash
pf run create --flow . --data ./data.jsonl --stream
```
- List and show run metadata:

```bash
# list created runs
pf run list

# get a sample run name
name=$(pf run list -r 10 | jq '.[] | select(.name | contains("basic_with_builtin_llm")) | .name'| head -n 1 | tr -d '"')

# show details of a specific run
pf run show --name $name

# show output
pf run show-details --name $name

# visualize run in browser
pf run visualize --name $name
```
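The run lifecycle above can also be driven from Python. A minimal sketch assuming the promptflow SDK's `PFClient` (run names and outputs will differ from the CLI example):

```python
# Create a batch run from data.jsonl, then inspect and visualize it.
from promptflow import PFClient

pf = PFClient()

# Submit the flow in the current directory against the multi-line input data.
base_run = pf.run(flow=".", data="./data.jsonl")

# Per-line inputs and outputs as a pandas DataFrame.
details = pf.get_details(base_run)
print(details)

# Open the run visualization in the browser.
pf.visualize(base_run)
```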