This example demonstrates how to deploy a flow as a docker app. We will use the web-classification flow as the example in this tutorial.
Note that all dependent connections must be created before building the flow as a docker app.
```bash
# create connection if not created before
pf connection create --file ../../../connections/azure_openai.yml --set api_key=<your_api_key> api_base=<your_api_base> --name open_ai_connection
```
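To confirm the connection is in place before building, you can check it with the pf CLI. This is an optional sanity check, assuming the `pf connection` subcommands available in your installed promptflow version:

```bash
# list local connections; open_ai_connection should appear in the output
pf connection list

# inspect the connection details (secret values are not shown in plain text)
pf connection show --name open_ai_connection
```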
Use the command below to build the flow as a docker format app:
```bash
pf flow build --source ../../../flows/standard/web-classification --output dist --format docker
```
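The command writes the exported app into the dist folder, including a Dockerfile and the flow definition. You can inspect it before building; the exact contents may vary across promptflow versions:

```bash
# list the exported files; expect a Dockerfile plus the flow and connection YAMLs
ls -R dist

# review the generated Dockerfile before building the image
cat dist/Dockerfile
```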
Like any other Dockerfile-based app, you need to build the image first. You can tag the image with any name you want. In this example, we use web-classification-serve.
Run the command below to build the image:
```bash
docker build dist -t web-classification-serve
```
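As an optional check, you can confirm the tagged image exists locally before running it:

```bash
# verify the image was built and tagged
docker images web-classification-serve
```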
Running the docker image will start a service that serves the flow inside the container.
If the service involves connections, all related connections will be exported as yaml files and recreated in the container. Secrets in connections won't be exported directly; instead, they are exported as references to environment variables:
```yaml
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/OpenAIConnection.schema.json
type: open_ai
name: open_ai_connection
module: promptflow.connections
api_key: ${env:OPEN_AI_CONNECTION_API_KEY} # env reference
```
You'll need to set up the environment variables in the container to make the connections work.
You can run the docker image and set the required environment variables directly via the commands below:
```bash
# The started service will listen on port 8080. You can map the port to any port on the host machine as you want.
docker run -p 8080:8080 -e OPEN_AI_CONNECTION_API_KEY=<secret-value> web-classification-serve
```
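If you prefer not to pass secrets on the command line, docker's --env-file option is one alternative. The sketch below assumes a file named connections.env that you create yourself (the file name is illustrative, not generated by promptflow):

```bash
# connections.env (create this file yourself and keep it out of source control):
# OPEN_AI_CONNECTION_API_KEY=<secret-value>

# run the container detached, loading environment variables from the file
docker run -d --name web-classification-serve -p 8080:8080 --env-file connections.env web-classification-serve

# follow the service logs to confirm it started
docker logs -f web-classification-serve
```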
After starting the service, you can use curl to test it:
```bash
curl http://localhost:8080/score --data '{"url":"https://play.google.com/store/apps/details?id=com.twitter.android"}' -X POST -H "Content-Type: application/json"
```
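To make the JSON response easier to read, you can pipe it through a formatter; the exact shape of the response depends on the outputs defined by the flow:

```bash
# pretty-print the response (requires python3 on the host machine)
curl -s http://localhost:8080/score \
  -X POST \
  -H "Content-Type: application/json" \
  --data '{"url":"https://play.google.com/store/apps/details?id=com.twitter.android"}' \
  | python3 -m json.tool
```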