refine sample: export (microsoft#126)
wangchao1230 authored Aug 21, 2023
1 parent 2fc5ff6 commit 08e1867
Showing 17 changed files with 568 additions and 175 deletions.
3 changes: 1 addition & 2 deletions docs/cloud/azureai/quick-start.md
@@ -7,7 +7,7 @@ This is an experimental feature, and may change at any time. Learn [more](https:
Assuming you have learned how to create and run a flow following [Quick start](../../how-to-guides/quick-start.md). This guide will walk you through the main process of how to submit a promptflow run to [Azure AI](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/overview-what-is-prompt-flow?view=azureml-api-2).

Benefits of use Azure AI comparison to just run locally:
-- **Designed for team collaboration**: Portal UI is a better fix for sharing & presentation your flow and runs. And workspace can better orgnize team shared resources like connections.
+- **Designed for team collaboration**: Portal UI is a better fix for sharing & presentation your flow and runs. And workspace can better organize team shared resources like connections.
- **Enterprise Readiness Solutions**: prompt flow leverages Azure AI's robust enterprise readiness solutions, providing a secure, scalable, and reliable foundation for the development, experimentation, and deployment of flows.

## Prerequisites
@@ -116,7 +116,6 @@ ml_client = MLClient(
resource_group_name="<RESOURCE_GROUP>",
workspace_name="<AML_WORKSPACE_NAME>",
)
-# configure global setting pointing to workpsace ml_client
pf = PFClient(ml_client)
```
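For context, the full snippet this hunk edits looks roughly like the sketch below; the imports, `DefaultAzureCredential`, and placeholder IDs are assumptions based on documented promptflow/azure-ai-ml usage, not part of this diff:

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
from promptflow.azure import PFClient

# Bind an MLClient to the target workspace (placeholder IDs are illustrative).
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<AML_WORKSPACE_NAME>",
)

# A PFClient built on the workspace MLClient submits promptflow runs to Azure AI.
pf = PFClient(ml_client)
```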

4 changes: 2 additions & 2 deletions docs/how-to-guides/manage-connections.md
@@ -98,7 +98,7 @@ print(result)
::::{tab-set}
:::{tab-item} CLI
:sync: CLI
-The commands below show how to update exisiting connections with new values:
+The commands below show how to update existing connections with new values:
```bash
# Update an azure open ai connection with a new api base
pf connection update -n my_azure_open_ai_connection --set api_base='new_value'
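# (Editorial aside, not part of this diff: the same --set pattern should also
# work for other updatable fields, e.g. rotating the api_key; value is hypothetical.)
pf connection update -n my_azure_open_ai_connection --set api_key='<new-api-key>'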
Expand All @@ -110,7 +110,7 @@ pf connection update -n my_custom_connection --set configs.other_config='new_val

:::{tab-item} SDK
:sync: SDK
-The code snippet below shows how to update exisiting connections with new values:
+The code snippet below shows how to update existing connections with new values:
```python
# Update an azure open ai connection with a new api base
connection = pf.connections.get(name="my_azure_open_ai_connection")
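# (Editorial aside, not part of this diff: a sketch of the steps that typically
# follow under the promptflow SDK pattern -- mutate the field, then persist it.)
connection.api_base = "new_value"
pf.connections.create_or_update(connection)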
4 changes: 2 additions & 2 deletions docs/reference/tools-reference/embedding_tool.md
@@ -13,7 +13,7 @@ Create OpenAI resources:

- **Azure OpenAI (AOAI)**

-Create Azure OpenAI resources with [insturction](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/create-resource?pivots=web-portal)
+Create Azure OpenAI resources with [instruction](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/create-resource?pivots=web-portal)

## **Connections**

@@ -22,7 +22,7 @@ Setup connections to provide resources in embedding tool.
| Type | Name | API KEY | API Type | API Version |
|-------------|----------|----------|----------|-------------|
| OpenAI | Required | Required | - | - |
-| AzureOpenAI | Required | Requried | Required | Required |
+| AzureOpenAI | Required | Required | Required | Required |
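As an editorial illustration (not part of this diff), an AzureOpenAI connection carrying the four required fields above could be created with the promptflow SDK roughly as follows; the import path and all values are assumptions:

```python
from promptflow import PFClient
from promptflow.entities import AzureOpenAIConnection

pf = PFClient()

# All four fields are required for the AzureOpenAI connection type, per the table above.
connection = AzureOpenAIConnection(
    name="my_azure_open_ai_connection",
    api_key="<api-key>",
    api_base="https://<your-resource>.openai.azure.com/",
    api_type="azure",
    api_version="2023-07-01-preview",  # illustrative version string
)
pf.connections.create_or_update(connection)
```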


## Inputs
2 changes: 1 addition & 1 deletion docs/reference/tools-reference/vector_db_lookup_tool.md
@@ -1,6 +1,6 @@
# Vector DB Lookup

-Vector DB Lookup is a vector search tool that allows users to search top k similiar vectors from vector database. This tool is a wrapper for multiple third-party vector databases. The list of current supported databases is as follows.
+Vector DB Lookup is a vector search tool that allows users to search top k similar vectors from vector database. This tool is a wrapper for multiple third-party vector databases. The list of current supported databases is as follows.

| Name | Description |
| --- | --- |
6 changes: 3 additions & 3 deletions examples/flows/standard/web-classification/README.md
@@ -67,7 +67,7 @@ pf run show-details --name $run_name

create `evaluation` run:
```bash
-# (Optional) save previous run name into variable, and create a new random run name for furthur use
+# (Optional) save previous run name into variable, and create a new random run name for further use
prev_run_name=$run_name
run_name="classification_accuracy_"$(openssl rand -hex 12)
# create run using command line args
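# (Editorial aside: the create command itself is truncated in this view; a sketch
# under assumed paths and flags -- the eval flow location and arguments are
# hypothetical, not taken from this diff:)
# pf run create --flow ../../evaluation/eval-classification-accuracy \
#   --data ./data.jsonl --run $prev_run_name --name $run_name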
@@ -93,7 +93,7 @@ az configure --defaults group=<your_resource_group_name> workspace=<your_workspa
pfazure run create --flow . --data ./data.jsonl --stream --runtime demo-mir --subscription <your_subscription_id> -g <your_resource_group_name> -w <your_workspace_name>
# pfazure run create --flow . --data ./data.jsonl --stream # serverless compute

-# (Optional) create a new random run name for furthur use
+# (Optional) create a new random run name for further use
run_name="web_classification_"$(openssl rand -hex 12)

# create run using yaml file, --name is optional
@@ -106,7 +106,7 @@ pfazure run show-details --name $run_name
pfazure run show-metrics --name $run_name


-# (Optional) save previous run name into variable, and create a new random run name for furthur use
+# (Optional) save previous run name into variable, and create a new random run name for further use
prev_run_name=$run_name
run_name="classification_accuracy_"$(openssl rand -hex 12)

2 changes: 1 addition & 1 deletion examples/requirements.txt
@@ -1,6 +1,6 @@
# remove when we publish to pypi
--extra-index-url https://azuremlsdktestpypi.azureedge.net/promptflow/
-promptflow[azure]==0.0.101903259
+promptflow[azure]==0.0.102010695
promptflow-tools==0.1.0.b3
python-dotenv
langchain
4 changes: 3 additions & 1 deletion examples/tutorials/flow-deploy/.gitignore
@@ -1,2 +1,4 @@
*.sqlite
-linux/flow
+linux/flow
+linux/connections
+linux/settings.json
3 changes: 0 additions & 3 deletions examples/tutorials/flow-deploy/deploy.md
@@ -33,9 +33,6 @@ The following CLI commands allows you export a flow as a sharable folder with a
pf flow export --source ../../flows/standard/basic-with-connection --output <your-output-dir> --format docker
```

-You'll be asked to input a migration secret when running this command, which needs to be provided when you run the built docker image.
-You can also provide the key via `--migration-secret` directly or passing it with a file via `--migration-secret-file`.

More details about how to use the exported docker can be seen in `<your-output-dir>/README.md`.
Part of sample output are under [./linux](./linux/) so you can also check [this README](./linux/README.md) directly.

9 changes: 4 additions & 5 deletions examples/tutorials/flow-deploy/linux/Dockerfile
@@ -11,15 +11,14 @@ ENV CONDA_DEFAULT_ENVIRONMENT=$CONDA_ENVIRONMENT_PATH
ENV PATH $CONDA_DEFAULT_ENVIRONMENT/bin:$PATH
RUN conda create -n promptflow-serve python=3.9.16 pip=23.0.1 -q -y && \
conda run -n promptflow-serve \
-    pip install -r /flow/requirements_txt && \
+    pip install -r /flow/requirements.txt && \
conda run -n promptflow-serve pip install keyrings.alt && \
conda run -n promptflow-serve pip cache purge && \
conda clean -a -y

+RUN conda run -n promptflow-serve sh /flow/setup.sh

EXPOSE 8080

-COPY ./connections.sqlite /
-COPY ./connections_setup.py /
+COPY ./connections/* /connections/
COPY ./start.sh /
-CMD ./start.sh
+CMD ["bash", "./start.sh"]
98 changes: 34 additions & 64 deletions examples/tutorials/flow-deploy/linux/README.md
@@ -5,11 +5,12 @@
Exported Dockerfile & its dependencies are located in the same folder. The structure is as below:
- flow: the folder contains all the flow files
- ...
+- connections: the folder contains yaml files to create all related connections
+- ...
- Dockerfile: the dockerfile to build the image
-- connections.sqlite: the sqlite database file to store the connections used in the flow
-- connections_setup.py: the python script to migrate the connections used in the flow
- start.sh: the script used in `CMD` of `Dockerfile` to start the service
-- docker-compose.yaml: a sample compose file to run the service
+- deploy.sh & deploy.ps1: the script to deploy the docker image to Azure App Service
+- settings.json: a json file to store the settings of the docker image
- README.md: the readme file to describe how to use the dockerfile

## Build Docker image
@@ -27,89 +28,58 @@ docker build . -t promptflow-serve
Run the docker image will start a service to serve the flow inside the container. Service will listen on port 8080.
You can map the port to any port on the host machine as you want.

-If the service involves connections, you need to migrate the connections before the first request to the service.
-Given api_key in connections are secrets, we have provided a way to migrate them with the `migration-secret` you have provided in export command.

-### Manage migration-secret with `docker-secret`
+If the service involves connections, all related connections will be exported as yaml files and recreated in containers.

-As a pre-requirement of connections migration, you need to create a docker secret named `MIGRATION_SECRET`
-to store the migration secret first. Sample command is like below:
+Secrets in connections won't be exported directly. Instead, we will export them as a reference to environment variables:

-```bash
-#### Init host machine as a swarm manager
-docker swarm init
-#### Create a secret to store the migration secret
-# You can create the docker secret directly from shell with bash
-(read -sp "Enter your migration secret: "; echo $REPLY) | docker secret create MIGRATION_SECRET -
-# or you can also create secret from a local file with migration secret as its content, like in Powershell
-docker secret create MIGRATION_SECRET <migration_secret_file>
+```yaml
+configs:
+  AZURE_OPENAI_API_BASE: xxx
+  CHAT_DEPLOYMENT_NAME: xxx
+module: promptflow.connections
+name: custom_connection
+secrets:
+  AZURE_OPENAI_API_KEY: ${env:<connection-name>_<secret-key>}
+type: custom
```
-You can check below documents for more details:
-- [Swam mode overview](https://docs.docker.com/engine/swarm/)
-- [Secrets management](https://docs.docker.com/engine/swarm/secrets/)
+You'll need to set up the environment variables in the container to make the connections work.
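As a concrete editorial illustration keyed to the sample yaml above (connection name `custom_connection`, secret key `AZURE_OPENAI_API_KEY`; the key value is hypothetical):

```bash
# env var names follow the <connection-name>_<secret-key> convention
docker run -p 8080:8080 \
  -e custom_connection_AZURE_OPENAI_API_KEY="<your-aoai-key>" \
  promptflow-serve
```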
-### Run with `docker-service` [WIP]
+### Deploy with Azure App Service
-To avoid manually migrate the connections, you can use `docker-secret` to manage the migration secret
-and `docker-service` to run the service:
+Azure App Service is an HTTP-based service for hosting web applications, REST APIs, and mobile back ends.
+Promptflow has provided a [script](./deploy.sh) to help deploy the docker image to Azure App Service.
+Example command to use bash script:
```bash
-#### Start the service
-docker service create --name promptflow-service -p 8080:8080 --secret MIGRATION_SECRET promptflow-serve
+bash deploy.sh -i <image_tag> -r "promptflow.azurecr.io" -g <resource_group>
```

-You can check below documents for more details:
-- [Run Docker Engine in swarm mode](https://docs.docker.com/engine/swarm/swarm-mode/)
+Example command to use powershell script:
+```powershell
+.\deploy.ps1 -i <image_tag> -r "promptflow.azurecr.io" -g <resource_group>
+```

+See the full parameters by `bash deploy.sh -h` or `.\deploy.ps1 -h`.

-### Run with `docker-compose`
+### Run with `docker run`

-You can also use `docker-secret` in your compose file and use compose file to start your service:
+You can run the docker image directly set via below commands:

```bash
-#### Deploy the service
-docker stack deploy --compose-file=./docker-compose.yaml service1
+docker run -p 8080:8080 -e <connection-name>_<secret-key>=<secret-value> promptflow-serve
```

-Note that you need to deploy the service to a swarm cluster to use `docker-secret`.
-So connections won't be migrated successfully if you run `docker-compose` directly.
-More details can be found in the official document:
-- [Deploy a stack to a swarm](https://docs.docker.com/engine/swarm/stack-deploy/)

-In the sample compose file `docker-compose.yaml` in the output directory, we claim secret `MIGRATION_SECRET`
-as external, which means you need to create the secret first before running the compose file.
+As explain in previously, secrets in connections will be passed to container via environment variables.
+You can set up multiple environment variables for multiple connection secrets:

-You can also specify the migration secret file and docker image in the compose file:

-```yaml
-services:
-  promptflow:
-    image: <your-image>
-    ...
-secrets:
-  MIGRATION_SECRET:
-    file: <your-migration-secret-file>
+```bash
+docker run -p 8080:8080 -e <connection-name-1>_<secret-key>=<secret-value-1> -e <connection-name-2>_<secret-key>=<secret-value-2> promptflow-serve
```

-Official document:
-- [Manage secrets in Docker Compose](https://docs.docker.com/compose/compose-file/compose-file-v3/#secrets)
-- [Using secrets in Compose](https://docs.docker.com/compose/use-secrets/)
## Test the endpoint
After start the service, you can use curl to test it:

```bash
curl http://localhost:8080/score --data '{"text":"Hello world!"}' -X POST -H "Content-Type: application/json"
```
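If the service is healthy, the response is the flow's outputs serialized as JSON; an illustrative shape (the actual keys depend on the exported flow's outputs):

```json
{"output": "<flow output for 'Hello world!'>"}
```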
-## Advanced: Run with `docker run`

-If you want to debug or have provided a wrong migration-secret, you can run the docker image directly and manually migrate the connections via below commands:

-```bash
-docker run -p 8080:8080 promptflow-serve
-#### Migrate connections
-docker exec -it <container_id> python connections_setup.py --file /connections.sqlite --migration-secret <migration_secret> --clean
-```

-Note that the command to migrate the connections must be run before any requests to the service.
52 changes: 0 additions & 52 deletions examples/tutorials/flow-deploy/linux/connections_setup.py

This file was deleted.

