
Flow with symlinks

Users sometimes need to reference common files or folders across flows; this sample demonstrates how to do that using symlinks. Symlinks come with the limitations listed below, so using additional includes is recommended instead. Learn more: flow-with-additional-includes

  1. On Windows, creating symlinks requires the Administrator role by default.
  2. On Windows, copying a folder that contains symlinks deep-copies the linked contents into the destination.
  3. Git must be configured to support symlinks.

Notes:

  • On Windows, you can grant your user permission to create symbolic links without the administrator role:
    1. Open your Local Security Policy
    2. Find Local Policies -> User Rights Assignment -> Create symbolic links
    3. Add your user name to this policy, then reboot the computer.
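After the reboot, a quick way to confirm the permission took effect is to try creating a link from Python. This is an illustrative check, not part of the sample:

```python
import os
import tempfile

def can_create_symlinks() -> bool:
    """Return True if the current user is allowed to create symbolic links."""
    with tempfile.TemporaryDirectory() as tmp:
        target = os.path.join(tmp, "target.txt")
        link = os.path.join(tmp, "link.txt")
        open(target, "w").close()
        try:
            os.symlink(target, link)
        except OSError:
            # raised on Windows when the privilege has not been granted
            return False
        return os.path.islink(link)

print(can_create_symlinks())
```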

Attention:

  • For git operations, you need to set: git config core.symlinks true
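Without this setting, git checks symlinks out as plain text files containing the target path. One way to avoid that is to enable the setting globally before cloning (the `<repo_url>` placeholder is illustrative):

```shell
# enable symlink support for all repositories before cloning
git config --global core.symlinks true
git clone <repo_url>
```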

Tools used in this flow

  • LLM Tool
  • Python Tool

What you will learn

In this flow, you will learn:

  • How to use symlinks in a flow

Prerequisites

Install the promptflow SDK and other dependencies:

pip install -r requirements.txt

Getting Started

1. Create symbolic links in the flow

python ./create_symlinks.py
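The exact logic lives in create_symlinks.py; conceptually it reduces to calls like the following sketch (the helper name and paths are illustrative, not the sample's actual code):

```python
import os

def create_symlink(source: str, link_path: str) -> None:
    """Create (or refresh) a symbolic link at link_path pointing to source."""
    if os.path.islink(link_path):
        os.remove(link_path)  # drop a stale link before re-creating it
    os.symlink(os.path.abspath(source), link_path)
```

On Windows, each `os.symlink` call is where the permission described in the notes above is required.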

2. Test & run the flow with symlinks

This flow references files in the web-classification flow and assumes you already have the required connection set up. You can execute this flow locally or submit it to the cloud.

Test flow with single line data

# test flow with default input value in flow.dag.yaml
pf flow test --flow .

# test flow with input
pf flow test --flow . --inputs url=https://www.youtube.com/watch?v=o5ZQyXaAv1g answer=Channel evidence=Url

# test node in the flow
pf flow test --flow . --node convert_to_dict --inputs classify_with_llm.output='{"category": "App", "evidence": "URL"}'

Run with multi-line data

# create run using command line args
pf run create --flow . --data ./data.jsonl --stream
# create run using yaml file
pf run create --file run.yml --stream

Submit run to cloud

# create run
pfazure run create --flow . --data ./data.jsonl --stream --runtime demo-mir --subscription <your_subscription_id> -g <your_resource_group_name> -w <your_workspace_name>
# pfazure run create --flow . --data ./data.jsonl --stream # automatic runtime

# set default workspace
az account set -s <your_subscription_id>
az configure --defaults group=<your_resource_group_name> workspace=<your_workspace_name>

pfazure run create --file run.yml --runtime demo-mir --stream
# pfazure run create --file run.yml --stream # automatic runtime