# 🦙🌲🤏 Alpaca-LoRA && Starwhale

This repo is forked from tloen/alpaca-lora; see the original README.md. We add Starwhale support so that users can manage the model and dataset lifecycle with Starwhale, including:

- fine-tune the model locally or remotely to produce a new model version
- serve an API locally or remotely
- evaluate the model with Starwhale datasets

## Build Starwhale Datasets

```bash
python build_swds.py
```
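The upstream alpaca-lora project trains on records in the Alpaca instruction-tuning format, which is presumably what `build_swds.py` packages into a Starwhale dataset. A minimal illustration of that record shape (the field names come from the upstream `alpaca_data.json`; the content below is made up):

```python
import json

# One record in the Alpaca instruction-tuning format
# (instruction / input / output; the values here are hypothetical).
record = {
    "instruction": "Translate the sentence to French.",
    "input": "Hello, world.",
    "output": "Bonjour, le monde.",
}

# Records round-trip cleanly through JSON, which is how the
# upstream dataset file stores them.
line = json.dumps(record)
parsed = json.loads(line)
```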

## Build Starwhale Model

```bash
python build_swmp.py
```

## Fine-tune the model with a dataset and get a new version

```bash
swcli model run -u llama-7b-hf/version/latest -d test/version/latest -h swmp_handlers:fine_tune
```
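The fine-tune handler above applies LoRA (Low-Rank Adaptation), the technique alpaca-lora is built on. A toy NumPy sketch of the core idea, not the actual training code: the pretrained weight stays frozen and only a small low-rank update is learned.

```python
import numpy as np

# LoRA idea (sketch): instead of updating the full weight matrix W,
# learn a low-rank update B @ A so that W_adapted = W + (alpha / r) * B @ A.
rng = np.random.default_rng(0)
d, r = 8, 2                              # hidden size and LoRA rank (toy values)
W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01   # trainable, small random init
B = np.zeros((d, r))                     # trainable, zero init
alpha = 4.0                              # scaling factor

W_adapted = W + (alpha / r) * (B @ A)
```

Because `B` starts at zero, the adapted weight initially equals the pretrained one, so fine-tuning begins from the base model's behavior; only the small `A`/`B` matrices (2 × d × r parameters instead of d × d) need to be trained and stored per version.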

## Serve an API for the fine-tuned version of the model

```bash
swcli model serve -u llama-7b-hf/version/latest
```