  - [OctoAI's Models](#octoais-models)
  - [Llama.cpp](#llamacpp)
- [Transcription Options](#transcription-options)
+  - [Whisper.cpp](#whispercpp)
  - [Deepgram](#deepgram)
  - [Assembly](#assembly)
-  - [Whisper.cpp](#whispercpp)
-- [Docker Compose](#docker-compose)
-- [Alternative JavaScript Runtimes](#alternative-javascript-runtimes)
+- [Prompt Options](#prompt-options)
+- [Alternative Runtimes](#alternative-runtimes)
+  - [Docker Compose](#docker-compose)
  - [Deno](#deno)
  - [Bun](#bun)
- [Makeshift Test Suite](#makeshift-test-suite)
@@ -98,6 +99,12 @@ npm run as -- \
  --prompt titles summary longChapters takeaways questions
```

+Run on a podcast RSS feed and generate a JSON info file with markdown metadata for each item:
+
+```bash
+npm run as -- --rss "https://ajcwebdev.substack.com/feed" --info
+```
+
## Language Model (LLM) Options

Create a `.env` file and set the API key as demonstrated in `.env.example` for either:
@@ -226,26 +233,6 @@ npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --llama

## Transcription Options

-Create a `.env` file and set API key as demonstrated in `.env.example` for `DEEPGRAM_API_KEY` or `ASSEMBLY_API_KEY`.
-
-### Deepgram
-
-```bash
-npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --deepgram
-```
-
-### Assembly
-
-```bash
-npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --assembly
-```
-
-Include speaker labels and number of speakers:
-
-```bash
-npm run as -- --video "https://ajc.pics/audio/fsjam-short.mp3" --assembly --speakerLabels
-```
-
### Whisper.cpp

If neither the `--deepgram` nor the `--assembly` option is included for transcription, `autoshow` will default to running the largest Whisper.cpp model. To configure the size of the Whisper model, use the `--model` option and select one of the following:

@@ -273,15 +260,27 @@ Run `whisper.cpp` in a Docker container with `--whisperDocker`:
npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisperDocker base
```

-## Docker Compose
+### Deepgram

-This will run both `whisper.cpp` and the AutoShow Commander CLI in their own Docker containers.
+Create a `.env` file and set the API key as demonstrated in `.env.example` for `DEEPGRAM_API_KEY`.

```bash
-docker-compose run autoshow --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisperDocker base
+npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --deepgram
```

-Currently working on the `llama.cpp` Docker integration so the entire project can be encapsulated in one local Docker Compose file.
+### Assembly
+
+Create a `.env` file and set the API key as demonstrated in `.env.example` for `ASSEMBLY_API_KEY`.
+
+```bash
+npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --assembly
+```
+
+Include speaker labels and number of speakers:
+
+```bash
+npm run as -- --video "https://ajc.pics/audio/fsjam-short.mp3" --assembly --speakerLabels
+```

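For reference, the `.env` entries that the Deepgram and Assembly sections above point to would look something like the sketch below. The values are placeholders, the variable names are the ones called out above, and the exact format lives in `.env.example`:

```bash
# .env (sketch): placeholder values; real keys come from the Deepgram and AssemblyAI dashboards
DEEPGRAM_API_KEY="your-deepgram-api-key"
ASSEMBLY_API_KEY="your-assemblyai-api-key"
```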
## Prompt Options

@@ -339,7 +338,17 @@ Include all prompt options:
npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --prompt titles summary longChapters takeaways questions
```

-## Alternative JavaScript Runtimes
+## Alternative Runtimes
+
+### Docker Compose
+
+This will run both `whisper.cpp` and the AutoShow Commander CLI in their own Docker containers.
+
+```bash
+docker-compose run autoshow --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisperDocker base
+```
+
+Work is currently underway on the `llama.cpp` Docker integration so that the entire project can be encapsulated in one local Docker Compose file.

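As a hedged note (the `docker-compose.yml` itself is not shown here): assuming the `autoshow` service referenced by the `docker-compose run` command above is built from a local Dockerfile rather than pulled as a prebuilt image, the images can be built ahead of the first run:

```bash
# Optional: pre-build the images defined in docker-compose.yml before the first run
docker-compose build
```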
### Bun
