
Commit 83a3363

Merge pull request #30 from ajcwebdev/dev
`--info` option, `whisper.cpp` repo fallback, and Server `curl` Examples
2 parents: 4825dfd + cba319a

19 files changed: +929 −358 lines

docker-compose.yml (+9 −12)

```diff
@@ -11,7 +11,10 @@ services:
       - /var/run/docker.sock:/var/run/docker.sock
     depends_on:
       - whisper
-      # - llama
+      - ollama
+    environment:
+      - OLLAMA_HOST=localhost
+      - OLLAMA_PORT=11434
   whisper:
     build:
       context: ./whisper.cpp
@@ -22,15 +25,9 @@ services:
     command: tail -f /dev/null
     tty: true
     stdin_open: true
-  # llama:
-  #   build:
-  #     context: ./llama.cpp
-  #     dockerfile: Dockerfile
-  #   volumes:
-  #     - ./content:/app/content
-  #   command: tail -f /dev/null
-  #   tty: true
-  #   stdin_open: true
+  ollama:
+    image: ollama/ollama
+    ports:
+      - "11434:11434"
 volumes:
-  whisper:
-  # llama:
+  whisper:
```
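The compose file now passes `OLLAMA_HOST` and `OLLAMA_PORT` into the autoshow container and maps the same port on the host. A minimal sketch of how a client could assemble the endpoint from those variables (the fallback defaults and the probe comment are illustrative assumptions, not code from this PR):

```shell
# Hypothetical sketch: derive the Ollama endpoint from the environment
# variables the compose file now sets, falling back to the same defaults.
OLLAMA_HOST="${OLLAMA_HOST:-localhost}"
OLLAMA_PORT="${OLLAMA_PORT:-11434}"
OLLAMA_URL="http://${OLLAMA_HOST}:${OLLAMA_PORT}"
echo "$OLLAMA_URL"
# With the ollama service running, a quick liveness probe would be:
#   curl -s "$OLLAMA_URL/api/tags"   # lists locally pulled models
```

Keeping the host and port in environment variables means the same client code works whether Ollama runs in the compose network or directly on the host.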

docs/examples.md (+37 −28)

````diff
@@ -17,11 +17,12 @@
   - [OctoAI's Models](#octoais-models)
   - [Llama.cpp](#llamacpp)
 - [Transcription Options](#transcription-options)
+  - [Whisper.cpp](#whispercpp)
   - [Deepgram](#deepgram)
   - [Assembly](#assembly)
-  - [Whisper.cpp](#whispercpp)
-- [Docker Compose](#docker-compose)
-- [Alternative JavaScript Runtimes](#alternative-javascript-runtimes)
+- [Prompt Options](#prompt-options)
+- [Alternative Runtimes](#alternative-runtimes)
+  - [Docker Compose](#docker-compose)
   - [Deno](#deno)
   - [Bun](#bun)
 - [Makeshift Test Suite](#makeshift-test-suite)
@@ -98,6 +99,12 @@ npm run as -- \
   --prompt titles summary longChapters takeaways questions
 ```
 
+Run on a podcast RSS feed and generate JSON info file with markdown metadata of each item:
+
+```bash
+npm run as -- --rss "https://ajcwebdev.substack.com/feed" --info
+```
+
 ## Language Model (LLM) Options
 
 Create a `.env` file and set API key as demonstrated in `.env.example` for either:
@@ -226,26 +233,6 @@ npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --llama
 
 ## Transcription Options
 
-Create a `.env` file and set API key as demonstrated in `.env.example` for `DEEPGRAM_API_KEY` or `ASSEMBLY_API_KEY`.
-
-### Deepgram
-
-```bash
-npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --deepgram
-```
-
-### Assembly
-
-```bash
-npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --assembly
-```
-
-Include speaker labels and number of speakers:
-
-```bash
-npm run as -- --video "https://ajc.pics/audio/fsjam-short.mp3" --assembly --speakerLabels
-```
-
 ### Whisper.cpp
 
 If neither the `--deepgram` or `--assembly` option is included for transcription, `autoshow` will default to running the largest Whisper.cpp model. To configure the size of the Whisper model, use the `--model` option and select one of the following:
@@ -273,15 +260,27 @@ Run `whisper.cpp` in a Docker container with `--whisperDocker`:
 npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisperDocker base
 ```
 
-## Docker Compose
+### Deepgram
 
-This will run both `whisper.cpp` and the AutoShow Commander CLI in their own Docker containers.
+Create a `.env` file and set API key as demonstrated in `.env.example` for `DEEPGRAM_API_KEY`.
 
 ```bash
-docker-compose run autoshow --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisperDocker base
+npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --deepgram
 ```
 
-Currently working on the `llama.cpp` Docker integration so the entire project can be encapsulated in one local Docker Compose file.
+### Assembly
+
+Create a `.env` file and set API key as demonstrated in `.env.example` for `ASSEMBLY_API_KEY`.
+
+```bash
+npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --assembly
+```
+
+Include speaker labels and number of speakers:
+
+```bash
+npm run as -- --video "https://ajc.pics/audio/fsjam-short.mp3" --assembly --speakerLabels
+```
 
 ## Prompt Options
 
@@ -339,7 +338,17 @@ Include all prompt options:
 npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --prompt titles summary longChapters takeaways questions
 ```
 
-## Alternative JavaScript Runtimes
+## Alternative Runtimes
+
+### Docker Compose
+
+This will run both `whisper.cpp` and the AutoShow Commander CLI in their own Docker containers.
+
+```bash
+docker-compose run autoshow --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisperDocker base
+```
+
+Currently working on the `llama.cpp` Docker integration so the entire project can be encapsulated in one local Docker Compose file.
 
 ### Bun
````