
Commit

examples : remove whisper (#860)
ggml-ci
ggerganov committed Jun 16, 2024
1 parent 169738d commit ac1e9ae
Showing 14 changed files with 1 addition and 10,388 deletions.
16 changes: 1 addition & 15 deletions README.md
@@ -24,7 +24,7 @@ Some of the development is currently happening in the [llama.cpp](https://github

- [X] Example of GPT-2 inference [examples/gpt-2](https://github.com/ggerganov/ggml/tree/master/examples/gpt-2)
- [X] Example of GPT-J inference [examples/gpt-j](https://github.com/ggerganov/ggml/tree/master/examples/gpt-j)
- - [X] Example of Whisper inference [examples/whisper](https://github.com/ggerganov/ggml/tree/master/examples/whisper)
+ - [X] Example of Whisper inference [ggerganov/whisper.cpp](https://github.com/ggerganov/whisper.cpp)
- [X] Example of LLaMA inference [ggerganov/llama.cpp](https://github.com/ggerganov/llama.cpp)
- [X] Example of LLaMA training [ggerganov/llama.cpp/examples/baby-llama](https://github.com/ggerganov/llama.cpp/tree/master/examples/baby-llama)
- [X] Example of Falcon inference [cmp-nct/ggllm.cpp](https://github.com/cmp-nct/ggllm.cpp)
@@ -44,20 +44,6 @@ Some of the development is currently happening in the [llama.cpp](https://github
- [X] Example of multiple LLMs inference [foldl/chatllm.cpp](https://github.com/foldl/chatllm.cpp)
- [X] SeamlessM4T inference *(in development)* https://github.com/facebookresearch/seamless_communication/tree/main/ggml

- ## Whisper inference (example)
-
- With ggml you can efficiently run [Whisper](examples/whisper) inference on the CPU.
-
- Memory requirements:
-
- | Model | Disk | Mem |
- | --- | --- | --- |
- | tiny | 75 MB | ~280 MB |
- | base | 142 MB | ~430 MB |
- | small | 466 MB | ~1.0 GB |
- | medium | 1.5 GB | ~2.6 GB |
- | large | 2.9 GB | ~4.7 GB |
-
## GPT inference (example)

With ggml you can efficiently run [GPT-2](examples/gpt-2) and [GPT-J](examples/gpt-j) inference on the CPU.
34 changes: 0 additions & 34 deletions ci/run.sh
@@ -218,39 +218,6 @@ function gg_sum_mnist {
gg_printf '```\n'
}

- # whisper
-
- function gg_run_whisper {
-     cd ${SRC}
-
-     gg_wget models-mnt/whisper/ https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-base.en.bin
-     gg_wget models-mnt/whisper/ https://github.com/ggerganov/whisper.cpp/raw/master/samples/jfk.wav
-
-     cd build-ci-release
-
-     set -e
-
-     path_models="../models-mnt/whisper/"
-     model_f16="${path_models}/ggml-base.en.bin"
-     audio_0="${path_models}/jfk.wav"
-
-     (time ./bin/whisper -m ${model_f16} -f ${audio_0} ) 2>&1 | tee -a $OUT/${ci}-main.log
-
-     grep -q "And so my fellow Americans" $OUT/${ci}-main.log
-
-     set +e
- }
-
- function gg_sum_whisper {
-     gg_printf '### %s\n\n' "${ci}"
-
-     gg_printf 'Runs short Whisper transcription\n'
-     gg_printf '- status: %s\n' "$(cat $OUT/${ci}.exit)"
-     gg_printf '```\n'
-     gg_printf '%s\n' "$(cat $OUT/${ci}-main.log)"
-     gg_printf '```\n'
- }
-
# sam

function gg_run_sam {
@@ -347,7 +314,6 @@ fi
if [ -z ${GG_BUILD_NO_DOWNLOAD} ]; then
test $ret -eq 0 && gg_run gpt_2
test $ret -eq 0 && gg_run mnist
- test $ret -eq 0 && gg_run whisper
test $ret -eq 0 && gg_run sam
test $ret -eq 0 && gg_run yolo
fi
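
For context, the removed `gg_run_whisper` / `gg_sum_whisper` pair follows the harness convention visible elsewhere in `ci/run.sh`: each test defines a `gg_run_<name>` function that performs the work and a `gg_sum_<name>` function that reports the recorded exit status, and the driver invokes them by name (`gg_run whisper`). The sketch below illustrates that dispatch pattern; the body of the `gg_run` driver is not part of this diff, so its implementation here is an assumption:

```shell
#!/bin/sh
# Hypothetical sketch of the gg_run / gg_sum_* harness pattern from ci/run.sh.
# The real gg_run driver is not shown in this diff; its body here is an assumption.
OUT=$(mktemp -d)

gg_run_demo() {
    # stand-in for a real test step such as the removed gg_run_whisper
    echo "demo ran" > "$OUT/demo-main.log"
}

gg_sum_demo() {
    # summary step: report the recorded exit status, as gg_sum_whisper did
    printf 'status: %s\n' "$(cat "$OUT/demo.exit")"
}

gg_run() {
    ci=$1
    "gg_run_$ci"               # run the named test step
    echo $? > "$OUT/$ci.exit"  # record its exit code for the summary
    "gg_sum_$ci"               # print the summary
}

gg_run demo
```

Removing the whisper entry from the driver list is therefore all that is needed to drop the test, once the two functions themselves are deleted.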
1 change: 0 additions & 1 deletion examples/CMakeLists.txt
@@ -20,7 +20,6 @@ target_include_directories(common-ggml PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})

add_subdirectory(gpt-2)
add_subdirectory(gpt-j)
- add_subdirectory(whisper)
add_subdirectory(mnist)
add_subdirectory(sam)
add_subdirectory(yolo)
23 changes: 0 additions & 23 deletions examples/whisper/CMakeLists.txt

This file was deleted.

29 changes: 0 additions & 29 deletions examples/whisper/README.md

This file was deleted.

