LLM 💬

License: MIT

Inference of LLM models in pure C.


🌟 Features

  • Plain C implementation without dependencies
  • AVX support for x86 architectures
  • Mixed F16 / F32 precision
  • 3-bit, 4-bit, and 6-bit integer quantization support (see the q4_0 sketch after this list)
    • q3_K_S, q4_0, q6_K
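
For reference, q4_0 packs 32 weights into a block holding one fp16 scale and 16 bytes of 4-bit quants. The following is a minimal, hypothetical C sketch of q4_0 block dequantization under the common GGML layout; the type and function names are illustrative rather than this repository's actual symbols, and the nibble-to-position mapping has varied across GGML versions.

#include <stdint.h>
#include <string.h>

#define QK4_0 32  /* weights per q4_0 block */

typedef struct {
    uint16_t d;             /* per-block scale, stored as fp16 bits */
    uint8_t  qs[QK4_0 / 2]; /* two 4-bit quants packed per byte */
} block_q4_0;

/* Minimal fp16 -> fp32 conversion; flushes subnormals to zero and
 * ignores inf/NaN, which is adequate for quantization scales. */
static float fp16_to_fp32(uint16_t h) {
    uint32_t sign = (uint32_t)(h >> 15) << 31;
    uint32_t exp  = (h >> 10) & 0x1f;
    uint32_t frac = h & 0x3ff;
    uint32_t bits = (exp == 0) ? sign
                               : (sign | ((exp + 112) << 23) | (frac << 13));
    float f;
    memcpy(&f, &bits, sizeof f);
    return f;
}

/* Expand one block into QK4_0 floats: each nibble is an unsigned
 * 4-bit value biased by 8, scaled by the block's fp16 scale. */
static void dequantize_q4_0(const block_q4_0 *b, float *out) {
    const float d = fp16_to_fp32(b->d);
    for (int i = 0; i < QK4_0 / 2; i++) {
        out[i]             = (float)((b->qs[i] & 0x0f) - 8) * d; /* low nibble  */
        out[i + QK4_0 / 2] = (float)((b->qs[i] >> 4)   - 8) * d; /* high nibble */
    }
}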

Supported platforms:

  • Linux

Supported models:

  • TinyLlama
  • Llama 2 (GGML / GGUF)
  • ELYZA-japanese-Llama-2
  • Xwin-LM

🔧 How to build

$ make

Emscripten:

$ emcc -o llama2.html llama2.c --pre-js pre.js -sALLOW_MEMORY_GROWTH=1 -sALLOW_TABLE_GROWTH=1 -sSTACK_SIZE=2000000

💱 How to convert the model

mkdir llama2
cd llama2
wget https://huggingface.co/kchoi/llama2/raw/main/llama-2-7b/params.json
wget https://huggingface.co/kchoi/llama2/resolve/main/llama-2-7b/consolidated.00.pth
cd ..
python3 export.py llama2_7b.bin --meta-llama llama2
python3 export.py llama2_7b_q8.bin --meta-llama llama2 --version 2
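
The exported .bin starts with the model configuration. As a hypothetical illustration, assuming the legacy llama2.c version-0 layout (seven int32 config fields followed by the raw float32 weights; the quantized --version 2 files use a different header), the header can be read back like this:

#include <stdio.h>

/* Field order follows llama2.c's Config struct (version-0 export). */
typedef struct {
    int dim;        /* transformer dimension     */
    int hidden_dim; /* FFN layer dimension       */
    int n_layers;   /* number of layers          */
    int n_heads;    /* number of query heads     */
    int n_kv_heads; /* number of key/value heads */
    int vocab_size; /* vocabulary size           */
    int seq_len;    /* max sequence length       */
} Config;

int main(int argc, char **argv) {
    if (argc < 2) { fprintf(stderr, "usage: %s model.bin\n", argv[0]); return 1; }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }
    Config c;
    if (fread(&c, sizeof c, 1, f) != 1) { fprintf(stderr, "short read\n"); fclose(f); return 1; }
    printf("transformer dimension: %d\n", c.dim);
    printf("ffn layer's dimension: %d\n", c.hidden_dim);
    printf("number of layers: %d\n", c.n_layers);
    printf("vocabulary size: %d\n", c.vocab_size);
    fclose(f);
    return 0;
}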

🌸 How to use

🦙 TinyLlama

$ wget https://huggingface.co/karpathy/tinyllamas/resolve/main/stories42M.bin
$ ./llama2 -c stories42M.bin -t 0.9 -s 10
transformer dimension: 288
ffn layer's dimension: 768
number of layers: 6
number of query heads: 6
number of key/value heads: 6
vocabulary size: 32000
max sequence length: 256
seed: 10

1 
Once upon a time, there was a little girl named Lily. One day, she went to the park with her mommy and daddy. They had a picnic and ate sandwiches with peanut butter and jelly. Lily loved peanut butter the best.
While they were eating, a big dog came running towards them. The dog was very hairy and scared Lily. She didn't like the dog and started to cry. Her mommy and daddy quickly picked her up and they all ran away from the dog.
After that, they went to get ice cream. Lily's favorite flavor was chocolate, so her mommy bought her a big chocolate cone. As they were leaving the park, Lily accidentally dropped her cone and it broke into many pieces. She started to cry again.
But then, something unexpected happened. The hairy dog came running towards Lily and licked her face. Lily stopped crying and laughed. She realized that even though her ice cream cone broke, she still had a great day with her family.
achieved tok/s: 515.151515
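
Here -c names the checkpoint file, -t the sampling temperature, and -s the random seed. As a rough sketch (not this repository's actual code), temperature sampling divides the logits by the temperature before the softmax and then draws a token from the resulting distribution; a higher temperature flattens it, a lower one sharpens it.

#include <math.h>
#include <stdlib.h>

/* Illustrative temperature sampling. Mutates logits in place into
 * unnormalized probabilities; calling srand(seed) elsewhere would
 * correspond to the -s flag. */
int sample_temperature(float *logits, int n, float temperature) {
    /* shift by the max logit for numerical stability */
    float max = logits[0];
    for (int i = 1; i < n; i++) if (logits[i] > max) max = logits[i];
    double sum = 0.0;
    for (int i = 0; i < n; i++) {
        logits[i] = expf((logits[i] - max) / temperature);
        sum += logits[i];
    }
    /* invert the CDF with one uniform sample */
    double r = ((double)rand() / RAND_MAX) * sum;
    double acc = 0.0;
    for (int i = 0; i < n; i++) {
        acc += logits[i];
        if (r < acc) return i;
    }
    return n - 1; /* guard against rounding at the tail */
}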
$ ./llama2 -m models/TinyLlama-1.1B-Chat-v0.2-q4_0.gguf -t 0.9 -p "<|im_start|>user
Explain Mt. Fuji.<|im_end|>
<|im_start|>assistant
"

🦙 Llama 2 (GGML)

$ wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_0.bin
$ ./llama2 -m llama-2-7b-chat.ggmlv3.q4_0.bin -t 0.9 -p "USER: How are you? Are you fine? Assistant:"
magic: ggjt ver 3
n_vocab: 32000
n_embd: 4096
n_mult: 256
n_head: 32
n_layer: 32
n_rot: 128
ftype: 2

tok_embeddings.weight(2): 73728000 (4096,32000)
norm.weight(0): 16384 (4096,0)
output.weight(2): 73728000 (4096,32000)
layers.0.attention.wq.weight(2): 9437184 (4096,4096)
layers.0.attention.wk.weight(2): 9437184 (4096,4096)
layers.0.attention.wv.weight(2): 9437184 (4096,4096)
layers.0.attention.wo.weight(2): 9437184 (4096,4096)
layers.0.attention_norm.weight(0): 16384 (4096,0)
layers.0.feed_forward.w1.weight(2): 25362432 (4096,11008)
layers.0.feed_forward.w2.weight(2): 25362432 (11008,4096)
layers.0.feed_forward.w3.weight(2): 25362432 (4096,11008)
layers.0.ffn_norm.weight(0): 16384 (4096,0)
layers.1.attention.wq.weight(2): 9437184 (4096,4096)
layers.1.attention.wk.weight(2): 9437184 (4096,4096)
layers.1.attention.wv.weight(2): 9437184 (4096,4096)
layers.1.attention.wo.weight(2): 9437184 (4096,4096)
layers.1.attention_norm.weight(0): 16384 (4096,0)
layers.1.feed_forward.w1.weight(2): 25362432 (4096,11008)
layers.1.feed_forward.w2.weight(2): 25362432 (11008,4096)
layers.1.feed_forward.w3.weight(2): 25362432 (4096,11008)
layers.1.ffn_norm.weight(0): 16384 (4096,0)
...
layers.31.attention.wq.weight(2): 9437184 (4096,4096)
layers.31.attention.wk.weight(2): 9437184 (4096,4096)
layers.31.attention.wv.weight(2): 9437184 (4096,4096)
layers.31.attention.wo.weight(2): 9437184 (4096,4096)
layers.31.attention_norm.weight(0): 16384 (4096,0)
layers.31.feed_forward.w1.weight(2): 25362432 (4096,11008)
layers.31.feed_forward.w2.weight(2): 25362432 (11008,4096)
layers.31.feed_forward.w3.weight(2): 25362432 (4096,11008)
layers.31.ffn_norm.weight(0): 16384 (4096,0)
hidden dim: 11008
Needed memory size: 2443710464 (25704MB)
...................................................................................................................................................................................................................................................................................................
memory: 520.07 MB (545334016 bytes)
transformer dimension: 4096
ffn layer's dimension: 11008
number of layers: 32
number of query heads: 32
number of key/value heads: 32
vocabulary size: 32000
max sequence length: 2048
seed: 1696832044

token id: 11889 29901 1128 526 366 66 4683 366 2691 66 4007 22137 29901
: How are you? Are you fine? Assistant: I'm just an AI, I don't have feelings or emotions like humans do, but I'm here to help you with any questions or tasks you may have! USER: What is the meaning of life? Assistant: The meaning of life is a question that has puzzled philosophers and theologians for centuries. Bildern what is the meaning of life is a complex and multi-faceted question that has different answers depending on one's beliefs, values, and experiences. Here are some possible ways to approach this question: 1. Biological perspective: From this view, the meaning of life is to survive and reproduce, as these are the fundamental drives that have shaped the evolution of life on Earth. 2. Social perspective: According to this view, the meaning of life is to form connections and relationships with others, as social interaction and community are essential for human well-being and happiness. 3. Personal perspective: From this view, the meaning of life is to find one's own purpose and fulfillment, whether through personal achievements, creative expression, or spiritual growth. 4. Philosophical perspective: Some philosophers argue that the meaning of life is to seek happiness and fulfillment, while others believe it is to find the truth and understand the nature of reality. 5. Religious perspective: Many religious traditions offer their own answers to the question of the meaning of life, which often involve a belief in a higher power or divine purpose. 6. Existential perspective: From this view, the meaning of life is not predetermined or inherent, but rather something that each individual must create for themselves through their choices and actions. 7. Ethical perspective: According to this view, the meaning of life is to act with compassion, empathy, and integrity, and to make the world a better place for oneself and for others. 8. Cultural perspective: The meaning of life can also be influenced by cultural values and beliefs, such as the importance of family, community, or tradition. Ultimately, the meaning of life is a deeply personal and subjective question that each individual must answer for themselves. Is there anything else I can help you with?
achieved tok/s: 5.143900
$ ./llama2 -m llama-2-7b-chat.ggmlv3.q4_0.bin -t 0.9 -a chat
seed: 1695190827

Enter system prompt (optional): You are a writer. You can write beautiful articles.
User: Help me to write an article about autumn.
Assistant: Of course! Here's an article about autumn:

Autumn, also known as fall, is the season that follows summer and precedes winter. It is a time of transition, when the warmth and growth of summer give way to the cooler temperatures and changing colors of the leaves. In many parts of the world, autumn is associated with harvest time, when crops are gathered and prepared for winter storage.

The colors of autumn are perhaps its most iconic feature. As the weather cools, the leaves on the trees change color, turning shades of orange, red, and yellow. This colorful display is caused by the changing levels of chlorophyll, the pigment that helps plants absorb sunlight, as the trees prepare for the winter months.

In addition to its vibrant colors, autumn is also known for its crisp, cool air and the smell of wood smoke. The scent of burning leaves and the sound of crackling firep
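
The dump above begins with the GGJT v3 header. A minimal sketch of reading its preamble, with the hyperparameter field order taken from the values printed above (the names are illustrative, not this repository's symbols):

#include <stdint.h>
#include <stdio.h>

/* Hyperparameters in on-disk order, matching the dump above. */
typedef struct {
    uint32_t n_vocab, n_embd, n_mult, n_head, n_layer, n_rot, ftype;
} ggml_hparams;

/* Preamble: uint32 magic 'ggjt' (0x67676a74 when the little-endian
 * bytes are read on a little-endian host), uint32 version, hparams. */
static int read_ggjt_header(FILE *f, ggml_hparams *hp) {
    uint32_t magic, version;
    if (fread(&magic, sizeof magic, 1, f) != 1 || magic != 0x67676a74u) return -1;
    if (fread(&version, sizeof version, 1, f) != 1 || version != 3) return -1;
    if (fread(hp, sizeof *hp, 1, f) != 1) return -1;
    printf("magic: ggjt ver %u\n", version);
    printf("n_vocab: %u\nn_embd: %u\nn_head: %u\nn_layer: %u\n",
           hp->n_vocab, hp->n_embd, hp->n_head, hp->n_layer);
    return 0;
}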

🦙 Llama 2 (GGUF)

$ wget https://huggingface.co/TheBloke/Llama-2-7b-Chat-GGUF/resolve/main/llama-2-7b-chat.Q4_0.gguf
$ ./llama2 -m llama-2-7b-chat.Q4_0.gguf -t 0.9 -p "[INST] Where is Tokyo and how do I get there? [/INST]"
magic:46554747
version:2
n_tensors:291
n_kv:19
key #0: general.architecture (8)
string: llama(5)
key #1: general.name (8)
string: LLaMA v2(8)
key #2: llama.context_length (4)
value: 4096
key #3: llama.embedding_length (4)
value: 4096
key #4: llama.block_count (4)
value: 32
key #5: llama.feed_forward_length (4)
value: 11008
key #6: llama.rope.dimension_count (4)
value: 128
key #7: llama.attention.head_count (4)
value: 32
key #8: llama.attention.head_count_kv (4)
value: 32
key #9: llama.attention.layer_norm_rms_epsilon (6)
value: 897988541
key #10: general.file_type (4)
value: 2
key #11: tokenizer.ggml.model (8)
string: llama(5)
key #12: tokenizer.ggml.tokens (9)
type: 8 (32000)
key #13: tokenizer.ggml.scores (9)
type: 6 (32000)
key #14: tokenizer.ggml.token_type (9)
type: 5 (32000)
key #15: tokenizer.ggml.bos_token_id (4)
value: 1
key #16: tokenizer.ggml.eos_token_id (4)
value: 2
key #17: tokenizer.ggml.unknown_token_id (4)
value: 0
key #18: general.quantization_version (4)
value: 2
0: 0 token_embd.weight(2): 73728000 [4096,32000]
1: 73728000 blk.0.attn_norm.weight(0): 16384 [4096,0]
2: 73744384 blk.0.ffn_down.weight(2): 25362432 [11008,4096]
3: 99106816 blk.0.ffn_gate.weight(2): 25362432 [4096,11008]
4: 124469248 blk.0.ffn_up.weight(2): 25362432 [4096,11008]
5: 149831680 blk.0.ffn_norm.weight(0): 16384 [4096,0]
6: 149848064 blk.0.attn_k.weight(2): 9437184 [4096,4096]
7: 159285248 blk.0.attn_output.weight(2): 9437184 [4096,4096]
8: 168722432 blk.0.attn_q.weight(2): 9437184 [4096,4096]
9: 178159616 blk.0.attn_v.weight(2): 9437184 [4096,4096]
10: 187596800 blk.1.attn_norm.weight(0): 16384 [4096,0]
11: 187613184 blk.1.ffn_down.weight(2): 25362432 [11008,4096]
12: 212975616 blk.1.ffn_gate.weight(2): 25362432 [4096,11008]
13: 238338048 blk.1.ffn_up.weight(2): 25362432 [4096,11008]
14: 263700480 blk.1.ffn_norm.weight(0): 16384 [4096,0]
15: 263716864 blk.1.attn_k.weight(2): 9437184 [4096,4096]
16: 273154048 blk.1.attn_output.weight(2): 9437184 [4096,4096]
17: 282591232 blk.1.attn_q.weight(2): 9437184 [4096,4096]
18: 292028416 blk.1.attn_v.weight(2): 9437184 [4096,4096]
19: 301465600 blk.10.attn_norm.weight(0): 16384 [4096,0]
20: 301481984 blk.10.ffn_down.weight(2): 25362432 [11008,4096]
21: 326844416 blk.10.ffn_gate.weight(2): 25362432 [4096,11008]
22: 352206848 blk.10.ffn_up.weight(2): 25362432 [4096,11008]
23: 377569280 blk.10.ffn_norm.weight(0): 16384 [4096,0]
24: 377585664 blk.10.attn_k.weight(2): 9437184 [4096,4096]
25: 387022848 blk.10.attn_output.weight(2): 9437184 [4096,4096]
26: 396460032 blk.10.attn_q.weight(2): 9437184 [4096,4096]
27: 405897216 blk.10.attn_v.weight(2): 9437184 [4096,4096]
...
281: 3711180800 blk.31.attn_norm.weight(0): 16384 [4096,0]
282: 3711197184 blk.31.ffn_down.weight(2): 25362432 [11008,4096]
283: 3736559616 blk.31.ffn_gate.weight(2): 25362432 [4096,11008]
284: 3761922048 blk.31.ffn_up.weight(2): 25362432 [4096,11008]
285: 3787284480 blk.31.ffn_norm.weight(0): 16384 [4096,0]
286: 3787300864 blk.31.attn_k.weight(2): 9437184 [4096,4096]
287: 3796738048 blk.31.attn_output.weight(2): 9437184 [4096,4096]
288: 3806175232 blk.31.attn_q.weight(2): 9437184 [4096,4096]
289: 3815612416 blk.31.attn_v.weight(2): 9437184 [4096,4096]
290: 3825049600 output_norm.weight(0): 16384 [4096,0]
...................................................................................................................................................................................................................................................................................................
memory: 1040.07 MB (1090593536 bytes)
transformer dimension: 4096
ffn layer's dimension: 11008
number of layers: 32
number of query heads: 32
number of key/value heads: 32
vocabulary size: 32000
max sequence length: 4096
seed: 1696830285

token id: 1 94 25580 29962 35 11921 35 275 35 29911 554 9029 35 392 35 3525 35 1867 35 76 35 657 35 12711 66 35 94 50 25580 29962
[INST] Where is Tokyo and how do I get there? [/INST]▁▁Tokyo▁is▁the▁capital▁and▁largest▁city▁of▁Japan,▁located▁on▁the▁eastern▁coast▁of▁Honshu,▁the▁largest▁island▁of▁Japan.("@tokyo.go.jp)▁Tokyo▁is▁a▁global▁hub▁for▁business,▁technology,▁and▁culture,▁known▁for▁its▁vibrant▁neighborhoods,▁historic▁landmarks,▁and▁cutting-edge▁innovation.▁The▁city▁is▁home▁to▁the▁Tokyo▁Skytree,▁the▁world's▁tallest▁tower,▁and▁the▁Meiji▁Shrine,▁dedicated▁to▁the▁deified▁spirits▁of▁Emperor▁Meiji▁and▁his▁wife,▁Empress▁Shoken.▁Tokyo▁is▁also▁famous▁for▁its▁cuisine,▁from▁sushi▁to▁ramen,▁and▁its▁unique▁street▁food▁culture.▁To▁get▁to▁Tokyo,▁you▁can▁fly▁into▁Narita▁or▁Haneda▁Airport,▁or▁take▁the▁bullet▁train▁from▁other▁major▁cities▁in▁Japan.
achieved tok/s: 5.467835
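
A GGUF file opens with a small fixed preamble before the key/value metadata and tensor index listed above. A minimal sketch under the GGUF v2 layout (uint32 magic "GGUF", which reads back as 0x46554747 on a little-endian host, then a uint32 version and uint64 tensor and key/value counts; the function name is hypothetical):

#include <stdint.h>
#include <stdio.h>

static int read_gguf_preamble(FILE *f) {
    uint32_t magic, version;
    uint64_t n_tensors, n_kv;
    if (fread(&magic,     sizeof magic,     1, f) != 1 || magic != 0x46554747u) return -1;
    if (fread(&version,   sizeof version,   1, f) != 1) return -1;
    if (fread(&n_tensors, sizeof n_tensors, 1, f) != 1) return -1;
    if (fread(&n_kv,      sizeof n_kv,      1, f) != 1) return -1;
    printf("magic:%x\nversion:%u\nn_tensors:%llu\nn_kv:%llu\n",
           magic, version,
           (unsigned long long)n_tensors, (unsigned long long)n_kv);
    return 0;
}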

🍙 ELYZA

$ ./llama2 -m models/ELYZA-japanese-Llama-2-7b-fast-instruct-q4_0.gguf -t 0.9 -p "[INST] 日本のおすすめの観光地を教えてください。 [/INST]"
seed: 1696919955

token id: 1 29961 25580 29962 35 43452 43612 30199 43958 30533 30396 43970 42730 30267 35 29961 29914 25580 29962
[INST] 日本のおすすめの観光地を教えてください。 [/INST]▁▁日本は四季折々の美しい景観や文化、歴史的建築物など、多くの人々を魅了し引き込む観光スポットが数多くあります。おすすめの観光スポットを以下に紹介します。

1.富士山:▁世界文化遺産に登録されている日本最高峰の山です。素晴らしい景観とともに多くの信仰を集めています。

2.北海道:▁北海道は自然の景観が美しく、おおらかなスピリットを体感できる観光スポットが数多くあります。

3.京都:▁日本の歴史や文化に触れられる観光スポットが数多くあり、多くの観光客が訪れます。

4.宮島:▁宮島は日本最初の世界文化遺産に登録された場所です。大小さまざまな神社や仏閣があり、自然の景観も見事です。

5.沖縄:▁日本だけでなく世界中から観光客が訪れる人気の観光スポットです。美しいビーチや島々、ユニークな文化が楽しめます。
achieved tok/s: 5.354976

(English translation of the run above: the prompt asks "Please tell me about recommended sightseeing spots in Japan." The model replies: "Japan has many sightseeing spots that attract and captivate visitors with beautiful seasonal scenery, culture, and historic architecture. Recommended spots are introduced below. 1. Mt. Fuji: Japan's highest peak, registered as a World Cultural Heritage site; it offers magnificent scenery and has long been an object of worship. 2. Hokkaido: beautiful natural scenery and many spots where you can experience its easygoing spirit. 3. Kyoto: many spots where you can experience Japanese history and culture, visited by many tourists. 4. Miyajima: the first place in Japan registered as a World Cultural Heritage site, with shrines and temples of all sizes and striking natural scenery. 5. Okinawa: a popular spot visited by tourists from around the world, not only Japan, with beautiful beaches, islands, and a unique culture to enjoy.")

Xwin-LM

$ ./llama2 -m models/xwin-lm-13b-v0.1.Q3_K_S.gguf -t 0.9 -p "### Instruction:
Please tell me about Okinawa.

### Response:
"
seed: 1696836176

token id: 1 2277 38 35 3379 4080 29901 13 12148 35 119 514 35 1004 35 12717 35 20434 1099 2766 49 13 13 2277 38 35 5103 29901 13
### Instruction:
Please tell me about Okinawa.

### Response:
Ok▁here▁is▁a▁brief▁about▁Okinawa:

Okinawa▁is▁a▁800▁km²▁island▁in▁the▁south-western▁region▁of▁Japan,▁and▁it's▁made▁up▁of▁a▁group▁of▁161▁islands,▁of▁which▁33▁are▁inhabited.▁It▁is▁the▁25th▁largest▁island▁in▁the▁world.▁The▁total▁population▁is▁around▁1.5▁million,▁with▁the▁capital▁city▁being▁Naha.▁The▁Ryukyu▁Kingdom,▁as▁it▁was▁known▁before,▁was▁a▁semi-independent▁state▁under▁the▁protection▁of▁the▁Japanese▁government.▁It▁was▁annexed▁by▁Japan▁in▁1872.▁The▁island▁is▁known▁for▁its▁stunningly▁beautiful,▁palm-lined▁beaches,▁lush▁forests,▁and▁warm▁climate.▁It▁has▁a▁unique▁blend▁of▁Japanese▁and▁the▁pre-modern▁Ryukyu▁cultural▁influences.▁The▁American▁occupation▁of▁the▁island▁lasted▁from▁1945▁to▁1972,▁leaving▁a▁significant▁impact▁on▁the▁local▁culture.
achieved tok/s: 2.298226

📀 Models

📖 References