
Releases: edgenai/llama_cpp-rs

llama_cpp_sys v0.2.2

08 Nov 16:56

Bug Fixes

  • do not rerun build on changed header files
    This restores the previous behavior, lost in the latest bindgen upgrade, which began triggering rebuilds whenever a header file changed.
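
For context, this is roughly what the knob looks like in a build script. This is a sketch, not the crate's actual build.rs: the header path and output handling are placeholders, and it assumes bindgen 0.69's `CargoCallbacks`, whose default of emitting `cargo:rerun-if-changed` for every included header is what this fix turns off.

```
// build.rs — sketch only; paths are placeholders.
fn main() {
    let bindings = bindgen::Builder::default()
        .header("thirdparty/llama.cpp/llama.h")
        // bindgen 0.69's CargoCallbacks emits `cargo:rerun-if-changed` for
        // every included header by default; disabling that restores the old
        // behavior of not rebuilding when headers change.
        .parse_callbacks(Box::new(
            bindgen::CargoCallbacks::new().rerun_on_header_files(false),
        ))
        .generate()
        .expect("failed to generate bindings");

    let out = std::path::PathBuf::from(std::env::var("OUT_DIR").unwrap());
    bindings
        .write_to_file(out.join("bindings.rs"))
        .expect("failed to write bindings");
}
```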

Commit Statistics

  • 2 commits contributed to the release.
  • 1 commit was understood as conventional.
  • 0 issues like '(#ID)' were seen in commit messages.

Commit Details

  • Uncategorized
    • Do not rerun build on changed header files (674f395)
    • Release llama_cpp_sys v0.2.1, llama_cpp v0.1.1 (a9e5813)

llama_cpp_sys v0.2.1

08 Nov 12:09

Chore

  • Update to bindgen 0.69.1

Bug Fixes

  • start_completing should not be invoked on a per-iteration basis
    There's still some undefined behavior that can be triggered by llama.cpp's threading model; that still needs patching up.

Commit Statistics

  • 2 commits contributed to the release.
  • 13 days passed between releases.
  • 2 commits were understood as conventional.
  • 0 issues like '(#ID)' were seen in commit messages.

Commit Details

  • Uncategorized
    • Update to bindgen 0.69.1 (ccb794d)
    • start_completing should not be invoked on a per-iteration basis (4eb0bc9)

llama_cpp v0.1.3

08 Nov 13:51

New Features

  • more async function variants
  • add LlamaSession.model

Other

  • typo

Commit Statistics

  • 5 commits contributed to the release.
  • 3 commits were understood as conventional.
  • 0 issues like '(#ID)' were seen in commit messages.

Commit Details

  • Uncategorized
    • Typo (0a0d5f3)
    • Release llama_cpp v0.1.2 (4d0b130)
    • More async function variants (1019402)
    • Add LlamaSession.model (c190df6)
    • Release llama_cpp_sys v0.2.1, llama_cpp v0.1.1 (a9e5813)

llama_cpp v0.1.2

08 Nov 13:30

New Features

  • more async function variants
  • add LlamaSession.model

Commit Statistics

  • 2 commits contributed to the release.
  • 2 commits were understood as conventional.
  • 0 issues like '(#ID)' were seen in commit messages.

Commit Details

  • Uncategorized
    • More async function variants (dcfccdf)
    • Add LlamaSession.model (56285a1)

llama_cpp v0.1.1

08 Nov 12:09

Chore

  • Remove debug binary from Cargo.toml

New Features

  • add LlamaModel::load_from_file_async

Bug Fixes

  • require that llama_context is accessed from behind a mutex
    This solves a race condition when several get_completions threads are spawned at the same time.
  • start_completing should not be invoked on a per-iteration basis
    There's still some undefined behavior that can be triggered by llama.cpp's threading model; that still needs patching up.

Commit Statistics

  • 5 commits contributed to the release.
  • 13 days passed between releases.
  • 4 commits were understood as conventional.
  • 0 issues like '(#ID)' were seen in commit messages.

Commit Details

  • Uncategorized
    • Add LlamaModel::load_from_file_async (3bada65)
    • Remove debug binary from Cargo.toml (3eddbab)
    • Require llama_context is accessed from behind a mutex (b676baa)
    • start_completing should not be invoked on a per-iteration basis (4eb0bc9)
    • Update to llama.cpp 0a7c980 (94d7385)

llama_cpp_sys v0.2.0

25 Oct 13:37

Chore

  • Release
  • latest fixes from upstream

Bug Fixes

  • set clang to use c++ stl
  • use SPDX license identifiers

Other

  • use link-cplusplus, enable build+test on all branches
    • ci: disable static linking of llama.o
    • ci: build+test on all branches/prs
    • ci: use link-cplusplus
  • configure for cargo-release

Commit Statistics

  • 10 commits contributed to the release over the course of 5 calendar days.
  • 6 commits were understood as conventional.
  • 3 unique issues were worked on: #1, #2, #3

Commit Details

  • #1
    • Use link-cplusplus, enable build+test on all branches (2d14d8d)
  • #2
    • Prepare for publishing to crates.io (f35e282)
  • #3
  • Uncategorized

llama_cpp v0.1.0

25 Oct 13:40

Chore

  • remove include from llama_cpp
  • Release
  • latest fixes from upstream
  • add CHANGELOG.md

Bug Fixes

  • use SPDX license identifiers

Other

  • configure for cargo-release

Commit Statistics

  • 8 commits contributed to the release over the course of 5 calendar days.
  • 6 commits were understood as conventional.
  • 1 unique issue was worked on: #3

Commit Details
