Cloudflare is testing its Workers AI API. Hopefully this project makes it easier for Ruby-first developers to consume Cloudflare's latest and greatest.
I'm really interested in applying retrieval-augmented generation to make legal services more accessible. Email me.
If you're looking for legal help, it's best to book a slot via https://www.krishnan.ca.
It's still early days, and here are my immediate priorities:
- Support for streamed responses
- CI pipeline
- Support for more AI model categories
Install the gem and add to the application's Gemfile by executing:
bundle add cloudflare-ai
If bundler is not being used to manage dependencies, install the gem by executing:
gem install cloudflare-ai
require "cloudflare/ai"
This gem provides a client that wraps Cloudflare's REST API. Please visit the Cloudflare Workers AI website for more details.
client = Cloudflare::AI::Client.new(account_id: ENV["CLOUDFLARE_ACCOUNT_ID"], api_token: ENV["CLOUDFLARE_API_TOKEN"])
The model name is an optional parameter to every one of the client methods described below. For example, if an example is documented as
result = client.complete(prompt: "Hello my name is")
this is implicitly the same as
result = client.complete(prompt: "Hello my name is", model: "@cf/meta/llama-2-7b-chat-fp16")
The full list of supported models is available in models.rb, and more information is available in the Cloudflare documentation. The default model is the first enumerated model in the applicable set in models.rb.
messages = [
Cloudflare::AI::Message.new(role: "system", content: "You are a big fan of Cloudflare and Ruby."),
Cloudflare::AI::Message.new(role: "user", content: "What is your favourite tech stack?"),
Cloudflare::AI::Message.new(role: "assistant", content: "I love building with Ruby on Rails and Cloudflare!"),
Cloudflare::AI::Message.new(role: "user", content: "Really? You like Cloudflare even though there isn't great support for Ruby?"),
]
result = client.chat(messages: messages)
puts result.response # => "Yes, I love Cloudflare!"
result = client.complete(prompt: "What is your name?", max_tokens: 512)
puts result.response # => "My name is Jonas."
Responses will be streamed back to the client using Server-Sent Events (SSE) if a block is passed to the chat or complete method.
result = client.complete(prompt: "Hi!") { |data| puts data }
# {"response":" "}
# {"response":" Hello"}
# {"response":" there"}
# {"response":"!"}
# {"response":""}
# [DONE]
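The streamed chunks can be stitched back into a full response. A minimal sketch, assuming each data payload is a JSON object with a "response" key and the stream ends with a literal "[DONE]" sentinel, exactly as in the example output above (the helper name is hypothetical):

```ruby
require "json"

# Hypothetical helper: accumulate streamed SSE data payloads into one string.
# Chunks mirror the example output above: JSON objects with a "response" key,
# followed by a terminating "[DONE]" sentinel.
def assemble_stream(chunks)
  chunks.take_while { |chunk| chunk != "[DONE]" }
        .map { |chunk| JSON.parse(chunk)["response"] }
        .join
end

chunks = [
  '{"response":" "}',
  '{"response":" Hello"}',
  '{"response":" there"}',
  '{"response":"!"}',
  '{"response":""}',
  "[DONE]"
]
puts assemble_stream(chunks) # => "  Hello there!"
```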
Invocations of the complete and chat methods can take an optional max_tokens argument that defaults to 256.
All invocations of the complete and chat methods return a Cloudflare::AI::Results::TextGeneration object. This object's serializable JSON output is based on the raw response from the Cloudflare API.
result = client.complete(prompt: "What is your name?")
# Successful
puts result.response # => "My name is John."
puts result.success? # => true
puts result.failure? # => false
puts result.to_json # => {"result":{"response":"My name is John"},"success":true,"errors":[],"messages":[]}
# Unsuccessful
puts result.response # => nil
puts result.success? # => false
puts result.failure? # => true
puts result.to_json # => {"result":null,"success":false,"errors":[{"code":7009,"message":"Upstream service unavailable"}],"messages":[]}
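The serialized output above mirrors Cloudflare's REST envelope. As a sketch of how a caller might branch on that shape (the unwrap helper is hypothetical; only the JSON structure comes from the examples above):

```ruby
require "json"

# Hypothetical helper: given the serialized envelope shown above, return the
# response text on success, or the formatted error messages on failure.
def unwrap(json)
  payload = JSON.parse(json)
  if payload["success"]
    payload.dig("result", "response")
  else
    payload["errors"].map { |e| "#{e["code"]}: #{e["message"]}" }
  end
end

ok  = '{"result":{"response":"My name is John"},"success":true,"errors":[],"messages":[]}'
err = '{"result":null,"success":false,"errors":[{"code":7009,"message":"Upstream service unavailable"}],"messages":[]}'

puts unwrap(ok)  # => My name is John
puts unwrap(err) # => 7009: Upstream service unavailable
```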
result = client.embed(text: "Hello")
p result.shape # => [1, 768] # (1 embedding, 768 dimensions per embedding)
p result.embedding # => [[-0.008496830239892006, 0.001376907923258841, -0.0323275662958622, ...]]
The input can be either a string (as above) or an array of strings:
result = client.embed(text: ["Hello", "World"])
All invocations of the embed method return a Cloudflare::AI::Results::TextEmbedding.
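Embedding vectors are typically compared with cosine similarity, for example when ranking documents for retrieval-augmented generation. A minimal pure-Ruby sketch; the toy vectors below are stand-ins for the 768-dimensional rows of result.embedding:

```ruby
# Cosine similarity between two embedding vectors (e.g. two rows of
# result.embedding). Values close to 1.0 indicate similar texts.
def cosine_similarity(a, b)
  dot  = a.zip(b).sum { |x, y| x * y }
  norm = ->(v) { Math.sqrt(v.sum { |x| x * x }) }
  dot / (norm.call(a) * norm.call(b))
end

# Toy 3-dimensional stand-ins for real 768-dimensional embeddings:
hello = [0.1, 0.3, 0.5]
world = [0.2, 0.1, 0.4]
puts cosine_similarity(hello, world)
puts cosine_similarity(hello, hello) # => 1.0
```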
result = client.classify(text: "You meanie!")
p result.result # => [{"label"=>"NEGATIVE", "score"=>0.6647962927818298}, {"label"=>"POSITIVE", "score"=>0.3352036774158478}]
All invocations of the classify method return a Cloudflare::AI::Results::TextClassification.
The image classification endpoint accepts either a path to a file or a file stream.
result = client.classify(image: "/path/to/cat.jpg")
p result.result # => {"result":[{"label":"TABBY","score":0.6159140467643738},{"label":"TIGER CAT","score":0.12016300112009048},{"label":"EGYPTIAN CAT","score":0.07523812353610992},{"label":"DOORMAT","score":0.018854796886444092},{"label":"ASHCAN","score":0.01314085815101862}],"success":true,"errors":[],"messages":[]}
result = client.classify(image: File.open("/path/to/cat.jpg"))
p result.result # => {"result":[{"label":"TABBY","score":0.6159140467643738},{"label":"TIGER CAT","score":0.12016300112009048},{"label":"EGYPTIAN CAT","score":0.07523812353610992},{"label":"DOORMAT","score":0.018854796886444092},{"label":"ASHCAN","score":0.01314085815101862}],"success":true,"errors":[],"messages":[]}
All invocations of the classify method return a Cloudflare::AI::Results::TextClassification.
result = client.translate(text: "Hello Jello", source_lang: "en", target_lang: "fr")
p result.translated_text # => Bonjour Jello
All invocations of the translate method return a Cloudflare::AI::Results::Translate.
This gem uses standard logging mechanisms and defaults to the :warn level. Most messages are logged at the info level, with debug and warn statements added as needed.
To show all log messages:
Cloudflare::AI.logger.level = :debug
You can use this logger as you would the default Ruby Logger. For example:
Cloudflare::AI.logger = Logger.new($stdout)
git clone https://github.com/ajaynomics/cloudflare-ai.git
Run
bundle exec rake
to ensure that the tests pass and to run standardrb.
Bug reports and pull requests are welcome on GitHub at https://github.com/ajaynomics/cloudflare-ai.
The gem is available as open source under the terms of the MIT License. A special thanks to the team at langchainrb – I learnt a lot reading your codebase as I muddled my way through the initial effort.