RubyFlow The Ruby and Rails community linklog



llm.rb v4.11.0 released

llm.rb v4.11.0 introduces streaming tool execution—tools can start while the model is still responding, overlapping latency with output. It adds MCP support over both stdio and HTTP (with connection pooling), OpenAI’s Responses API, and a complete concurrency model with threads, fibers, and async tasks.
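The core idea of streaming tool execution is that a tool call can start running as soon as its arguments finish streaming, rather than after the whole response arrives. A minimal plain-Ruby sketch of that overlap, using a background thread per tool call (this is a conceptual illustration, not llm.rb's actual API):

```ruby
require "json"

# Simulated model stream: text chunks interleaved with one complete
# tool call. In a real stream the tool-call arguments would arrive
# incrementally; here the chunk is already complete.
stream = [
  { type: :text, content: "Checking the weather" },
  { type: :tool_call, name: "weather", args: { "city" => "Tokyo" }.to_json },
  { type: :text, content: " and summarizing." }
]

# Hypothetical tool registry -- names are illustrative.
tools = {
  "weather" => ->(args) { "22C in #{args["city"]}" }
}

pending = []   # threads for tools already started
output  = +""

stream.each do |chunk|
  case chunk[:type]
  when :text
    output << chunk[:content]
  when :tool_call
    # Start the tool immediately instead of waiting for end-of-stream,
    # so tool latency overlaps with the rest of the model's output.
    args = JSON.parse(chunk[:args])
    pending << Thread.new { tools[chunk[:name]].call(args) }
  end
end

results = pending.map(&:value)  # join after the stream is drained
puts output                     # => Checking the weather and summarizing.
puts results.inspect            # => ["22C in Tokyo"]
```

The same overlap could be expressed with fibers or async tasks instead of threads; a thread per tool call is simply the most direct way to show latency hiding.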

The release includes a local model registry for cost tracking, JSON Schema unions, and production fixes across providers, making it ready for real systems where control and performance matter.
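A JSON Schema union expresses "one of several shapes" via `anyOf`. As a plain-Ruby hash (illustrative only, not llm.rb's schema DSL), a response schema that allows either a string answer or a structured error object might look like:

```ruby
require "json"

# Union type via JSON Schema "anyOf": the value is valid if it matches
# either alternative. Hash form shown for illustration; llm.rb may
# expose its own builder for this.
schema = {
  "anyOf" => [
    { "type" => "string" },
    {
      "type"       => "object",
      "properties" => { "error" => { "type" => "string" } },
      "required"   => ["error"]
    }
  ]
}

puts JSON.pretty_generate(schema)
```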
