llm_cost_tracker 0.7.2: Rails-native LLM cost tracking, now more production-ready
Released llm_cost_tracker v0.7.2
Main updates:
* Durable ActiveRecord ingestion with retries, quarantined bad rows, event IDs, and safer rollups.
* Refreshed official price snapshots, including Groq production text models.
* Better pricing support for cache tokens, reasoning tokens, provider-specific tiers, and pricing modes.
* Auto-capture coverage for OpenAI, Anthropic, RubyLLM, Gemini REST, OpenRouter, DeepSeek, Groq, and OpenAI-compatible proxies.
* Streaming tags are now snapshotted when a stream starts, so user/tenant/feature attribution is preserved.
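The streaming-tag snapshot behavior can be sketched in plain Ruby. Note that `TagContext` and `StreamTracker` below are hypothetical names for illustration, not llm_cost_tracker's actual API: the idea is simply to freeze a copy of the ambient tags the moment a stream starts, so later chunks keep the original user/tenant/feature attribution even if the surrounding context has changed.

```ruby
# Illustrative sketch only; class and method names are hypothetical,
# not llm_cost_tracker's real API.
class TagContext
  # Thread-local mutable tag store (user, tenant, feature, etc.)
  def self.current
    Thread.current[:llm_tags] ||= {}
  end

  # Merge tags for the duration of the block, then restore.
  def self.with(tags)
    previous = current.dup
    current.merge!(tags)
    yield
  ensure
    Thread.current[:llm_tags] = previous
  end
end

class StreamTracker
  attr_reader :tags

  # Snapshot the ambient tags at stream start and freeze them,
  # so later chunks can't be attributed to the wrong context.
  def initialize
    @tags = TagContext.current.dup.freeze
  end

  def record_chunk(tokens)
    { tokens: tokens, tags: @tags }
  end
end

tracker = nil
TagContext.with(user: "u1", tenant: "acme", feature: "chat") do
  tracker = StreamTracker.new # snapshot taken here
end
# The ambient tags are gone once the block exits, but the snapshot persists:
event = tracker.record_chunk(42)
# event[:tags] => {:user=>"u1", :tenant=>"acme", :feature=>"chat"}
```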
The core idea is unchanged: self-hosted cost tracking inside your Rails app, with no proxy and no prompt/response storage; only provider, model, tokens, cost, latency, and tags land in your own DB.
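For a concrete picture of that data shape, a table along these lines would hold it (a hypothetical migration for illustration; the gem's actual table and column names may differ):

```ruby
# Hypothetical schema sketch, not llm_cost_tracker's real migration.
class CreateLlmCostEvents < ActiveRecord::Migration[7.1]
  def change
    create_table :llm_cost_events do |t|
      t.string  :event_id, null: false, index: { unique: true } # idempotency key
      t.string  :provider, null: false   # e.g. "openai", "anthropic"
      t.string  :model,    null: false
      t.integer :input_tokens
      t.integer :output_tokens
      t.decimal :cost_usd, precision: 12, scale: 6
      t.integer :latency_ms
      t.json    :tags                    # user/tenant/feature attribution
      t.timestamps
    end
  end
end
```

The unique `event_id` index is what makes retried ingestion safe: replaying the same event upserts rather than double-counts.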
GitHub: https://github.com/sergey-homenko/llm_cost_tracker
RubyGems: https://rubygems.org/gems/llm_cost_tracker