v0.14.14
Summary
LlamaIndex version 0.14.14 is a maintenance release that fixes multiple bugs across core components and integrations, including error handling in vector store queries, compatibility with deprecated Python functions, and empty responses from language models. The release also adds a TokenBudgetHandler for cost governance and hardens security defaults in core components. Several integrations with external services (OpenAI, Google Gemini, Anthropic, Bedrock) were updated to support new models and fix compatibility issues.
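To illustrate the cost-governance idea behind a token-budget handler, here is a minimal, self-contained sketch. The class and exception names are assumptions for illustration only; this is not the LlamaIndex TokenBudgetHandler API.

```python
# Hypothetical sketch of a token-budget guard: accumulate token usage
# reported after each LLM call and fail fast once a budget is exhausted.
# All names here are illustrative, not the actual LlamaIndex API.

class TokenBudgetExceeded(RuntimeError):
    """Raised when cumulative token usage passes the configured budget."""

class TokenBudget:
    def __init__(self, max_tokens: int) -> None:
        self.max_tokens = max_tokens
        self.used = 0

    def record(self, tokens: int) -> None:
        # Add this call's usage; raise once the running total goes over budget.
        self.used += tokens
        if self.used > self.max_tokens:
            raise TokenBudgetExceeded(
                f"used {self.used} of {self.max_tokens} budgeted tokens"
            )

budget = TokenBudget(max_tokens=1000)
budget.record(600)      # within budget, no error
try:
    budget.record(500)  # total reaches 1100, over budget
except TokenBudgetExceeded as exc:
    print(f"stopped: {exc}")
```

A real handler would receive usage from the LLM's response metadata rather than hard-coded counts, but the enforcement logic is the same: track a running total and abort the workflow when it crosses the limit.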
Solution / Mitigation
Users should upgrade to version 0.14.14. The release notes explicitly call out "Fix potential crashes and improve security defaults in core components (#20610)" and include targeted bug fixes such as "fix(agent): handle empty LLM responses with retry logic" (#20596) and "Fix DeprecationWarning: 'asyncio.iscoroutinefunction' is deprecated" (#20517).
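For context on the deprecation fixed in #20517: newer Python releases deprecate `asyncio.iscoroutinefunction` in favor of the `inspect` module's equivalent, which gives the same result for `async def` functions without emitting a DeprecationWarning. A minimal sketch of the preferred check:

```python
# Use inspect.iscoroutinefunction instead of the deprecated
# asyncio.iscoroutinefunction to detect async-def callables.
import inspect

async def fetch_answer() -> str:
    return "42"

def plain_answer() -> str:
    return "42"

print(inspect.iscoroutinefunction(fetch_answer))  # True
print(inspect.iscoroutinefunction(plain_answer))  # False
```

Libraries that silence or error on warnings in test suites (as LlamaIndex's CI appears to, given this fix) need this substitution to stay compatible with upcoming Python versions.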
Classification
Affected Vendors
Related Issues
Original source: https://github.com/run-llama/llama_index/releases/tag/v0.14.14
First tracked: February 14, 2026 at 03:00 PM
Classified by LLM (prompt v3) · confidence: 75%