[Release] crewaiInc/crewAI 1.14.4a2
By mattatcha (GitHub)
crewAI 1.14.4a2 introduces custom persistence key support in the @persist decorator, adds Azure OpenAI Responses API integration, and includes multiple bug fixes for LLM message handling, MCP server tool returns, instructor provider configuration, and Azure credential scopes. The release enhances flow persistence flexibility, improves Azure provider compatibility, and strengthens error handling across core components.
Key Points
- Custom persistence keys now supported in the @persist decorator for flexible flow state management
- Azure OpenAI Responses API integration enables advanced response handling for Azure deployments
- Fixed LLM non-streaming handlers to use the validated messages variable, preventing potential message validation issues
- MCP server integration improved with graceful handling when native servers return no tools
- Instructor provider configuration now properly forwards base_url and api_key parameters
- Azure AI Inference client now receives credential_scopes for enhanced authentication flexibility
- New contributor @kunalk16 added Azure OpenAI Responses API support