Guti v1.6.0 Release Notes

Major Features

Enhanced LLM Support

  • Streaming Responses: Added streaming capability to LLM providers, allowing real-time token generation
    • New StreamingLLMResponse type for handling partial responses
    • Implemented streaming in OpenAI and Anthropic providers
    • Added context cancellation support for better resource management
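
A minimal usage sketch of the streaming flow, assuming guti's LLM types (LLMProvider, LLMMessage, LLMRequestConfig, StreamingLLMResponse) are in scope and that StreamingLLMResponse carries Text and Error fields (field names are assumptions, not confirmed by these notes):

import (
    "context"
    "fmt"
    "time"
)

// Sketch only: consumes the channel-based streaming API described above.
func streamCompletion(ctx context.Context, provider LLMProvider, messages []LLMMessage, config LLMRequestConfig) error {
    // Bound the stream with a deadline; cancelling the context stops
    // token generation and frees provider resources.
    ctx, cancel := context.WithTimeout(ctx, 30*time.Second)
    defer cancel()

    stream, err := provider.GetStreamingResponse(ctx, messages, config)
    if err != nil {
        return err
    }

    // Partial responses arrive until the provider closes the channel.
    for chunk := range stream {
        if chunk.Error != nil {
            return chunk.Error
        }
        fmt.Print(chunk.Text)
    }
    return nil
}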

Anthropic Integration Improvements

  • Enhanced Client Architecture: Completely refactored the Anthropic client implementation
    • New AnthropicClient interface for better testability
    • Added RealAnthropicClient wrapper around the official Anthropic SDK
    • Improved error handling and response processing
  • Streaming Support: Added comprehensive streaming support for Anthropic models
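
The interface-plus-wrapper split is what makes the client testable. The sketch below shows the pattern only, with a hypothetical method name, since these notes do not list the actual AnthropicClient method set:

// Hypothetical method name (CreateMessage) for illustration; the real
// AnthropicClient interface in guti may differ. The provider depends on
// this narrow interface, RealAnthropicClient forwards to the official SDK,
// and unit tests substitute a stub like the one below.
type anthropicClientSketch interface {
    CreateMessage(ctx context.Context, prompt string) (string, error)
}

type stubAnthropicClient struct{ reply string }

func (s stubAnthropicClient) CreateMessage(ctx context.Context, prompt string) (string, error) {
    return s.reply, nil // canned response, no network call
}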

API Improvements

  • Better Encapsulation: Moved the LLM provider dependency into the request constructor
    • Simplified Generate() method signature
    • Improved state management and dependency injection
  • Reduced Code Duplication: Refactored OpenAI provider implementation
    • Extracted common message conversion logic
    • Centralized parameter creation

API Changes

New Types and Interfaces

  • Added StreamingLLMResponse for handling streaming responses
  • Introduced AnthropicClient interface for better abstraction
  • Extended LLMProvider interface with streaming capabilities
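
For orientation, a plausible shape of the streaming type is sketched below; the field names are assumptions, so consult the package godoc for the authoritative definition:

// Assumed field names, for illustration only.
type StreamingLLMResponse struct {
    Text  string // partial text for this chunk
    Done  bool   // true on the final chunk
    Error error  // non-nil if the stream failed
}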

Modified Interfaces

type LLMProvider interface {
    GetResponse(messages []LLMMessage, config LLMRequestConfig) (LLMResponse, error)
    GetStreamingResponse(ctx context.Context, messages []LLMMessage, config LLMRequestConfig) (<-chan StreamingLLMResponse, error)
}

Constructor Changes

// Old
request := NewLLMRequest(config)

// New
request := NewLLMRequest(config, provider)
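
With the provider injected at construction, each call needs only the messages. The Generate signature shown here is inferred from the "Simplified Generate() method signature" item above and may not match the exact API:

// Provider is injected once, up front ...
request := NewLLMRequest(config, provider)

// ... so subsequent calls pass only the conversation (signature assumed).
response, err := request.Generate(messages)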

Documentation

  • Added comprehensive godoc comments with usage examples
  • Updated README.md with new features and examples
  • Improved code documentation across all providers

Testing

  • Added extensive test coverage for streaming functionality
  • Implemented comprehensive tests for Anthropic provider
  • Added mock implementations for better unit testing
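
As an illustration of the mock-based approach (guti's own test doubles may look different), a minimal fake that satisfies the LLMProvider interface shown above:

// mockProvider implements LLMProvider without any network calls.
type mockProvider struct {
    response LLMResponse
}

func (m mockProvider) GetResponse(messages []LLMMessage, config LLMRequestConfig) (LLMResponse, error) {
    return m.response, nil
}

func (m mockProvider) GetStreamingResponse(ctx context.Context, messages []LLMMessage, config LLMRequestConfig) (<-chan StreamingLLMResponse, error) {
    ch := make(chan StreamingLLMResponse, 1)
    ch <- StreamingLLMResponse{Text: "mocked", Done: true} // field names assumed, see sketch above
    close(ch)
    return ch, nil
}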

Bug Fixes

  • Fixed code duplication in OpenAI provider
  • Improved error handling in streaming responses
  • Enhanced context cancellation handling

Compatibility Notes

  • All existing functionality remains backward compatible
  • New streaming features require Go 1.16 or later

Contributors

  • @shaharia-azam

License

This project is licensed under the MIT License - see the LICENSE file for details.