Releases: atomiechen/HandyLLM

HandyLLM v0.9.2

09 Aug 01:30
d1a0415

Release HandyLLM v0.9.2.

Fixed

  • fix DictProxy YAML dumping
  • fix tool calls in stream mode: the last tool call was missing
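The stream-mode fix follows a common pattern: tool-call deltas are buffered per index while streaming, and the final buffered call must be flushed once the stream ends. A minimal self-contained sketch of that pattern (the chunk shape here is illustrative, not HandyLLM's actual types):

```python
def collect_tool_calls(chunks):
    """Merge streamed tool-call deltas; flush the last buffered call at stream end."""
    calls = []
    current = None
    for delta in chunks:
        if current is None or delta["index"] != current["index"]:
            if current is not None:
                calls.append(current)  # a new index starts: emit the previous call
            current = {"index": delta["index"], "name": delta.get("name", ""), "arguments": ""}
        current["arguments"] += delta.get("arguments", "")
    if current is not None:
        calls.append(current)  # the fix: without this, the last tool call is dropped
    return calls
```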

Changed

  • change import behavior:
    • __init__.py:
      • use explicit imports instead of *
      • limit imports from utils.py
      • remove all imports from response.py and types.py (please import them directly)
    • response.py: add missing items to __all__
  • add a py.typed marker for better IDE type-checking support

HandyLLM v0.9.1

28 Jul 09:50
0c7a246

Release HandyLLM v0.9.1.

Added

  • 🔧 add lint dependency ruff
    • add scripts/lint.sh for linting
    • add scripts/format.sh for formatting, and format all files
    • add CI test workflow

Fixed

  • fix errors reported by ruff (E711, E722, F541)
  • fix TypeError on Python 3.8 ('type' object is not subscriptable)
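The Python 3.8 fix is the classic built-in-generics pitfall: subscripting built-in types like `list[str]` raises `TypeError: 'type' object is not subscriptable` before Python 3.9, while the `typing` aliases work on all supported versions. A sketch of the portable form (the function itself is illustrative):

```python
from typing import Dict, List

# On Python 3.8, evaluating e.g. dict[str, str] at annotation time raises
# TypeError: 'type' object is not subscriptable.
# The typing aliases below are subscriptable on every supported version.
def merge_headers(base: Dict[str, str], extra: List[Dict[str, str]]) -> Dict[str, str]:
    merged = dict(base)
    for item in extra:
        merged.update(item)
    return merged
```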

HandyLLM v0.9.0

27 Jul 17:51
6615411

Release HandyLLM v0.9.0.

Added

Check the 🌟 marks for notable new features!

  • hprompt.py:
    • 🌟 image_url in chat hprompt files now supports local paths (file://), both absolute and relative
    • 🌟 add fetch(), afetch(), stream() and astream() methods for direct and typed API responses
    • add RunConfig.var_map_file_format for specifying the variable map file format (JSON / YAML); load_var_map() supports a format param
  • requestor.py:
    • 🌟 add fetch(), afetch(), stream() and astream() methods for typed API responses
    • use generics and add DictRequestor, BinRequestor, ChatRequestor and CompletionsRequestor
  • OpenAIClient:
    • 🌟 constructor supports endpoint_manager, endpoints and load_path params; supports loading from a YAML file or a Mapping obj
    • APIs support endpoints param
    • APIs endpoint param supports Mapping type
  • 🌟 added cache_manager.py: CacheManager for general-purpose caching to text files
    • add load_method and dump_method params
    • infers the format from the file suffix when no convert handler is provided
  • 🌟 added response.py: DictProxy for quick access to well-defined response dict
  • EndpointManager: supports loading from a YAML file using the endpoints key, or from an Iterable obj
  • __init__.py imports everything from hprompt for convenience
  • rename _types.py to types.py and expose all definitions
  • prompt_converter.py:
    • add generator sink consume_stream2fd()
  • utils.py:
    • add generator filter trans_stream_chat(), generator sink echo_consumer()
  • greatly improved type hints
  • added tests:
    • load prompt type specification
    • variable map substitution
    • ChatPrompt and CompletionsPrompt's API calls, support for RunConfig.on_chunk, and addition operations
    • chat hprompt image_url
    • OpenAIClient loading, chat fetch() & stream()
    • endpoint_manager.py
    • cache_manager.py
    • audio speech
    • legacy OpenAIAPI
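As an illustration of the CacheManager idea above — class name, method names, and exact behavior here are assumptions for the sketch, not HandyLLM's actual API — a minimal version that infers dump/load handlers from the file suffix unless explicit load_method/dump_method params are given:

```python
import json
from pathlib import Path


class FileCache:
    """Hypothetical sketch of suffix-inferred, general-purpose caching to text files."""

    def __init__(self, base_dir):
        self.base = Path(base_dir)
        self.base.mkdir(parents=True, exist_ok=True)

    @staticmethod
    def _handlers(path):
        # Infer the format from the file suffix when no convert handler is provided.
        if path.suffix == ".json":
            return json.dumps, json.loads
        return str, str  # plain-text fallback

    def dump(self, name, obj, dump_method=None):
        path = self.base / name
        dump = dump_method or self._handlers(path)[0]
        path.write_text(dump(obj))

    def load(self, name, load_method=None):
        path = self.base / name
        load = load_method or self._handlers(path)[1]
        return load(path.read_text())
```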

Changed

  • hprompt.py:
    • add 'endpoints' to default record blacklist
    • remove the var_map related configurations from the evaluated prompt, as they are already applied
  • EndpointManager:
    • raises ValueError when getting an endpoint from an empty manager
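The empty-manager behavior can be sketched as follows — a hypothetical round-robin manager for illustration, not the actual implementation:

```python
from itertools import cycle


class RoundRobinEndpoints:
    """Hypothetical sketch: rotate endpoints; fail loudly when none are configured."""

    def __init__(self, endpoints=()):
        self.endpoints = list(endpoints)
        self._iter = cycle(self.endpoints) if self.endpoints else None

    def get_endpoint(self):
        if self._iter is None:
            raise ValueError("no endpoints configured")
        return next(self._iter)
```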

Fixed

  • hprompt.py: fix 'type' object is not subscriptable on Python 3.8

Removed

  • prompt_converter.py: remove stream_msgs2raw() and astream_msgs2raw() as they are no longer needed

HandyLLM v0.8.2

29 Jun 16:15
92f77a8

Release HandyLLM v0.8.2.

Added

  • hprompt: load methods now support a cls parameter for prompt type specification
  • ChatPrompt and CompletionsPrompt support optional request and meta
  • ChatPrompt:
    • supports adding a dict
    • add an add_message(...) method
  • CompletionsPrompt:
    • add add_text(...) method
  • PromptConverter: yaml.dump uses allow_unicode=True option
  • move all type definitions to _types.py
  • support for package development:
    • add requirement.txt for development
    • add scripts/test.sh for running tests
    • add test scripts in tests folder
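The ChatPrompt additions above can be sketched with a toy class — the class below and its exact semantics are assumptions for illustration, not the real ChatPrompt:

```python
class SimpleChatPrompt:
    """Hypothetical sketch of dict addition and add_message(), not the real class."""

    def __init__(self, messages=None):
        self.messages = list(messages or [])

    def add_message(self, role, content):
        self.messages.append({"role": role, "content": content})
        return self

    def __add__(self, other):
        # "supports adding a dict": prompt + {...} yields a new prompt
        if isinstance(other, dict):
            return SimpleChatPrompt(self.messages + [other])
        if isinstance(other, SimpleChatPrompt):
            return SimpleChatPrompt(self.messages + other.messages)
        return NotImplemented
```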

Fixed

  • HandyPrompt.eval(...) should not make directories for output paths
  • CompletionsPrompt._run_with_client(...): misplaced run_config param
  • PromptConverter
    • fix variable replacement for content_array message
    • fix wrong return type of stream_msgs2raw and astream_msgs2raw
  • requestor:
    • httpx.Response should use reason_phrase to get the error reason
    • acall(): fix missing brackets for await
    • _call_raw() and _acall_raw(): intercept and raise a new exception without chaining the original one
    • _acall_raw(): read the response first to prevent httpx.ResponseNotRead when getting the error message
  • _utils.exception2err_msg(...) should append the error message instead of printing it
  • change io.IOBase to IO[str] for file descriptors (e.g. RunConfig.output_fd)
  • fix other type hints
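The missing-brackets fix is the usual async slip: awaiting the function object instead of the coroutine it returns. A self-contained illustration (function names here are made up for the example):

```python
import asyncio


async def fetch_reason():
    return "Not Found"


async def handler():
    # Buggy form: `await fetch_reason` raises
    # TypeError: object function can't be used in 'await' expression.
    # Fixed form calls the function first, then awaits the resulting coroutine:
    return await fetch_reason()
```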

Changed

  • move all old files in tests folder to examples folder

HandyLLM v0.8.1

10 Jun 03:46
ec79112

Release HandyLLM v0.8.1.

Fixed

  • fix the debug print issue when outputting to a file in stream mode

HandyLLM v0.8.0

09 Jun 15:00
8b7f43a

Release HandyLLM v0.8.0.

Added

  • CLI: output to stderr without buffering
  • add RunConfig.output_path_buffering for controlling buffering of output file
  • add this changelog

Fixed

  • fix _post_check_output(...) not using evaluated run_config (may cause output_path or output_fd to be ignored)

Changed

  • rename internal constants: remove the leading _ from API_xxx constants

Removed

  • remove unused files in deprecated folder

HandyLLM v0.7.6

24 May 16:03
fa798b4

Release HandyLLM v0.7.6.

Added

  • add RunConfig.on_chunk as callback for streamed chunks
  • add Azure tts example
  • add VM method to transform kwargs into a %-wrapped variable map dict
  • add var_map arg to eval(...), run(...) and arun(...) for convenience
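The two additions above can be sketched together — both names come from this changelog, but the signatures and the %-wrapping format are assumptions for the sketch:

```python
def vm(**kwargs):
    """Wrap keyword args into a %-delimited variable map (hypothetical format)."""
    return {f"%{key}%": value for key, value in kwargs.items()}


def run_stream(chunks, on_chunk=None):
    """Invoke an on_chunk-style callback for every streamed chunk, then join them."""
    parts = []
    for chunk in chunks:
        if on_chunk is not None:
            on_chunk(chunk)
        parts.append(chunk)
    return "".join(parts)
```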

Changed

  • merge different versions of var_map (from a method argument or from another RunConfig) instead of replacing it as a whole
  • rename RunConfig.to_dict's retain_fd arg to retain_object

HandyLLM v0.7.5

23 May 10:29
1e32509

Release HandyLLM v0.7.5.

Added

  • OpenAIClient: add audio speech (TTS) API support
    • add Azure support for audio speech and transcriptions
  • add tts test script

Changed

  • prioritize RunConfig.output_evaled_prompt_fd over RunConfig.output_evaled_prompt_path
  • eval(...):
    • always returns a new object
    • gives the run_config arg a default value
    • accepts kwargs, same as run(...)
  • when dumping, always filter request
  • credential files do not overwrite existing request args
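The credential rule can be sketched as a gap-filling merge — a hypothetical helper, showing only the precedence: credential values fill gaps, and existing request args always win:

```python
def apply_credentials(request_args, credentials):
    """Fill in credential values only where the request doesn't already set them."""
    merged = dict(credentials)
    merged.update(request_args)  # existing request args take precedence
    return merged
```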

Fixed

  • non-stream mode prioritizes RunConfig.output_fd over RunConfig.output_path

HandyLLM v0.7.4

22 May 10:14
4aa2d1b

Release HandyLLM v0.7.4.

HandyLLM v0.7.3

22 May 07:29
eb4f69c

Release HandyLLM v0.7.3.