Releases: epfl-dlab/aiflows

v1.1.1

30 Apr 08:13

🚀 Release Notes - v1.1.1

Fixes

  • Resolved issue with message ID passing in FlowMessages.

Compatibility

  • This version is fully compatible with v1.1.0 of the library.

Note on FunSearch

  • To use FunSearch from the FlowVerse, ensure that you're using aiflows version 1.1.1 or higher.
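
To check which version is installed, here is a minimal sketch (assuming aiflows is already installed; the pip command in the comment is only a suggestion):

from importlib.metadata import version

# Print the installed aiflows version; it should be 1.1.1 or higher to use FunSearch from the FlowVerse.
# If it is older, upgrade with: pip install --upgrade aiflows
print(version("aiflows"))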

v1.1.0

12 Apr 12:47

🚀 Release Notes - v1.1.0 🎉

We’re excited to announce the release of version 1.1.0! The headline of this release is the new Flows engine, which enables concurrent execution and peer-to-peer distributed collaboration and changes how you build and interact with your Flows.

What's New? 🌟

  • Flows Engine: The new Flows engine, powered by Colink, enables concurrent execution and peer-to-peer distributed collaboration.

  • Redesigned Developer Experience: We've revamped the developer experience to make building Flows with these new features as simple as possible.

Key Features of Flows Engine: 🛠️

  • Serve Flows: Serve Flows to yourself and to other users, enabling collaborative workflows across distributed teams.

  • Get and Call Instances of a Served Flow: Obtain an instance of a served Flow through a proxy and interact with it directly (see the end-to-end example below).

⚠️ Note: Backward Compatibility

Please be aware that due to the introduction of the Flows engine, this version is not backward compatible. Make sure to review and update your existing implementations accordingly before upgrading.

Resources for Further Exploration: 📚

For more detailed information on how to leverage the Flows engine and Colink, we encourage you to explore:

  • Tutorials Folder: Dive into the tutorials to get familiar with aiflows, its features, and how to use them in your projects.

  • Examples Folder: Explore our examples to see real-world use cases of the Flows engine and gain inspiration for your own projects.

Example: Pulling ChatAtomicFlow from the FlowVerse, serving it, and calling it.

import os
import aiflows
from aiflows.backends.api_info import ApiInfo
from aiflows.utils import colink_utils, serving
from aiflows import workers
from aiflows.utils.general_helpers import read_yaml_file, quick_load_api_keys

dependencies = [
    {"url": "aiflows/ChatFlowModule", "revision": "main"}
]
aiflows.flow_verse.sync_dependencies(dependencies)

if __name__ == "__main__":

    #1. ~~~ Start local colink server ~~~
    cl = colink_utils.start_colink_server()

    #2. ~~~ Load flow config ~~~
    root_dir = "flow_modules/aiflows/ChatFlowModule"
    cfg = read_yaml_file(os.path.join(root_dir, "demo.yaml"))

    ##2.1 ~~~ Set the API information ~~~
    api_information = [ApiInfo(backend_used="openai", api_key=os.getenv("OPENAI_API_KEY"))]
    quick_load_api_keys(cfg, api_information, key="api_infos")

    #3. ~~~~ Serve The Flow ~~~~
    flow_class_name = "flow_modules.aiflows.ChatFlowModule.ChatAtomicFlow"
    serving.serve_flow(
        cl=cl, 
        flow_class_name=flow_class_name, 
        flow_endpoint="ChatAtomicFlow"
    )

    #4. ~~~~~Start A Worker Thread, Mount the Flow and Get its Proxy~~~~~
    workers.run_dispatch_worker_thread(cl)
    proxy_flow = serving.get_flow_instance(
        cl=cl, 
        flow_endpoint="ChatAtomicFlow", 
        user_id="local", 
        config_overrides=cfg
    )

    #5. ~~ Get the data ~~
    data = {"id": 0, "question": "What is the capital of Switzerland?"}
    input_message = proxy_flow.package_input_message(data=data)
    
    #6. ~~~ Run Inference ~~~
    future = proxy_flow.get_reply_future(input_message)
    reply_data = future.get_data()
    print("~~~~~Reply~~~~~")
    print(reply_data)
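
In the example above, the Flow is served and consumed by the same user (user_id="local"). As a sketch of the peer-to-peer case, the same get_flow_instance call can target a Flow served by another user; the user id below is a placeholder, not a real value:

# Sketch: calling a Flow served by another peer.
# "REMOTE_USER_ID" is a placeholder; substitute the serving peer's actual user id.
remote_proxy = serving.get_flow_instance(
    cl=cl,
    flow_endpoint="ChatAtomicFlow",
    user_id="REMOTE_USER_ID",  # instead of "local"
    config_overrides=cfg,
)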

We're thrilled to bring these enhancements to our community and look forward to seeing the innovative ways you'll use them in your projects.

Happy coding! 🎈

v0.1.7

27 Dec 17:14

What's Changed

  • Small fix in the tutorial by @tohrnii in #5
  • Expand Hugging Face repo name matching logic to support '-' by @Tachi-67 in #8
  • Feature/add api configuration helper and remove unused helper functions by @Tachi-67 in #10

New Contributors

Full Changelog: v0.1.6...v0.1.7

v0.1.6

14 Dec 14:50

Fix for #2: aiflows is now compatible with Python 3.11 and 3.12.

Note, however, that installing aiflows on Python 3.12 is not recommended. Some Flows from the FlowVerse, such as the VectorStoreFlowModule used in AutoGPTFlow, have dependencies that are incompatible with Python 3.12. For instance, the chromadb dependency of VectorStoreFlowModule currently cannot be installed on Python 3.12 because of its pulsar-client dependency.
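
If you want your own scripts to fail fast instead of hitting the install error, here is a minimal sketch of a version guard (the error message is only a suggestion):

import sys

# Guard: FlowVerse modules that depend on chromadb (e.g. VectorStoreFlowModule) currently
# cannot be installed on Python 3.12 because of the pulsar-client dependency.
if sys.version_info >= (3, 12):
    raise RuntimeError("Use Python 3.11 or earlier for FlowVerse modules that depend on chromadb.")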

First public release (v0.1.5)

10 Dec 01:32

Fixing typos and versioning.