v1.1.0

@nbaldwin98 released this 12 Apr 12:47

🚀 Release Notes - v1.1.0 🎉

We’re excited to announce the release of version 1.1.0 of our project! This release introduces significant enhancements to aiFlows, headlined by the new Flows engine, which enables concurrent execution and peer-to-peer distributed collaboration and changes the way you build and run your Flows.

What's New? 🌟

  • Introducing Flows Engine: The new Flows engine, powered by CoLink, enables concurrent execution and peer-to-peer distributed collaboration.

  • Redesigned Developer Experience: We've revamped the developer experience to make building Flows with these new features as simple as possible.

Key Features of Flows Engine: 🛠️

  • Serve Flows: Serve Flows for other users and yourself, fostering collaborative workflows across distributed teams.

  • Get Instances of a Served Flow and Call Them: Effortlessly obtain a proxy to a served flow and interact with it through that proxy (see the sketch after this list).
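
Since serving runs on CoLink, the flow you get an instance of doesn't have to be one you serve yourself. Below is a minimal sketch of fetching an instance of a flow served by a peer; the remote user id is a placeholder assumption, and cl is a CoLink client like the one created in the full example further down.

from aiflows.utils import serving

# Hypothetical scenario: a peer has already served "ChatAtomicFlow" on their
# own CoLink server. Their CoLink user id below is a placeholder.
REMOTE_USER_ID = "<peer-colink-user-id>"

proxy_flow = serving.get_flow_instance(
    cl=cl,                           # your CoLink client
    flow_endpoint="ChatAtomicFlow",  # endpoint the peer served the flow under
    user_id=REMOTE_USER_ID,          # "local" targets your own server instead
)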

⚠️ Note: Backward Compatibility

Please be aware that due to the introduction of the Flows engine, this version is not backward compatible. Make sure to review and update your existing implementations accordingly before upgrading.

Resources for Further Exploration: 📚

For more detailed information on how to leverage the Flows engine and CoLink, we encourage you to explore:

  • Tutorials Folder: Dive into the tutorials to get familiar with aiflows, its features, and how to use them in your projects.

  • Examples Folder: Explore our examples to see real-world use cases of the Flows engine and gain inspiration for your own projects.

Example: Pulling ChatAtomicFlow from the FlowVerse, Serving It, and Calling It

import os
import aiflows
from aiflows.backends.api_info import ApiInfo
from aiflows.utils import colink_utils, serving
from aiflows import workers
from aiflows.utils.general_helpers import read_yaml_file, quick_load_api_keys

dependencies = [
    {"url": "aiflows/ChatFlowModule", "revision": "main"}
]
aiflows.flow_verse.sync_dependencies(dependencies)

if __name__ == "__main__":

    #1. ~~~ Start local colink server ~~~
    cl = colink_utils.start_colink_server()

    #2. ~~~ Load flow config ~~~
    root_dir = "flow_modules/aiflows/ChatFlowModule"
    cfg = read_yaml_file(os.path.join(root_dir, "demo.yaml"))

    ##2.1 ~~~ Set the API information ~~~
    api_information = [ApiInfo(backend_used="openai", api_key=os.getenv("OPENAI_API_KEY"))]
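    # quick_load_api_keys injects these ApiInfo objects into the config
    # wherever the "api_infos" key appears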
    quick_load_api_keys(cfg, api_information, key="api_infos")

    #3. ~~~~ Serve The Flow ~~~~
    flow_class_name = "flow_modules.aiflows.ChatFlowModule.ChatAtomicFlow"
    serving.serve_flow(
        cl=cl,
        flow_class_name=flow_class_name,
        flow_endpoint="ChatAtomicFlow"
    )

    #4. ~~~~~Start A Worker Thread, Mount the Flow and Get its Proxy~~~~~
    workers.run_dispatch_worker_thread(cl)
    proxy_flow = serving.get_flow_instance(
        cl=cl,
        flow_endpoint="ChatAtomicFlow",
        user_id="local",
        config_overrides=cfg
    )

    #5. ~~ Get the data ~~
    data = {"id": 0, "question": "What is the capital of Switzerland?"}
    input_message = proxy_flow.package_input_message(data=data)
    
    #6. ~~~ Run Inference ~~~
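    # get_reply_future returns a future immediately; get_data() blocks
    # until the reply arrives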
    future = proxy_flow.get_reply_future(input_message)
    reply_data = future.get_data()
    print("~~~~~Reply~~~~~")
    print(reply_data)
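
Because replies come back as futures, you can also dispatch several requests before collecting any results, which is where the engine's concurrent execution shows. Here is a minimal sketch reusing proxy_flow from the example above; the extra questions are just placeholders:

    # Dispatch all requests first; get_reply_future does not block
    questions = [
        "What is the capital of Switzerland?",
        "What is the capital of France?",
    ]
    futures = []
    for i, question in enumerate(questions):
        msg = proxy_flow.package_input_message(data={"id": i, "question": question})
        futures.append(proxy_flow.get_reply_future(msg))

    # Collect the replies; each get_data() blocks only until that reply is ready
    for future in futures:
        print(future.get_data())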

We're thrilled to bring these enhancements to our community and look forward to seeing the innovative ways you'll use them in your projects.

Happy coding! 🎈