
automatic chat with function call, chat history no function call record #601

Open · weedge opened this issue Oct 16, 2024 · 3 comments
Labels: component:python sdk, good first issue, type:help
weedge commented Oct 16, 2024

Description of the bug:

import google.generativeai as genai  # the tools add/subtract/multiply/divide are defined elsewhere


def run_auto_function_calling():
    """
    Function calls naturally fit into [multi-turn chats](https://ai.google.dev/api/python/google/generativeai/GenerativeModel#multi-turn) as they capture a back-and-forth interaction between the user and the model. The Python SDK's [`ChatSession`](https://ai.google.dev/api/python/google/generativeai/ChatSession) is a great interface for chats because it handles the conversation history for you, and the `enable_automatic_function_calling` parameter simplifies function calling even further.
    """
    model = genai.GenerativeModel(
        model_name="gemini-1.5-flash-latest",
        tools=[add, subtract, multiply, divide],
        system_instruction="You are a helpful assistant who converses with a user and answers questions. Respond concisely to general questions.",
    )
    chat = model.start_chat(enable_automatic_function_calling=True)
    response = chat.send_message(
        [
            # "what's your name?",
            "I have 57 cats, each owns 44 mittens, how many mittens is that in total?",
        ],
        # stream=True,  # streaming is not supported with enable_automatic_function_calling=True
    )
    print(f"run_auto_function_calling response: {response}")
    for content in chat.history:
        print(content.role, "->", [type(part).to_dict(part) for part in content.parts])
        print("-" * 80)
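The add/subtract/multiply/divide tools are not shown above; they are assumed to be plain Python functions with type hints and docstrings, which is what the SDK needs to build the function declarations. A minimal sketch:

```python
def add(a: float, b: float) -> float:
    """Returns a + b."""
    return a + b


def subtract(a: float, b: float) -> float:
    """Returns a - b."""
    return a - b


def multiply(a: float, b: float) -> float:
    """Returns a * b."""
    return a * b


def divide(a: float, b: float) -> float:
    """Returns a / b."""
    return a / b
```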

Actual vs expected behavior:

result:

user -> [{'text': 'I have 57 cats, each owns 44 mittens, how many mittens is that in total?'}]
--------------------------------------------------------------------------------
model -> [{'function_call': {'name': 'multiply', 'args': {'a': 57.0, 'b': 44.0}}}]
--------------------------------------------------------------------------------
user -> [{'function_response': {'name': 'multiply', 'response': {'result': 2508.0}}}]
--------------------------------------------------------------------------------
model -> [{'text': "That's 2508 mittens! \n"}]
--------------------------------------------------------------------------------

After uncommenting "what's your name?", the result is:
result:

user -> [{'text': "what's your name?"}, {'text': 'I have 57 cats, each owns 44 mittens, how many mittens is that in total?'}]
--------------------------------------------------------------------------------
model -> [{'text': "I don't have a name.  That's 2508 mittens. \n"}]
--------------------------------------------------------------------------------

The function_call and function_response turns no longer appear in the chat history.
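One way to detect this difference programmatically is to scan the history for function_call parts. The helper below is illustrative (not part of the SDK), and the SimpleNamespace objects only mimic the shape of `chat.history` entries:

```python
from types import SimpleNamespace


def has_function_call(history):
    # Illustrative helper: report whether any turn in chat.history
    # contains a part with a non-empty function_call attribute.
    return any(
        getattr(part, "function_call", None) is not None
        for content in history
        for part in content.parts
    )


# Minimal stand-ins for chat.history entries, to show the expected shape:
fn_turn = SimpleNamespace(
    role="model",
    parts=[SimpleNamespace(function_call={"name": "multiply", "args": {"a": 57, "b": 44}})],
)
text_turn = SimpleNamespace(
    role="model",
    parts=[SimpleNamespace(function_call=None, text="That's 2508 mittens!")],
)
```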

Any other information you'd like to share?

pip show google-generativeai

Name: google-generativeai
Version: 0.8.3
Summary: Google Generative AI High level API client library and tools.
Home-page: https://github.com/google/generative-ai-python
Author: Google LLC
Author-email: googleapis-packages@google.com
License: Apache 2.0
Location: /Users/wuyong/project/python/chat-bot/.venv_achatbot/lib/python3.11/site-packages
Requires: google-ai-generativelanguage, google-api-core, google-api-python-client, google-auth, protobuf, pydantic, tqdm, typing-extensions
Required-by:
Gunand3043 added component:python sdk and type:help labels and self-assigned this Oct 16, 2024

Gunand3043 commented:

Hi @weedge, the issue is that the model sometimes fails to use the provided tools for a given query. You can force the model to use the tools. Please refer to the example below:

from google.generativeai.types import content_types
from collections.abc import Iterable


def tool_config_from_mode(mode: str, fns: Iterable[str] = ()):
    """Create a tool config with the specified function calling mode."""
    return content_types.to_tool_config(
        {"function_calling_config": {"mode": mode, "allowed_function_names": fns}}
    )


def run_auto_function_calling():
    """
    Function calls naturally fit into [multi-turn chats](https://ai.google.dev/api/python/google/generativeai/GenerativeModel#multi-turn) as they capture a back-and-forth interaction between the user and the model. The Python SDK's [`ChatSession`](https://ai.google.dev/api/python/google/generativeai/ChatSession) is a great interface for chats because it handles the conversation history for you, and the `enable_automatic_function_calling` parameter simplifies function calling even further.
    """
    model = genai.GenerativeModel(
        model_name="gemini-1.5-flash-latest",
        tools=[add, subtract, multiply, divide],
        system_instruction="You are a helpful assistant who converses with a user and answers questions. Respond concisely to general questions.",
    )
    fxn_tools = ["add", "subtract", "multiply", "divide"]
    tool_config = tool_config_from_mode("any", fxn_tools)  # "any" forces the model to call a tool
    chat = model.start_chat(enable_automatic_function_calling=True)
    response = chat.send_message(
        [
            "what's your name?",
            "I have 57 cats, each owns 44 mittens, how many mittens is that in total?",
        ],
        tool_config=tool_config,
        # stream=True,  # streaming is not supported with enable_automatic_function_calling=True
    )
    # print(f"run_auto_function_calling response: {response}")
    for content in chat.history:
        print(content.role, "->", [type(part).to_dict(part) for part in content.parts])
        print("-" * 80)
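For clarity, this is the plain-dict structure that `tool_config_from_mode` hands to `content_types.to_tool_config`; the sketch below just builds and validates that dict without the SDK. The set of modes ("auto", "any", "none") matches the Gemini API's documented function calling modes, but treat the validation itself as illustrative:

```python
VALID_MODES = {"auto", "any", "none"}  # function calling modes documented for the Gemini API


def tool_config_dict(mode: str, fns=()):
    # Plain-dict equivalent (illustrative) of the structure passed to
    # content_types.to_tool_config in the example above.
    if mode.lower() not in VALID_MODES:
        raise ValueError(f"unknown function calling mode: {mode!r}")
    return {"function_calling_config": {"mode": mode, "allowed_function_names": list(fns)}}
```

With mode "any" and an allowed-function list, the model is required to respond with a call to one of the listed tools rather than plain text.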

Gunand3043 added status:awaiting user response and removed status:triaged labels Oct 16, 2024

github-actions bot commented Oct 31, 2024:

Marking this issue as stale since it has been open for 14 days with no activity. This issue will be closed if no further activity occurs.
MarkDaoust (Collaborator) commented:
At some point the models got much better at doing arithmetic themselves (or they have their own calculator), so maybe that's why they're skipping this function call now.

We should change this example code to use the "turn on the lights" example, since code-execution is also a better way of doing this.
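A "turn on the lights" example avoids the arithmetic-shortcut problem because the model cannot fulfil the request without calling a tool. The function names and the state dict below are assumptions for this sketch, loosely following the style of the docs' light-control example; in the snippet above they would be passed as `tools=[enable_lights, set_light_color, stop_lights]`:

```python
# Illustrative light-control tools; the names and state dict are assumptions
# for this sketch, not the exact example from the official docs.
light_state = {"on": False, "color": None}


def enable_lights():
    """Turn on the lighting system."""
    light_state["on"] = True


def set_light_color(rgb_hex: str):
    """Set the light color. rgb_hex is a 6-digit hex code, e.g. 'FF0000'."""
    if not light_state["on"]:
        raise RuntimeError("lights are off; call enable_lights() first")
    light_state["color"] = rgb_hex


def stop_lights():
    """Turn off the lighting system."""
    light_state["on"] = False
    light_state["color"] = None
```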

@MarkDaoust MarkDaoust added good first issue Good for newcomers and removed status:awaiting user response Awaiting a response from the author status:stale Issue/PR will be closed automatically if there's no further activity labels Nov 12, 2024