
Add CompressibleAgent #443

Merged
merged 77 commits into from
Nov 10, 2023
Changes from 1 commit
Commits
77 commits
d6761d1
api_base -> base_url (#383)
sonichi Oct 23, 2023
dfd5695
InvalidRequestError -> BadRequestError (#389)
sonichi Oct 23, 2023
c41be9c
remove api_key_path; close #388
sonichi Oct 23, 2023
c8f8cbc
Merge branch 'main' into dev/v0.2
sonichi Oct 24, 2023
2f97b8b
close #402 (#403)
sonichi Oct 24, 2023
1df493b
openai client (#419)
sonichi Oct 25, 2023
23a107a
Merge branch 'main' into dev/v0.2
sonichi Oct 25, 2023
d77b1c9
_client -> client
sonichi Oct 25, 2023
6a8eaf3
_client -> client
sonichi Oct 25, 2023
c3f58f3
extra kwargs
sonichi Oct 25, 2023
75a6f7d
Completion -> client (#426)
sonichi Oct 26, 2023
9b25c91
annotations
sonichi Oct 26, 2023
8c1626c
import
sonichi Oct 26, 2023
8d42528
reduce test
sonichi Oct 26, 2023
b8302a7
skip test
sonichi Oct 26, 2023
b09e6bb
skip test
sonichi Oct 26, 2023
f29fbc5
skip test
sonichi Oct 26, 2023
4318e0e
debug test
sonichi Oct 26, 2023
153f182
rename test
sonichi Oct 26, 2023
645d60e
update workflow
sonichi Oct 26, 2023
62eabc8
update workflow
sonichi Oct 26, 2023
f895633
env
sonichi Oct 26, 2023
a72c89d
py version
sonichi Oct 26, 2023
9073eb7
doc improvement
sonichi Oct 26, 2023
b0ad39b
docstr update
sonichi Oct 26, 2023
33de6a3
openai<1
sonichi Oct 26, 2023
fd2ba51
add compressibleagent
yiranwu0 Oct 26, 2023
0d31c15
Merge remote-tracking branch 'origin/main' into compressible
yiranwu0 Oct 27, 2023
af6d197
revise doc, add tests, add example
yiranwu0 Oct 27, 2023
d6d7d87
fix link
yiranwu0 Oct 27, 2023
5557ef9
fix link
yiranwu0 Oct 27, 2023
fd32694
fix link
yiranwu0 Oct 27, 2023
269c69a
Merge remote-tracking branch 'origin/main' into compressible
yiranwu0 Oct 27, 2023
b3ca4b4
remove test
yiranwu0 Oct 27, 2023
3e70382
update doc
yiranwu0 Oct 27, 2023
962c635
Merge branch 'main' into compressible
yiranwu0 Oct 27, 2023
f8c3e38
update doc
yiranwu0 Oct 27, 2023
b4ad91c
Merge branch 'compressible' of github.com:microsoft/autogen into comp…
yiranwu0 Oct 27, 2023
b2e7f9c
Merge branch 'main' into dev/v0.2
sonichi Oct 28, 2023
3b567c9
add tiktoken to dependency
sonichi Oct 28, 2023
3cb3930
filter_func
sonichi Oct 28, 2023
cc9f0e0
async test
sonichi Oct 29, 2023
62b0393
Merge branch 'main' into dev/v0.2
sonichi Oct 29, 2023
82f3712
dependency
sonichi Oct 29, 2023
24a74fd
revision
yiranwu0 Oct 29, 2023
420cbf0
migration guide (#477)
sonichi Oct 30, 2023
fa6fbec
Merge branch 'main' into dev/v0.2
sonichi Oct 30, 2023
e01c8ff
Merge branch 'main' into compressible
yiranwu0 Oct 30, 2023
ed1b77b
Merge branch 'main' into dev/v0.2
sonichi Oct 30, 2023
5d49694
Merge branch 'main' into compressible
yiranwu0 Oct 30, 2023
23475b2
Merge branch 'dev/v0.2' into compressible
yiranwu0 Oct 30, 2023
326b39a
update for dev
yiranwu0 Oct 30, 2023
94fe6d7
revision
yiranwu0 Oct 31, 2023
73d202e
revision
yiranwu0 Nov 3, 2023
1d7a069
allow not compressing last n msgs
yiranwu0 Nov 3, 2023
d74562b
update
yiranwu0 Nov 4, 2023
c59271a
Merge remote-tracking branch 'origin/main' into compressible
yiranwu0 Nov 4, 2023
a7a024e
correct merge
yiranwu0 Nov 4, 2023
6938716
update test workflow
yiranwu0 Nov 4, 2023
126d090
check test
yiranwu0 Nov 4, 2023
9291e7a
update for test
yiranwu0 Nov 4, 2023
78e3683
update
yiranwu0 Nov 5, 2023
b6222f5
Merge branch 'main' into compressible
yiranwu0 Nov 5, 2023
a40efb8
update notebook
yiranwu0 Nov 5, 2023
4215159
update
yiranwu0 Nov 5, 2023
1b81c33
Merge remote-tracking branch 'origin/main' into compressible
yiranwu0 Nov 5, 2023
90ba33c
fix bug
yiranwu0 Nov 6, 2023
6deee3f
update
yiranwu0 Nov 6, 2023
921c3dc
update
yiranwu0 Nov 6, 2023
9a3eac0
Merge branch 'main' into compressible
yiranwu0 Nov 7, 2023
a8f3bce
Merge branch 'main' into compressible
sonichi Nov 7, 2023
b1e5b80
update
yiranwu0 Nov 8, 2023
8614b54
Merge branch 'main' into compressible
yiranwu0 Nov 8, 2023
114a3da
Merge branch 'main' into compressible
yiranwu0 Nov 9, 2023
de770be
check to "pull_request_target" in contrib-openai
yiranwu0 Nov 9, 2023
ed21924
Merge branch 'compressible' of github.com:microsoft/autogen into comp…
yiranwu0 Nov 9, 2023
478fe41
Merge remote-tracking branch 'origin/main' into compressible
yiranwu0 Nov 9, 2023
update for dev
yiranwu0 committed Oct 30, 2023
commit 326b39a2bc702c3b114bcd338c5ceba33506e4dc
28 changes: 17 additions & 11 deletions autogen/agentchat/contrib/compressible_agent.py
@@ -1,5 +1,5 @@
 from typing import Callable, Dict, Optional, Union, Tuple, List, Any
-from autogen import oai
+from autogen import OpenAIWrapper
 from autogen import Agent, ConversableAgent
 import copy
 import asyncio
@@ -68,7 +68,7 @@ def __init__(
             system_message (str): system message for the ChatCompletion inference.
                 Please override this attribute if you want to reprogram the agent.
             llm_config (dict): llm inference configuration.
-                Please refer to [Completion.create](/docs/reference/oai/completion#create)
+                Please refer to [OpenAIWrapper.create](/docs/reference/oai/client#create)
                 for available options.
             is_termination_msg (function): a function that takes a message in the form of a dictionary
                 and returns a boolean value indicating if this received message is a termination message.
@@ -107,6 +107,17 @@ def __init__(
 
         self._set_compress_config(compress_config)
 
+        # create a separate client for compression.
+        if llm_config is False:
+            self.llm_compress_config = False
+            self.compress_client = None
+        else:
+            self.llm_compress_config = self.llm_config.copy()
+            # remove functions
+            if "functions" in self.llm_compress_config:
+                del self.llm_compress_config["functions"]
+            self.compress_client = OpenAIWrapper(**self.llm_compress_config)
+
         self._reply_func_list.clear()
         self.register_reply([Agent, None], ConversableAgent.generate_oai_reply)
         self.register_reply([Agent], CompressibleAgent.on_oai_token_limit)  # check token limit
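The hunk above builds a second client just for compression: it copies the agent's llm_config and strips the "functions" key, since the compression call never invokes functions. That derivation can be sketched as plain, library-free Python (the helper name make_compress_config is hypothetical, not part of the PR):

```python
def make_compress_config(llm_config):
    """Derive a compression-only config from an agent's llm_config.

    Mirrors the logic in the hunk above: if the agent has no LLM config,
    compression is disabled (False); otherwise copy the config and drop
    any function schemas before constructing the compression client.
    """
    if llm_config is False:
        return False
    compress_config = dict(llm_config)  # shallow copy, as in llm_config.copy()
    compress_config.pop("functions", None)  # remove function schemas
    return compress_config

# Example: a config carrying function schemas loses them for compression,
# while the original config is left untouched.
cfg = {"model": "gpt-4", "functions": [{"name": "search"}]}
print(make_compress_config(cfg))  # {'model': 'gpt-4'}
print(make_compress_config(False))  # False
```

The shallow copy is enough here because only a top-level key is removed; the real code passes the result straight to OpenAIWrapper(**...).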
@@ -283,12 +294,8 @@ def compress_messages(
 
         TODO: the model used by the compression agent may differ from the assistant agent's. For example, if the original model is gpt-4, we start compressing at 70% of usage, and 70% of 8192 = 5734; if gpt-3.5 with max_tokens = 4096 is used here, it will raise an error. Should the model be chosen automatically?
         """
-        # 1. use the same config for compression and remove functions from llm_config
-        llm_config = copy.deepcopy(self.llm_config) if config is None else copy.deepcopy(config)
-        if llm_config is False or messages is None:
-            return False, None
-        if "functions" in llm_config:
-            del llm_config["functions"]
+        # 1. use the compression client
+        client = self.compress_client if config is None else config
 
         # 2. stop if there is only one message in the list
         if len(messages) <= 1:
@@ -342,16 +349,15 @@ def compress_messages(
         4. Context after ##FUNCTION_RETURN## (or code return): Keep the exact content if it is short. Summarize/compress if it is too long, you should note what the function has achieved and what the return value is.
         """
         try:
-            response = oai.ChatCompletion.create(
+            response = client.create(
                 context=None,
                 messages=[{"role": "system", "content": compress_sys_msg}] + chat_to_compress,
-                **llm_config,
             )
         except Exception as e:
             print(colored(f"Failed to compress the content due to {e}", "red"), flush=True)
             return False, None
 
-        compressed_message = oai.ChatCompletion.extract_text_or_function_call(response)[0]
+        compressed_message = self.client.extract_text_or_function_call(response)[0]
         assert isinstance(compressed_message, str), f"compressed_message should be a string: {compressed_message}"
         if self.compress_config["verbose"]:
             print(
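The overall compress_messages flow in this diff — return early when there is at most one message, summarize the older part of the history once a token threshold is crossed, and (per the "allow not compressing last n msgs" commit) keep the most recent messages verbatim — can be sketched independently of autogen. The summarize callback, threshold, and whitespace word count below are stand-ins for client.create and tiktoken, not the PR's actual API:

```python
def compress_messages(messages, summarize, keep_last=2, threshold=100):
    """Sketch of the CompressibleAgent flow: when the history exceeds a
    token threshold, fold everything but the last `keep_last` messages
    into one summary message.

    `summarize` stands in for the LLM compression call; token counting
    is approximated by whitespace word count instead of tiktoken.
    Returns (changed, new_messages), mirroring the (False, None) early
    exits in the diff.
    """
    if messages is None or len(messages) <= 1:
        return False, None  # nothing to compress
    tokens = sum(len(m["content"].split()) for m in messages)
    if tokens < threshold:
        return False, None  # under the limit; leave history untouched
    head, tail = messages[:-keep_last], messages[-keep_last:]
    summary = summarize(head)  # the real code calls client.create here
    return True, [{"role": "system", "content": summary}] + tail

# Example with a trivial summarizer that reports how many messages it folded.
msgs = [
    {"role": "user", "content": "word " * 60},
    {"role": "assistant", "content": "word " * 60},
    {"role": "user", "content": "latest question"},
]
changed, new_msgs = compress_messages(
    msgs, lambda h: f"[{len(h)} messages compressed]", keep_last=1
)
print(changed, len(new_msgs))  # True 2
```

Keeping the tail uncompressed matters because the most recent turns carry the active task; only the older context is safe to lossy-summarize.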