
feat: optimize auto context memory #213

Merged
AlbumenJ merged 4 commits into agentscope-ai:main from shiyiyue1102:main-optimize-autocontext
Dec 17, 2025
Conversation

@shiyiyue1102
Contributor

This pull request introduces significant improvements to the AutoContextMemory system, focusing on enhanced configuration, event tracking, and documentation for context compression in LLM-based agent systems. The main changes include adding configurable compression ratios, detailed compression event tracking, and comprehensive documentation for users and developers.

Key changes:

1. Configuration Enhancements

  • Added a new currentRoundCompressionRatio parameter (default 0.3) to AutoContextConfig, allowing users to specify the compression ratio for current-round messages. This includes builder methods, getters, setters, and integration into the config build process.
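
The builder-based configuration described above can be sketched as follows. This is a minimal, hypothetical illustration of the pattern: the class and method names mirror those mentioned in the PR (`AutoContextConfig`, `currentRoundCompressionRatio`, default 0.3), but the sketch is self-contained and does not reproduce the library's actual API.

```java
// Hypothetical sketch of the new configuration option; names mirror the PR
// description but this is not the library's actual implementation.
class AutoContextConfigSketch {
    private final double currentRoundCompressionRatio;

    private AutoContextConfigSketch(Builder b) {
        this.currentRoundCompressionRatio = b.currentRoundCompressionRatio;
    }

    public double getCurrentRoundCompressionRatio() {
        return currentRoundCompressionRatio;
    }

    public static Builder builder() {
        return new Builder();
    }

    static class Builder {
        // Default of 0.3 matches the default stated in the PR description.
        private double currentRoundCompressionRatio = 0.3;

        // Target ratio for compressing current-round messages; a value of
        // 0.3 aims to shrink them to roughly 30% of their token count.
        public Builder currentRoundCompressionRatio(double ratio) {
            if (ratio <= 0.0 || ratio > 1.0) {
                throw new IllegalArgumentException("ratio must be in (0, 1]");
            }
            this.currentRoundCompressionRatio = ratio;
            return this;
        }

        public AutoContextConfigSketch build() {
            return new AutoContextConfigSketch(this);
        }
    }

    public static void main(String[] args) {
        AutoContextConfigSketch cfg =
                builder().currentRoundCompressionRatio(0.5).build();
        System.out.println(cfg.getCurrentRoundCompressionRatio());
    }
}
```

Validating the ratio in the builder keeps misconfiguration (e.g. a ratio of 0 or above 1) from silently disabling or inverting compression at runtime.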

2. Compression Event Tracking

  • Introduced a compressionEvents list in AutoContextMemory to record detailed information about each compression operation, including event type, affected message range, tokens used, and timing. This includes serialization support for state persistence.
  • Implemented the recordCompressionEvent method to log each compression event with relevant metadata, such as token counts and time taken, ensuring traceability and facilitating analysis.
  • Updated the compression logic for current-round message summarization and large-message summarization to record events with metadata from LLM usage.
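
The event-tracking bookkeeping described above can be sketched as a small record-and-list structure. The field names (event type, message range, tokens used, duration) follow the PR description, but the class shape and the `recordCompressionEvent` signature here are assumptions, not the actual code in AutoContextMemory.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of compression-event tracking; the real
// AutoContextMemory implementation may use different names and fields.
class CompressionEventSketch {
    // One entry per compression operation: what kind of compression ran,
    // which message indices it covered, the tokens the summarization LLM
    // call consumed, and how long it took.
    record CompressionEvent(String eventType, int startIndex, int endIndex,
                            long tokensUsed, long durationMillis,
                            long timestamp) {}

    private final List<CompressionEvent> compressionEvents = new ArrayList<>();

    void recordCompressionEvent(String eventType, int startIndex, int endIndex,
                                long tokensUsed, long durationMillis) {
        compressionEvents.add(new CompressionEvent(eventType, startIndex,
                endIndex, tokensUsed, durationMillis,
                System.currentTimeMillis()));
    }

    // Return an immutable view so callers can audit but not mutate history.
    List<CompressionEvent> getCompressionEvents() {
        return List.copyOf(compressionEvents);
    }

    public static void main(String[] args) {
        CompressionEventSketch mem = new CompressionEventSketch();
        mem.recordCompressionEvent("CURRENT_ROUND_SUMMARY", 0, 12, 850, 42);
        System.out.println(mem.getCompressionEvents().size());
    }
}
```

Keeping the event list append-only and exposing an immutable copy is what makes the history usable as an audit trail for analyzing compression behavior and token spend.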

3. Documentation Improvements

  • Added a comprehensive Chinese README (README.md) for AutoContextMemory, detailing the background, architecture, compression strategies, configuration options, API reference, event tracking, best practices, and usage notes. This documentation provides clear guidance for configuring and using AutoContextMemory effectively.

4. Example and Minor Adjustments

  • Updated example usage in AutoMemoryExample.java to reflect new configuration defaults and session IDs, demonstrating the new compression ratio parameter.
  • Minor import addition to support new features.

These changes collectively make AutoContextMemory more configurable, auditable, and user-friendly, supporting advanced context management for LLM agent applications.

AgentScope-Java Version

[The version of AgentScope-Java you are working on, e.g. 1.0.2; check your pom.xml dependency version or run mvn dependency:tree | grep agentscope-parent:pom (only mac/linux)]

Description

[Please describe the background, purpose, changes made, and how to test this PR]

Checklist

Please check the following items before code is ready to be reviewed.

  • Code has been formatted with mvn spotless:apply
  • All tests are passing (mvn test)
  • Javadoc comments are complete and follow project conventions
  • Related documentation has been updated (e.g. links, examples, etc.)
  • Code is ready for review

This change adjusts the message compression ratio in the configuration and renames the related test methods, while also removing some redundant code.

Change-Id: I2bf0fb1279527df843b583d3a79ffd34f67f3ff8
Co-developed-by: Aone Copilot <noreply@alibaba-inc.com>
This fix ensures chat usage information is correctly handled and added when compressing metadata, avoiding unnecessary token-consumption records.

Change-Id: Ie6b1a268b32157bef886508f4681ec701ec4fef9
Co-developed-by: Aone Copilot <noreply@alibaba-inc.com>
Change-Id: Ib73559ff71c68622f0514769ea1cafc4cc8ebb05
@shiyiyue1102 shiyiyue1102 requested a review from a team December 16, 2025 10:56
@shiyiyue1102 shiyiyue1102 changed the title from "Main optimize autocontext" to "feat: optimize auto context memory" Dec 16, 2025
@codecov

codecov bot commented Dec 16, 2025

Updated the README documentation for the auto context memory extension, providing more detailed explanations and guidance.

Change-Id: I1d023869e55b1f9542f010c9938cb2d35e7c83b0
Co-developed-by: Aone Copilot <noreply@alibaba-inc.com>
@AlbumenJ AlbumenJ merged commit 10b784e into agentscope-ai:main Dec 17, 2025
4 checks passed
@shiyiyue1102 shiyiyue1102 deleted the main-optimize-autocontext branch December 26, 2025 09:17
JGoP-L pushed a commit to JGoP-L/agentscope-java that referenced this pull request Dec 29, 2025
