feat: optimize auto context memory #213
Merged
AlbumenJ merged 4 commits into agentscope-ai:main on Dec 17, 2025
Conversation
This change adjusts the message compression ratio in the configuration, renames the related test methods, and cleans up some redundant code. Change-Id: I2bf0fb1279527df843b583d3a79ffd34f67f3ff8 Co-developed-by: Aone Copilot <noreply@alibaba-inc.com>
This fix ensures that chat usage information is correctly handled and added when compressing metadata, avoiding unnecessary token-consumption records. Change-Id: Ie6b1a268b32157bef886508f4681ec701ec4fef9 Co-developed-by: Aone Copilot <noreply@alibaba-inc.com>
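The fix above is only described at a high level. As a hedged sketch of the intent (all names here — `ChatUsage`, `withUsage`, the metadata keys — are hypothetical, not the project's actual API), the idea is to attach chat usage to the compression metadata only when the model actually reported it:

```java
// Illustrative only: ChatUsage, withUsage(), and the metadata keys are invented
// for this sketch; the actual fix lives in agentscope-java's compression code.
import java.util.HashMap;
import java.util.Map;

final class CompressionMetadataSketch {

    // Hypothetical usage payload reported by the chat model call.
    record ChatUsage(long promptTokens, long completionTokens) { }

    // Only attach usage information when the model actually reported it, so that
    // missing usage does not produce a spurious token-consumption record.
    static Map<String, Object> withUsage(Map<String, Object> metadata, ChatUsage usage) {
        Map<String, Object> enriched = new HashMap<>(metadata);
        if (usage != null) {
            enriched.put("promptTokens", usage.promptTokens());
            enriched.put("completionTokens", usage.completionTokens());
        }
        return enriched;
    }

    public static void main(String[] args) {
        Map<String, Object> base = Map.of("type", "SUMMARY");
        System.out.println(withUsage(base, new ChatUsage(1200, 180)));
        System.out.println(withUsage(base, null)); // no usage reported: metadata left as-is
    }
}
```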
Change-Id: Ib73559ff71c68622f0514769ea1cafc4cc8ebb05
Updated the README for the auto context memory extension with more detailed explanations and guidance. Change-Id: I1d023869e55b1f9542f010c9938cb2d35e7c83b0 Co-developed-by: Aone Copilot <noreply@alibaba-inc.com>
AlbumenJ approved these changes on Dec 17, 2025
JGoP-L pushed a commit to JGoP-L/agentscope-java that referenced this pull request on Dec 29, 2025
This pull request introduces significant improvements to the AutoContextMemory system, focusing on enhanced configuration, event tracking, and documentation for context compression in LLM-based agent systems. The main changes add configurable compression ratios, detailed compression event tracking, and comprehensive documentation for users and developers.

**Key changes:**

### 1. Configuration Enhancements

* Added a new `currentRoundCompressionRatio` parameter (default 0.3) to `AutoContextConfig`, allowing users to specify the compression ratio for current-round messages. This includes builder methods, getters, setters, and integration into the config build process (see the sketch below). [[1]](diffhunk://#diff-0f19c9d04469678819a19772e7d5526316006c9e8727f1b7ceb77f967988d6a2R57-R59) [[2]](diffhunk://#diff-0f19c9d04469678819a19772e7d5526316006c9e8727f1b7ceb77f967988d6a2R116-R123) [[3]](diffhunk://#diff-0f19c9d04469678819a19772e7d5526316006c9e8727f1b7ceb77f967988d6a2R156) [[4]](diffhunk://#diff-0f19c9d04469678819a19772e7d5526316006c9e8727f1b7ceb77f967988d6a2R235-R247) [[5]](diffhunk://#diff-0f19c9d04469678819a19772e7d5526316006c9e8727f1b7ceb77f967988d6a2R262)

### 2. Compression Event Tracking

* Introduced a `compressionEvents` list in `AutoContextMemory` to record detailed information about each compression operation, including event type, affected message range, tokens used, and timing. This includes serialization support for state persistence. [[1]](diffhunk://#diff-122d812e4a4112a8ba1e8702c54c400ecd624c7e1e717accd6993594347df230R91-R97) [[2]](diffhunk://#diff-122d812e4a4112a8ba1e8702c54c400ecd624c7e1e717accd6993594347df230R122) [[3]](diffhunk://#diff-122d812e4a4112a8ba1e8702c54c400ecd624c7e1e717accd6993594347df230R131-R134)
* Implemented the `recordCompressionEvent` method to log each compression event with relevant metadata, such as token counts and time taken, ensuring traceability and facilitating analysis.
* Updated the compression logic for current-round message summarization and large-message summarization to record events with metadata from LLM usage. [[1]](diffhunk://#diff-122d812e4a4112a8ba1e8702c54c400ecd624c7e1e717accd6993594347df230R369-R385) [[2]](diffhunk://#diff-122d812e4a4112a8ba1e8702c54c400ecd624c7e1e717accd6993594347df230R467-R483)

### 3. Documentation Improvements

* Added a comprehensive Chinese README (`README.md`) for `AutoContextMemory`, detailing the background, architecture, compression strategies, configuration options, API reference, event tracking, best practices, and usage notes. This documentation provides clear guidance for configuring and using AutoContextMemory effectively.

### 4. Example and Minor Adjustments

* Updated example usage in `AutoMemoryExample.java` to reflect new configuration defaults and session IDs, demonstrating the new compression ratio parameter. [[1]](diffhunk://#diff-6b9103fbf3a240744117a2bf207e08f728ae356b07f26db47787eb2283beac2dL64-R64) [[2]](diffhunk://#diff-6b9103fbf3a240744117a2bf207e08f728ae356b07f26db47787eb2283beac2dL86-R86)
* Minor import addition to support new features.

These changes collectively make AutoContextMemory more configurable, auditable, and user-friendly, supporting advanced context management for LLM agent applications.
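A minimal, illustrative sketch of the two features summarized above. The real `AutoContextConfig` and `AutoContextMemory` APIs in agentscope-java are not shown here; the builder entry point, the getter, and the `CompressionEvent` fields below are assumptions based only on this PR description.

```java
// Hedged sketch only: everything in this file is a hypothetical stand-in,
// modeled on the PR description rather than the actual agentscope-java API.
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

public class AutoContextSketch {

    // Hypothetical stand-in for the configuration change in section 1.
    static final class ConfigSketch {
        private final double currentRoundCompressionRatio;

        private ConfigSketch(double ratio) {
            this.currentRoundCompressionRatio = ratio;
        }

        double getCurrentRoundCompressionRatio() {
            return currentRoundCompressionRatio;
        }

        static Builder builder() {
            return new Builder();
        }

        static final class Builder {
            private double currentRoundCompressionRatio = 0.3; // default noted in the PR

            Builder currentRoundCompressionRatio(double ratio) {
                this.currentRoundCompressionRatio = ratio;
                return this;
            }

            ConfigSketch build() {
                return new ConfigSketch(currentRoundCompressionRatio);
            }
        }
    }

    // Hypothetical shape of one entry in the compressionEvents list (section 2):
    // event type, affected message range, token counts, and timing.
    record CompressionEvent(String type, int fromMessage, int toMessage,
                            long promptTokens, long completionTokens,
                            long durationMillis, Instant recordedAt) { }

    public static void main(String[] args) {
        ConfigSketch config = ConfigSketch.builder()
                .currentRoundCompressionRatio(0.3)
                .build();

        List<CompressionEvent> compressionEvents = new ArrayList<>();
        // Roughly what recordCompressionEvent might store after a summarization call.
        compressionEvents.add(new CompressionEvent(
                "CURRENT_ROUND_SUMMARY", 0, 12, 1540, 210, 830, Instant.now()));

        System.out.printf("ratio=%.1f, events=%d%n",
                config.getCurrentRoundCompressionRatio(), compressionEvents.size());
    }
}
```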
## AgentScope-Java Version

[The version of AgentScope-Java you are working on, e.g. 1.0.2; check your pom.xml dependency version or run `mvn dependency:tree | grep agentscope-parent:pom` (only mac/linux)]

## Description

[Please describe the background, purpose, changes made, and how to test this PR]

## Checklist

Please check the following items before the code is ready to be reviewed.

- [ ] Code has been formatted with `mvn spotless:apply`
- [ ] All tests are passing (`mvn test`)
- [ ] Javadoc comments are complete and follow project conventions
- [ ] Related documentation has been updated (e.g. links, examples, etc.)
- [ ] Code is ready for review