feat: Add LLM-friendly guide for AI code assistants (#148) #186
Conversation
Pull request overview
This PR adds comprehensive LLM-friendly documentation to help AI code assistants (Cursor, Windsurf, GitHub Copilot, Continue) generate accurate AgentScope Java code. The documentation provides structured guidance on core concepts, APIs, patterns, and best practices optimized for LLM consumption.
Key Changes
- Added a 1,171-line LLM guide covering all framework concepts with verified code examples
- Created setup instructions for four major AI IDE platforms with configuration examples
- Updated both English and Chinese README files with quick setup sections
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| docs/llm/agentscope-llm-guide.txt | Comprehensive LLM-optimized guide with system message, quick start, 10 core concepts, patterns, best practices, and API reference |
| docs/llm/README.md | Setup instructions for Cursor, Windsurf, GitHub Copilot, and Continue with example workflows and troubleshooting |
| README.md | Added "AI-Powered Development" section with Cursor quick setup and link to detailed guide |
| README_zh.md | Added Chinese "AI 辅助开发" section with Cursor setup and documentation link |
> 2. **Suggest Examples**: Common use cases that should be documented
> 3. **Contribute**: Submit PRs with improvements
>
> See [CONTRIBUTING.md](../../CONTRIBUTING.md) for guidelines.
The link to CONTRIBUTING.md uses a relative path that goes up two directories. From the location of this file (docs/llm/README.md), `../../CONTRIBUTING.md` resolves to the repository root, which appears correct, but it's worth confirming the link resolves as expected.
docs/llm/agentscope-llm-guide.txt (Outdated)
> - **qwen-max**: Complex reasoning, best quality
> - **qwen-turbo**: Fast responses, simple tasks
> - **gpt-4o**: Multi-modal, complex tasks
> - **o1-preview**: Deep reasoning tasks
The model name "o1-preview" is listed as a recommended option, but this model may be deprecated or replaced. Consider verifying this is still a valid and current model option, or update to reference "o1" if that's the current version.
Suggested change: `- **o1-preview**: Deep reasoning tasks` → `- **o1**: Deep reasoning tasks`
docs/llm/agentscope-llm-guide.txt (Outdated)
> `// Or inject custom objects by type`
> `// UserContext userCtx = context.get(UserContext.class);`
The comment states "Or inject custom objects by type" but the code example is commented out without showing the actual usage. Consider either providing a complete working example or clarifying that this is optional functionality with a clearer note.
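One way the guide could address this is with a short, self-contained snippet. The sketch below is only illustrative: `ToolExecutionContext`, `context.get(UserContext.class)`, and the `@Tool` annotation appear in this PR's snippets, while the `UserContext` class, the registration step, and the idea of accepting the context as a method parameter are assumptions about the API.

```java
// Illustrative sketch only -- not the confirmed AgentScope Java API.
// Framework imports for @Tool and ToolExecutionContext are omitted because
// their package names are not shown in this PR.

// Hypothetical application class that the caller would register on the context.
class UserContext {
    private final String userId;

    UserContext(String userId) {
        this.userId = userId;
    }

    String getUserId() {
        return userId;
    }
}

class GreetingTools {
    // Hypothetical tool method: it receives the execution context and looks up
    // a custom object by type, mirroring the commented-out guide line
    // `UserContext userCtx = context.get(UserContext.class);`.
    @Tool(description = "Greet the current user")
    public String greetUser(ToolExecutionContext context) {
        UserContext userCtx = context.get(UserContext.class);
        return "Hello, " + userCtx.getUserId() + "!";
    }
}
```

If the framework does not actually pass the context into tool methods this way, the simpler fix is the reviewer's second option: keep the commented-out line but label it explicitly as optional functionality.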
docs/llm/agentscope-llm-guide.txt (Outdated)
> `@Tool(description = "Calculate math expression")`
> `public double calculate(@ToolParam(description = "Expression") String expr) {`
> `    // Parse and calculate`
> `    return result;`
In the Pattern 2 example, the calculate method returns `result`, but this variable is never declared or computed in the code snippet. This incomplete example could confuse users. Either add the missing implementation or use a simpler placeholder like `return 0.0; // Implementation here`.
Suggested change: `return result;` → `return 0.0; // Implementation here`
```java
SessionManager sessionManager = new SessionManager(
    new FileSessionStore("./sessions")
);

// Save after conversation
sessionManager.saveSession(sessionId, agent).block();

// Restore before next conversation
ReActAgent agent = (ReActAgent) sessionManager
    .loadSession(sessionId, ReActAgent.class)
    .block();
```
The SessionManager usage example shows two different API patterns - one using method chaining with forSessionId() (lines 768-777) and another using constructor instantiation (lines 1009-1019). This inconsistency could confuse users about the correct API to use. Consider standardizing on one pattern or clarifying when each should be used.
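For reference, a sketch contrasting the two patterns the comment mentions. The constructor-based calls are copied from the snippet above; the chained `forSessionId()` form is inferred only from this comment, so its exact calls are an assumption and are left commented out.

```java
// sessionId and agent are assumed to exist, as in the snippet above.

// Pattern A: constructor-based (guide lines 1009-1019) -- the form shown in this PR.
SessionManager sessionManager = new SessionManager(
    new FileSessionStore("./sessions")
);
sessionManager.saveSession(sessionId, agent).block();
ReActAgent restored = (ReActAgent) sessionManager
    .loadSession(sessionId, ReActAgent.class)
    .block();

// Pattern B: method chaining via forSessionId() (guide lines 768-777).
// The exact chained calls are not shown in this PR, so this is only a guess:
// sessionManager.forSessionId(sessionId).save(agent).block();
```

Standardizing the guide on whichever form the released API actually exposes would resolve the inconsistency.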
- Update model name from 'o1-preview' to 'o1' (2 occurrences)
- Add clarification for ToolExecutionContext custom object injection
- Fix incomplete calculate() method example with proper placeholder
- Standardize SessionManager API usage pattern for consistency
AlbumenJ
left a comment
Thanks for the PR! I have a few minor suggestions for optimization:
- The `llm-guide` filename should have the `.md` extension.
- Please include instructions in the `README` on how to update this guide file. We will need to keep it up-to-date as our features iterate in the future.
Codecov Report: ✅ All modified and coverable lines are covered by tests.
- Rename agentscope-llm-guide.txt to agentscope-llm-guide.md
- Update all references in README.md, README_zh.md, and docs/llm/README.md
- Add maintenance instructions in both English and Chinese READMEs
- Instructions cover when and how to update the guide file
OK, it has been fixed. Please check.
README.md (Outdated)
> ## 🤖 AI-Powered Development
>
> AgentScope provides an LLM-friendly guide for AI code assistants like Cursor, Windsurf, and GitHub Copilot.
>
> **Quick Setup for Cursor:**
>
> 1. Open Cursor Settings → Features → Docs
> 2. Click "+ Add new Doc"
> 3. Add URL: `https://raw.githubusercontent.com/agentscope-ai/agentscope-java/main/docs/llm/agentscope-llm-guide.md`
>
> Then use `@docs` in Cursor chat to get context-aware code generation!
>
> For detailed setup instructions for other AI IDEs, see [docs/llm/README.md](./docs/llm/README.md).
>
> ### Maintaining the LLM Guide
>
> If you're a project contributor, please keep the LLM guide (`docs/llm/agentscope-llm-guide.md`) up-to-date when adding new features or modifying APIs.
>
> For detailed maintenance guidelines and recommended update prompts, see [CONTRIBUTING.md](./CONTRIBUTING.md#d-maintaining-the-llm-guide).
Move this part to the docs on the website. Keep the README clean.
docs/zh/intro.md (Outdated)
> Then use `@docs` in Cursor chat to get context-aware code generation!
>
> For detailed setup instructions and best practices for other AI IDEs, see [Using AI Code Assistants with AgentScope Java](../llm/README.md).
AgentScope-Java Version
1.0.2
Description
Background
Issue #148 requested an LLM-friendly guide for AI code assistants (Cursor, Windsurf, GitHub Copilot, Continue) to improve code generation accuracy when working with the AgentScope Java framework.
Purpose
Provide comprehensive documentation optimized for LLM consumption that helps AI code assistants generate accurate AgentScope Java code.
Changes Made
- Added `/docs/llm/agentscope-llm-guide.txt` (1,172 lines)
- Added `/docs/llm/README.md` (356 lines)
- Updated `README.md`
- Updated `README_zh.md`

Verification

- `agentscope-core/src/main/java/`

How to Test

Checklist

- `mvn spotless:apply`
- `mvn test`