Added AI Chat Mode #3
Conversation
Also removed the -ai -ask -followup combo requirements in lieu of a single -chat flag. This launches a BubbleTea TUI chat session with an LLM and prompt preconfigured with the context of the newly created summary. The chat log is then saved to the kOutputDir destination. Example usage: summarize -i "go,sh" -chat
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Pull Request Overview
This PR implements AI Chat Mode functionality for the summarize tool, allowing users to interact with an LLM-powered chat interface loaded with the context of their workspace summary. The implementation adds a -chat flag that opens a BubbleTea TUI chat session and integrates with various AI providers including Ollama.
Key Changes:
- Added AI chat functionality with configurable providers, models, and settings
- Implemented BubbleTea TUI for interactive chat interface
- Refactored codebase by extracting functionality into separate modules for better organization
Reviewed Changes
Copilot reviewed 16 out of 17 changed files in this pull request and generated 9 comments.
Show a summary per file
| File | Description |
|---|---|
| main.go | Major refactoring to extract functions and integrate chat functionality |
| chat.go | New BubbleTea TUI implementation for AI chat interface |
| ai.go | AI provider configuration and initialization logic |
| configure.go | Extracted configuration setup with new AI-related settings |
| const.go | Centralized constants including AI configuration keys |
| var.go, var_funcs.go | Extracted variable definitions and helper functions |
| type.go, type_funcs.go | Extracted type definitions and methods |
| version.go | Extracted version handling logic |
| env.go | Environment variable handling utilities |
| reflect.go | File reflection utilities for chat logs |
| simplify.go | String deduplication utility |
| gz.go | Compression/decompression utilities |
| go.mod | Added dependencies for BubbleTea UI and AI integration |
| VERSION | Version bump to v1.1.0 |
Now with the `-chat` option, you can open a BubbleTea TUI chat session loaded with the context of your summarized workspace, which is about to be saved to disk or printed to screen depending on the other options you combine with `-chat`. Example usage: `summarize -i "go,mod" -chat`. This lets you chat about your summarized workspace with the LLM of your choice. Tested with Ollama.