feat: create a simple chat interface for all supported providers#31

Open
billybonks wants to merge 1 commit into main from feat/simple_chat

Conversation

@billybonks
Contributor

This helps us easily test with different providers.

@managerbot-app bot left a comment

Reviewed Changes

Managerbot reviewed 5 out of 9 changed files in this pull request and generated 11 comments.

Files not reviewed (4)
  • Gemfile
  • Gemfile.lock
  • sample/simple_chat/.env-sample
  • sample/simple_chat/README.md
Comments suppressed due to low confidence (0)


```ruby
begin
  response = LlmGateway::Client.chat(@model, @conversation_history, api_key: @api_key)
```

Consider handling API response structure variations more robustly. The current nested if-else structure to extract content is fragile and may break with API changes.
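
One way to make extraction more tolerant is a `Hash#dig` fallback chain instead of nested if/else. The key paths below are illustrative of Anthropic- and OpenAI-style payloads; they are an assumption, not the gateway's confirmed schema:

```ruby
# Sketch: tolerate a couple of known response shapes with a single fallback
# chain. The dig paths are assumptions about the provider payloads.
def extract_content(response)
  response.dig('content', 0, 'text') ||                  # Anthropic-style shape
    response.dig('choices', 0, 'message', 'content') ||  # OpenAI-style shape
    'No response received from the model'
end
```

Unknown shapes fall through to the default string instead of raising inside a deep conditional.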

```ruby
  else
    "No response received from the model"
  end
rescue => e
```

The generic rescue clause will catch all exceptions. Consider rescuing specific exceptions to avoid masking unexpected errors.
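
A narrower version would name the failure modes it expects. The gateway's own error classes (if any) are not visible in this diff, so stdlib network and JSON exceptions stand in here as an assumption:

```ruby
require 'net/http'
require 'json'

# Sketch: rescue specific, expected exceptions and let everything else
# propagate instead of being silently swallowed by a bare rescue.
def send_with_rescue
  yield
rescue Net::OpenTimeout, Net::ReadTimeout => e
  "Request timed out: #{e.message}"
rescue JSON::ParserError => e
  "Malformed response: #{e.message}"
end
```

An unexpected `NoMethodError` or similar bug now surfaces normally rather than being reported as a chat error.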

```ruby
def get_api_key_for_model(model)
  if model.include?('claude')
    ENV['ANTHROPIC_API_KEY']
  elsif model.include?('llama') || model.include?('meta-llama') || model.include?('openai/gpt-oss')
```

There's duplicate logic between this method and the load_configuration method in SimpleChatRunner. Consider extracting this to a shared utility.
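
One shape for the shared utility is a module owning the model-to-env-var mapping. The module name and patterns below are hypothetical, and only `ANTHROPIC_API_KEY` appears in the diff; the second env var name is an assumption:

```ruby
# Hypothetical shared helper; both chat_client.rb and SimpleChatRunner
# could call this instead of duplicating the substring checks.
module ProviderConfig
  PROVIDER_PATTERNS = {
    /claude/ => 'ANTHROPIC_API_KEY',
    %r{llama|meta-llama|openai/gpt-oss} => 'GROQ_API_KEY' # env var name assumed
  }.freeze

  def self.api_key_env_for(model)
    PROVIDER_PATTERNS.each { |pattern, env| return env if model.match?(pattern) }
    nil
  end

  def self.api_key_for(model)
    env = api_key_env_for(model)
    env && ENV[env]
  end
end
```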

@@ -0,0 +1,56 @@
```ruby
require 'dotenv/load'
require_relative 'simple_chat'
require 'debug'
```

The 'debug' gem is required but not used in the code. Consider removing it if not needed.

```ruby
def load_configuration
  model = ENV['DEFAULT_MODEL'] || 'claude-opus-4-20250514'

  api_key = if model.include?('claude')
```

This API key detection logic is duplicated in chat_client.rb. Consider refactoring to avoid duplication.

```ruby
  handle_down_key
when 10, 13 # Enter key
  handle_enter_key
when Curses::KEY_BACKSPACE, 127, 8
```

Backspace handling should account for different backspace key codes (8, 127, KEY_BACKSPACE) but the implementation only removes one character regardless of which key was pressed.
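
Keeping the accepted key codes in one table makes the variants explicit. A pure-Ruby sketch of the idea (the names are hypothetical, and 263 stands in for `Curses::KEY_BACKSPACE` so it runs outside curses):

```ruby
# All backspace variants map to the same single-character deletion.
BACKSPACE_CODES = [8, 127, 263].freeze # 263 stands in for Curses::KEY_BACKSPACE

def handle_key(buffer, key)
  if BACKSPACE_CODES.include?(key)
    buffer[0...-1] || ''          # drop one character for any backspace code
  elsif key.is_a?(Integer) && key.between?(32, 126)
    buffer + key.chr              # printable ASCII appends to the buffer
  else
    buffer                        # ignore everything else
  end
end
```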

```ruby
@messages << { role: 'user', content: message }

begin
  response = @client.send_message(message)
```

Consider showing a loading indicator while waiting for the API response, as LLM responses can take several seconds to return.
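
One minimal approach is to run the call on a background thread and animate a spinner until it returns. This is a hypothetical helper, not part of the PR; it prints to stdout, whereas the curses app would redraw a status line instead:

```ruby
# Sketch: animate a spinner while a slow block runs on another thread.
def with_spinner(label = 'Thinking')
  frames = %w[| / - \\]
  result = nil
  worker = Thread.new { result = yield }
  i = 0
  until worker.join(0.1) # join with a timeout returns nil while still running
    print "\r#{label} #{frames[i % frames.size]}"
    i += 1
  end
  print "\r#{' ' * (label.length + 2)}\r" # clear the spinner line
  result
end
```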

```ruby
def draw_messages(y, x, height, width)
  return if @messages.empty?

  visible_messages = @messages.last([ height, @messages.length ].min)
```

This implementation only shows the most recent messages that fit on screen. Consider implementing proper scrolling to view message history.
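
Offset-based scrolling is one way to do this: track how many lines the user has scrolled back from the newest message and slice the visible window from there. A minimal sketch with hypothetical names:

```ruby
# Returns the slice of messages to draw, `scroll_offset` lines back
# from the end; offset 0 shows the newest messages.
def visible_messages(messages, height, scroll_offset)
  return [] if messages.empty?
  last  = messages.length - scroll_offset
  first = [last - height, 0].max
  messages[first...last] || []
end
```

Up/Down (or PageUp/PageDown) handlers would then just clamp and adjust `scroll_offset`.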

```ruby
  prefix = "Error: "
end

content = (prefix + message[:content]).slice(0, width)
```

Long messages will be truncated without any indication. Consider implementing word wrapping or a scrolling mechanism for long messages.
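
A greedy word-wrap helper is the small version of this fix: split the message into lines no wider than the pane and draw each on its own row. A sketch, not part of the PR:

```ruby
# Greedy word wrap: packs words onto each line until the next word
# would exceed `width`, then starts a new line.
def wrap_text(text, width)
  lines = ['']
  text.split(/\s+/).each do |word|
    if lines.last.empty?
      lines[-1] = word
    elsif lines.last.length + 1 + word.length <= width
      lines[-1] += ' ' + word
    else
      lines << word
    end
  end
  lines
end
```

Single words longer than `width` still overflow; a production version would also hard-break those.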

```ruby
Curses.setpos(y + 2 + i, x)
if i == @dropdown_index
  Curses.attron(Curses.color_pair(3)) # Highlight
end
```

The dropdown only shows the first 10 models regardless of which 10 are visible. Consider implementing proper scrolling in the dropdown menu when there are more items than can be displayed.
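
A sliding window that follows the highlighted index would fix this: shift the visible slice so the selection is always on screen. A hypothetical sketch (names and the window policy are illustrative):

```ruby
# Returns [visible_items, highlight_row]: a window of up to `visible`
# items that slides so the selected item is always inside it.
def dropdown_window(items, selected, visible = 10)
  return [items, selected] if items.length <= visible
  start = [selected - visible + 1, 0].max # keep selection at/above the bottom row
  [items[start, visible], selected - start]
end
```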
