
Conversation

JamesHWade (Collaborator)

No description provided.

@JamesHWade (Collaborator, Author)

@calderonsamuel here is what I have so far. It's not working yet, but I hope to have something workable in a few days.

@calderonsamuel (Collaborator)

I see. At first glance, I have two comments:

  1. I understand this is not immediately clear, but there is no need to have different UIs for stream and no-stream. If we don't pass any arguments to StreamHandler$new(), the app will just wait for the full response and render it once it arrives, behaving exactly as if the stream were not handled at all. renderStreamingMessage() is only displayed when a StreamHandler is initialized with both the shiny session and the user prompt.
  2. I agree that having S3 methods is a good step toward extensibility; however, we should agree on some naming conventions (request_base_hf(), query_huggingface_api(), and hf_create_completion() are all different!). I was even thinking of something like base_request(api = "huggingface") with classes c("huggingface", "llmrequest", "httr2_request"), or something similar; a rough sketch of that idea follows this list.
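
For concreteness, here is one way that kind of unified constructor plus S3 dispatch could look. This is only a sketch of the proposal, not the current gptstudio API; base_request(), add_auth(), base_url_for(), and the HF_API_KEY environment variable are hypothetical names used for illustration.

```r
library(httr2)

# Hypothetical helper: map a provider name to its API base URL.
base_url_for <- function(api) {
  switch(api,
    huggingface = "https://api-inference.huggingface.co",
    openai = "https://api.openai.com/v1"
  )
}

# Hypothetical constructor: build an httr2 request and tag it with
# provider-specific classes so downstream S3 methods can dispatch on them.
base_request <- function(api = c("huggingface", "openai")) {
  api <- match.arg(api)
  req <- request(base_url_for(api))
  class(req) <- c(api, "llmrequest", class(req))
  req
}

# Hypothetical generic with a per-provider method.
add_auth <- function(req, ...) UseMethod("add_auth")

add_auth.huggingface <- function(req, ...) {
  # The environment variable name is illustrative only.
  req_auth_bearer_token(req, Sys.getenv("HF_API_KEY"))
}

req <- add_auth(base_request("huggingface"))
class(req)
#> [1] "huggingface"   "llmrequest"    "httr2_request"
```

The point of the pattern is that every provider shares one entry point, and provider-specific behavior hangs off S3 methods rather than one-off function names.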

R/mod_chat.R (outdated)
Comment on lines 63 to 64:
session = session,
user_prompt = prompt$input_prompt

Removing these two lines removes rendering of the stream.
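
To make that concrete, here is a simplified sketch of the two initialization modes described above. It is not the exact code in R/mod_chat.R; the surrounding objects, such as prompt and session, come from the chat module and are shown only for illustration.

```r
# Streaming: passing the shiny session and the user prompt lets the handler
# push chunks to renderStreamingMessage() as they arrive.
stream_handler <- StreamHandler$new(
  session = session,
  user_prompt = prompt$input_prompt
)

# No streaming: with no arguments, the handler simply waits for the full
# response and renders it once it arrives, as if the stream were not handled.
stream_handler <- StreamHandler$new()
```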

@JamesHWade (Collaborator, Author)

Thank you for clarifying. I clearly need to read your blog post in more detail 😊.

Standardizing our naming convention is a must. Good idea! Maybe a good discussion thread topic?

@calderonsamuel (Collaborator)

> Thank you for clarifying. I clearly need to read your blog post in more detail 😊.
>
> Standardizing our naming convention is a must. Good idea! Maybe a good discussion thread topic?

Great! I've been thinking about this for the past few days. I'll open a thread about this by the end of the day.

@calderonsamuel (Collaborator)

Sorry for the delay; the discussion is in #111. Please check it out, @JamesHWade and @MichelNivard, and provide feedback if possible 🙏

JamesHWade marked this pull request as ready for review on June 25, 2023, at 22:12.
@JamesHWade (Collaborator, Author)

@calderonsamuel, please take a look before I merge. There's a lot in here, and I welcome your edits and suggestions.

@calderonsamuel (Collaborator)

Sure! I'll have some free time on Friday, and I see lots of changes! I'll try to run some benchmarks on the streaming part too, because at first glance it seems a bit slower than before.

@JamesHWade (Collaborator, Author)

@calderonsamuel - Checking in to ask if you've had a chance to review the PR. I'm open to reverting these changes and going back to an R6 approach for later releases.

@calderonsamuel (Collaborator)

I apologize for keeping you waiting. I had anticipated having more free time, but it didn't turn out that way. I agree that it's preferable to merge these changes and then assess what can be improved later. Please just don't forget to add the changelog entry to the NEWS file.

@JamesHWade (Collaborator, Author)

No need to apologize at all. I'll update the NEWS file and merge. Thank you for the guidance and suggestions on building out these classes!
