
Consider adding more parameters to the config file #16

Open
AndreaPi opened this issue Apr 24, 2023 · 2 comments
Labels
enhancement New feature or request

Comments

@AndreaPi

Hi,

I like your minimalistic approach a lot! But the lack of a few configurable parameters made me switch to https://github.com/j178/chatgpt. Could you add the following parameters to the yaml file?

  "prompts": {
    "default": "You are a helpful assistant"
     "pirate": "You are pirate Blackbeard. Arr matey!"},
  "conversation": {
    "prompt": "default",
    "stream": true,
    "max_tokens": 1024,
    "temperature": 0
  }

I would be happy to switch back! Basically, this is adding the following functionalities:

  1. the possibility to define one or more contexts directly in the yaml file. This is more convenient than having to carry around a separate file for each context and passing it via --context <FILE PATH>
  2. stream allows the tokens to be sent as they become available, rather than all at once at the end of the reply. This makes quite a difference with long responses and slower models such as GPT-4
  3. max_tokens is self-explanatory 🙂 and it also makes quite a difference when using GPT-4.
  4. temperature set to 0 gives deterministic responses (fundamental for reproducibility). From 0 to 2, higher values produce increasingly creative but less focused responses.

These are very simple modifications: you just need to read them from the yaml file and add them as extra parameters when posting the request. Thanks!
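A minimal sketch of what this request amounts to, assuming the yaml file has already been loaded into a dict with the same keys as the snippet above (the `build_payload` helper and the `model` default are hypothetical names for illustration, not the project's actual code):

```python
import json

# Example config, as it might look after loading the yaml file
# (same keys as the snippet in this issue).
config = {
    "prompts": {
        "default": "You are a helpful assistant",
        "pirate": "You are pirate Blackbeard. Arr matey!",
    },
    "conversation": {
        "prompt": "default",
        "stream": True,
        "max_tokens": 1024,
        "temperature": 0,
    },
}


def build_payload(config, user_message, model="gpt-4"):
    """Merge the configured conversation parameters into the request body."""
    conv = config["conversation"]
    # Look up the named context among the configured prompts.
    system_prompt = config["prompts"][conv["prompt"]]
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }
    # Forward the extra parameters straight into the request body.
    for key in ("stream", "max_tokens", "temperature"):
        if key in conv:
            payload[key] = conv[key]
    return payload


print(json.dumps(build_payload(config, "Hello!"), indent=2))
```

The point is simply that the three parameters pass through unchanged from the config file to the request body, so no per-parameter logic is needed.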

@marcolardera marcolardera added the enhancement New feature or request label Apr 25, 2023
@marcolardera
Owner

marcolardera commented Apr 25, 2023

Hi, thank you for your feedback! I think these are all very useful enhancements.

I just implemented 3 and 4, the temperature and max_tokens parameters, with the last commit (362bade).

1 is easy, and I will work on it as soon as I have a bit of time.

2 also seems like a cool feature, but I need to study how to render the streaming response in the console.

@AndreaPi
Author

Great! Looking forward to the implementation of 1 and 2. Regarding the latter, I understand it's a bit more complicated, but it would really enhance usability. As for rendering, since you use rich (good choice 👍), this could help:

https://rich.readthedocs.io/en/stable/live.html
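As a rough illustration of the accumulation side of this, here is a stdlib-only sketch. The chunk format follows the OpenAI server-sent-events stream; the actual rendering step is only noted in a comment, since wiring up rich's Live display is exactly the part that needs study:

```python
import json


def accumulate_stream(sse_lines):
    """Accumulate the assistant reply from OpenAI-style streaming chunks.

    Each chunk arrives as a server-sent event: a line starting with
    "data: " whose payload is a JSON object carrying a small "delta".
    """
    text = ""
    for line in sse_lines:
        if not line.startswith("data: "):
            continue
        data = line[len("data: "):]
        if data == "[DONE]":  # sentinel marking the end of the stream
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        text += delta.get("content", "")
        # With rich, this is where the partial reply would be re-rendered,
        # e.g. live.update(Markdown(text)) inside a rich.live.Live context.
    return text


# Simulated stream, as the API would deliver it chunk by chunk:
fake_stream = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Arr, "}}]}',
    'data: {"choices": [{"delta": {"content": "matey!"}}]}',
    "data: [DONE]",
]
print(accumulate_stream(fake_stream))  # → Arr, matey!
```

The Live page linked above shows the missing piece: updating a renderable in place on every chunk instead of printing new lines.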
