Description
Thank you for adding such a convenient feature in #490!
It would be even more useful if we could configure settings for each agent individually. For example, Perplexity AI supports the models described here:
https://docs.perplexity.ai/guides/model-cards
It might be helpful to have a configuration like the example below:
```lua
local opts = {
  debug = false,
  model = 'claude-3.5-sonnet', -- default model
  agents = { -- agent-specific configurations
    perplexityai = {
      model = 'llama-3.1-sonar-huge-128k-online', -- agent-specific model
    },
  },
  prompts = prompts,
}

local chat = require('CopilotChat')
chat.setup(opts)
```
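
For reference, here is a minimal sketch of how such per-agent overrides could be resolved, assuming a hypothetical `resolve_agent_config` helper (not part of the current CopilotChat API) that merges an agent's entry from `opts.agents` over the global defaults:

```lua
-- Hypothetical helper (assumption, not an existing CopilotChat function):
-- merge agent-specific overrides from opts.agents on top of the defaults.
-- Uses the `opts` table from the snippet above.
local function resolve_agent_config(opts, agent_name)
  local resolved = { debug = opts.debug, model = opts.model }
  local overrides = opts.agents and opts.agents[agent_name] or {}
  for key, value in pairs(overrides) do
    resolved[key] = value
  end
  return resolved
end

-- Selecting the Perplexity agent would pick up its own model,
-- while other agents keep the default 'claude-3.5-sonnet'.
local perplexity_config = resolve_agent_config(opts, 'perplexityai')
print(perplexity_config.model) -- 'llama-3.1-sonar-huge-128k-online'
```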