[webapp] issue call to backend LLM via @altbot nomenclature #47

@agf2013

Description

When an LLM is running in the backend, we need the ability to send queries to it directly via chat using the @altbot mention.
This task tracks the frontend work necessary to support this feature.
Messages sent in the chat are already received and processed by the backend, so the frontend work should be fairly minimal, but it should be tracked nonetheless.
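Below is a minimal sketch of what the frontend change might look like, assuming the chat client builds an outgoing message object before posting it to the backend. All names here (`OutgoingChatMessage`, `buildOutgoingMessage`, the `llmQuery` flag) are illustrative assumptions, not existing code in this repo; the actual contract depends on how the backend detects @altbot queries.

```ts
// Hypothetical sketch: tag a chat message as an LLM query when it mentions @altbot.
// The send path itself is unchanged; the backend still receives and processes the message.

const BOT_MENTION = "@altbot";

interface OutgoingChatMessage {
  text: string;
  // Optional hint for the backend router; may be unnecessary if the backend
  // parses the @altbot mention itself.
  llmQuery: boolean;
}

function buildOutgoingMessage(text: string): OutgoingChatMessage {
  return {
    text,
    llmQuery: text.includes(BOT_MENTION),
  };
}

// Example usage:
// const msg = buildOutgoingMessage("@altbot summarize the last 10 messages");
// msg.llmQuery === true
```

If the backend does all mention parsing, the frontend change could shrink further to UI affordances only (e.g. autocompleting `@altbot` in the composer).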
