
LLM integration #2073

Open

Description

@josevalim

We want to integrate LLMs into Livebook itself. There are at least four distinct levels at which this can happen:

  1. Code completion (may or may not need an LLM) (options: Codeium, Copilot, fine-tuned Repl.it model)

  2. Chat-based assistance (which also includes selecting code and asking to document it, as well as explaining exceptions) (options: Codeium, Copilot, OpenAI, Claude)

  3. Semantic search over all installed packages in the runtime (may or may not need an LLM) (options: Codeium, Copilot, OpenAI, Claude)

  4. Function completion (we can allow kino/smart-cells to register functions which we hook into prompts, similar to HF Agents; see the sketch below) (options: OpenAI)

This is a meta-issue and we are currently doing proofs of concept in different areas. There is no clear decision yet. We will most likely allow users to "bring their own LLM" for most of these categories, especially levels 2, 3, and 4 (either commercial or self-hosted).
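For level 4, here is a minimal sketch of what a registration hook could look like. None of these module or function names exist in Livebook or Kino today; they are purely hypothetical and modeled on function-calling chat APIs (such as OpenAI's), which expect a name, a description, and a JSON-schema parameter spec per function:

```elixir
# Purely hypothetical sketch: none of these modules or functions exist in
# Livebook/Kino today. A kino/smart cell registers a function (name,
# description, JSON-schema parameters, callback); Livebook could then attach
# the specs to the prompt and dispatch any function call the model returns.
defmodule LLMFunctionRegistry do
  use Agent

  def start_link(_opts \\ []) do
    Agent.start_link(fn -> %{} end, name: __MODULE__)
  end

  # Called by a kino/smart cell at setup time.
  def register(name, description, parameters_schema, callback)
      when is_binary(name) and is_function(callback, 1) do
    spec = %{name: name, description: description, parameters: parameters_schema}
    Agent.update(__MODULE__, &Map.put(&1, name, {spec, callback}))
  end

  # Function specs in the shape function-calling chat APIs expect,
  # ready to be sent alongside the prompt.
  def specs do
    Agent.get(__MODULE__, fn entries ->
      for {_name, {spec, _callback}} <- entries, do: spec
    end)
  end

  # Runs the registered callback when the model replies with a function call.
  def dispatch(name, args) when is_map(args) do
    case Agent.get(__MODULE__, &Map.fetch(&1, name)) do
      {:ok, {_spec, callback}} -> {:ok, callback.(args)}
      :error -> {:error, :unknown_function}
    end
  end
end
```

In this sketch, a smart cell would call `register/4` when it starts, Livebook would send `specs/0` along with the user prompt to a model that supports function calling, and `dispatch/2` would execute whatever call the model returns, feeding the result back into the conversation.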

Metadata

Assignees

No one assigned

    Labels

    area:backend (Related to the backend), area:frontend (Related to UI/UX), discussion (Needs to be discussed before moving forward), feature (New feature or request)


    Milestone

    No milestone
