Description
Problem
`DefaultInlineCompletionHandler` makes some choices which may be sub-optimal when using models tuned for code infill tasks.
For example, the reliance on the LangChain template system here:

```python
model_arguments = self._template_inputs_from_request(request)
suggestion = await self.llm_chain.ainvoke(input=model_arguments)
suggestion = self._post_process_suggestion(suggestion, request)
```
makes it hard to use structured prefix/suffix queries as described in
#669.
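For illustration, models tuned for fill-in-the-middle generally expect the prefix and suffix to be assembled with model-specific sentinel tokens rather than a flat prompt template. A minimal sketch (the token names below follow the CodeLlama infill convention and are an assumption, not what jupyter-ai does today):

```python
def build_infill_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt from the code before and
    after the cursor. The sentinel tokens follow the CodeLlama infill
    convention; other infill models use different markers."""
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

# The model is then expected to emit only the middle span.
prompt = build_infill_prompt("def add(a, b):\n    return ", "\n\nprint(add(1, 2))")
```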
Further, the post-processing assumptions may be different when using an infill model:
jupyter-ai/packages/jupyter-ai/jupyter_ai/completions/handlers/default.py, lines 130 to 132 in 4722dc7:

```python
def _post_process_suggestion(
    self, suggestion: str, request: InlineCompletionRequest
) -> str:
```
Proposed Solution
Either:
- (a) add a traitlet allowing users to swap the `DefaultInlineCompletionHandler` for a different class, or
- (b) add an entry point allowing packages to swap the `DefaultInlineCompletionHandler` for a different class.
Any preferences?
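Option (a) could look roughly like the sketch below, using a `Type` traitlet on a configurable class. All names here are hypothetical, and a local stand-in class replaces the real handler so the snippet is self-contained:

```python
from traitlets import Type
from traitlets.config import Configurable


class DefaultInlineCompletionHandler:
    """Stand-in for jupyter_ai's default handler (illustration only)."""


class CompletionConfig(Configurable):
    # Hypothetical traitlet: deployments could override it with a dotted
    # class path, e.g.
    #   --CompletionConfig.handler_class=my_pkg.handlers.InfillHandler
    handler_class = Type(
        default_value=DefaultInlineCompletionHandler,
        klass=object,
        config=True,
        help="Class used to handle inline completion requests.",
    )


cfg = CompletionConfig()
handler = cfg.handler_class()  # instantiates the configured class
```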
Additional context
Chat slash command handlers can already be added (swapped?) by using entry points:

```toml
[project.entry-points."jupyter_ai.chat_handlers"]
custom = "custom_package:CustomChatHandler"
```
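Option (b) would mirror that pattern. A sketch of how the server side could discover such a handler with `importlib.metadata` (the group name `jupyter_ai.inline_completion_handlers` is an assumption, not an existing entry point group):

```python
from importlib.metadata import entry_points


def load_completion_handler(group="jupyter_ai.inline_completion_handlers"):
    """Return the first handler class registered under the (hypothetical)
    entry point group, or None if no installed package provides one."""
    eps = entry_points()
    # Python 3.10+ supports selecting entry points by group; fall back to
    # the legacy dict-style access on older interpreters.
    selected = eps.select(group=group) if hasattr(eps, "select") else eps.get(group, [])
    for ep in selected:
        return ep.load()
    return None
```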