
Non-Null Default SamplingPipeline #973

Open
wants to merge 1 commit into master
Conversation

martindevans
Member

  • Made the SamplingPipeline default to a DefaultSamplingPipeline
  • Removed the deprecated sampler parameters

Should resolve #971
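Sketched in code, the effect of the change is roughly the following (a sketch, not the actual diff; `InferenceParams` and `DefaultSamplingPipeline` are the real LLamaSharp types, but the property initializer shown is only illustrative):

```csharp
using LLama.Common;
using LLama.Sampling;

// Before this PR, SamplingPipeline could be left null, which surfaced as a
// NullReferenceException in ChatSession.ChatAsync (see linked issue #971).
var inferenceParams = new InferenceParams
{
    // After this PR the property defaults to a DefaultSamplingPipeline, so
    // callers that never set it explicitly no longer hit a null pipeline
    // at inference time. Setting it here is equivalent to the new default.
    SamplingPipeline = new DefaultSamplingPipeline(),
};
```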

@En3Tho

En3Tho commented Nov 5, 2024

To be honest, I don't really get the design: I believe `InferenceParams` should just be part of `LLamaSharpPromptExecutionSettings`, so the user can provide all the necessary information in one place.

I think it's better to remove the duplicated properties (like `Temperature`, etc.) from `LLamaSharpPromptExecutionSettings` and add `InferenceParams` there directly. It feels really weird that those values are recreated per request for no apparent reason. Also, not all of the properties get translated, so `LLamaSharpPromptExecutionSettings` ends up forcing the user into a limited subset of settings.

This will also make overall usage more uniform: things you use in a non-semantic-kernel environment could be transferred in a straightforward way, and vice versa.

E.g.

Non-semantic-kernel:

```csharp
var ex = Executor(...)
var inferenceParams = ...
ex.Infer(inferenceParams)
```

Semantic Kernel:

```csharp
var ex = Executor(...)
var inferenceParams = ...
var promptExecutionSettings = (..., inferenceParams)
var completion = (ex, settings)
completion.GetChatMessageContentAsync(promptExecutionSettings)
```
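Fleshed out, the proposal above might look roughly like this (a sketch only: the `InferenceParams` property on `LLamaSharpPromptExecutionSettings` is the hypothetical addition being proposed here, not part of the current API, and the executor/chat-completion setup is abbreviated):

```csharp
using LLama.Common;
using LLamaSharp.SemanticKernel.ChatCompletion;

// Shared settings object, built once, usable in both environments.
var inferenceParams = new InferenceParams
{
    MaxTokens = 256,
    AntiPrompts = new[] { "User:" },
};

// Non-semantic-kernel path: pass the params straight to the executor.
await foreach (var token in executor.InferAsync(prompt, inferenceParams))
    Console.Write(token);

// Semantic Kernel path (hypothetical): the same object is embedded in the
// execution settings instead of being re-translated field by field.
var settings = new LLamaSharpPromptExecutionSettings
{
    InferenceParams = inferenceParams, // proposed property, does not exist today
};
var content = await chatCompletion.GetChatMessageContentAsync(history, settings);
```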

What do you think?

@martindevans
Member Author

I didn't write the original semantic kernel integration so I may be wrong here, but from looking at it I think `LLamaSharpPromptExecutionSettings` was written the way it was to keep serialization simple.

What you suggest sounds like a nicer API to use though, and you could certainly adapt the converter (https://github.com/SciSharp/LLamaSharp/blob/master/LLama.SemanticKernel/LLamaSharpPromptExecutionSettingsConverter.cs) to serialize the more complex object.

Nobody is really "in charge" of developing the semantic kernel stuff at the moment, so PRs in this area are very welcome!
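To illustrate why adapting the converter may be less work than it sounds, here is a minimal, self-contained sketch using stand-in record types (`SketchInferenceParams` and `SketchPromptExecutionSettings` are hypothetical, not the real LLamaSharp classes): System.Text.Json already handles a nested settings object without per-field mapping, which is the main job the existing hand-written converter performs today.

```csharp
using System;
using System.Text.Json;

// Build the nested shape the comment above proposes: the inference params
// live inside the execution settings rather than being duplicated as scalars.
var settings = new SketchPromptExecutionSettings(new SketchInferenceParams(256, 0.7f));

// The default serializer recurses into the nested record automatically,
// so a custom converter would only need to handle any special-cased fields.
Console.WriteLine(JsonSerializer.Serialize(settings));
// e.g. {"InferenceParams":{"MaxTokens":256,"Temperature":0.7}}

// Hypothetical stand-ins for InferenceParams / LLamaSharpPromptExecutionSettings.
public sealed record SketchInferenceParams(int MaxTokens, float Temperature);
public sealed record SketchPromptExecutionSettings(SketchInferenceParams InferenceParams);
```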

Development

Successfully merging this pull request may close these issues.

[BUG]: System.NullReferenceException in ChatSession.ChatAsync upgrading to 0.18