
Access LogProbs in ValidationOutcome #1272

Open
@abhishek9sharma

Description

Support for accessing logprobs in ValidationOutcome.
I am currently using something similar to https://github.com/guardrails-ai/guardrails-api/blob/main/guardrails_api/utils/openai.py#L4 to stream the validation outcome back to a downstream client in OpenAI format.

However, I am having trouble accessing the logprob information in the validation outcome. Is there a way to do this?

Why is this needed
Some downstream clients need per-token logprobs for logic on their end. More generally, since guardrails-ai is used as a proxy/wrapper around OpenAI endpoints, downstream clients may expect everything generated by OpenAI-compliant APIs to be passed through after validation.
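To illustrate, here is a minimal sketch of the OpenAI-style streaming chunk shape a downstream client typically consumes, including the per-token `logprobs` field a proxy would need to forward. The `make_chunk` helper is hypothetical, purely for illustration; it is not part of guardrails or the OpenAI SDK:

```python
# Hypothetical helper: builds an OpenAI-style chat.completion.chunk
# carrying per-token logprobs, the field downstream clients expect
# a proxy to forward alongside the validated content.
def make_chunk(token: str, logprob: float) -> dict:
    return {
        "object": "chat.completion.chunk",
        "choices": [
            {
                "index": 0,
                "delta": {"content": token},
                "logprobs": {
                    "content": [{"token": token, "logprob": logprob}]
                },
            }
        ],
    }

chunk = make_chunk("Hello", -0.12)
print(chunk["choices"][0]["logprobs"]["content"][0]["logprob"])  # -0.12
```

Today, a wrapper has no clean way to populate that `logprobs` field from a ValidationOutcome, which is the gap this request is about.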

End result
All the information guardrails fetches from a backend LLM API can be passed down with ValidationOutcome.

Labels: enhancement (New feature or request)