
Logprobs not returning from OpenAI #97

Closed
0xTomDaniel opened this issue Jul 12, 2024 · 2 comments

Comments

@0xTomDaniel

Describe the bug
According to this documentation, logprobs is supported. However, the response doesn't contain them. To confirm my request was correct, I swapped OpenRouter's endpoint for OpenAI's and everything worked as expected: all logprobs were returned.

To Reproduce

import httpx


async def get_openrouter_response(
    messages: list[Message],
    models: list[Model],
    temperature: Temperature | None = None,
    response_format: ResponseFormat | None = None,
    *,
    seed: int | None = None,
    logprobs: bool = False,
    top_logprobs: int | None = None,
) -> OpenRouterResponse:
    """
    Get a response from the OpenRouter API.
    """

    temperature = Temperature(value=0.7) if temperature is None else temperature

    async with httpx.AsyncClient(timeout=5.0) as client:
        response = await client.post(
            url="https://openrouter.ai/api/v1/chat/completions",
            # url="https://api.openai.com/v1/chat/completions",  # Swapping in OpenAI's endpoint returns logprobs as expected.
            headers={
                "Authorization": f"Bearer {SETTINGS.openrouter_api_key}",
                # "Authorization": f"Bearer {SETTINGS.openai_api_key}",
                "HTTP-Referer": SITE_URL,  # Optional, for including your app on openrouter.ai rankings.
                "X-Title": APP_NAME,  # Optional. Shows in rankings on openrouter.ai.
            },
            json={
                # "models": models,
                # "model": "gpt-4o",
                "model": "openai/gpt-4o-2024-05-13",
                "messages": [message.model_dump() for message in messages],
                "temperature": temperature.value,
                "seed": seed,
                "response_format": (
                    None if response_format is None else {"type": response_format}
                ),
                "logprobs": True,  # Requested explicitly, but missing from the OpenRouter response.
                "top_logprobs": top_logprobs,
            },
        )
    response.raise_for_status()
    # Assumes OpenRouterResponse can be built from the raw JSON payload.
    return OpenRouterResponse(**response.json())

Expected behavior
Return logprobs from supported providers and models.
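As a minimal sketch (not from the issue itself), this is how the expected logprobs can be read out of an OpenAI-style chat completion payload once they are forwarded; the sample payload below is hypothetical, but the field layout (`choices[0].logprobs.content`) matches OpenAI's documented response shape:

```python
def extract_token_logprobs(payload: dict) -> list[tuple[str, float]]:
    """Return (token, logprob) pairs, or an empty list if logprobs are absent."""
    logprobs = payload["choices"][0].get("logprobs") or {}
    return [(t["token"], t["logprob"]) for t in logprobs.get("content", [])]


# Hypothetical response payload, trimmed to the fields that matter here.
sample = {
    "choices": [
        {
            "message": {"role": "assistant", "content": "Hi"},
            "logprobs": {
                "content": [
                    {"token": "Hi", "logprob": -0.02},
                ]
            },
        }
    ]
}

print(extract_token_logprobs(sample))  # [('Hi', -0.02)]
```

The `or {}` guard is what the bug reported here would trip: before the fix, the OpenRouter response would yield an empty list instead of the token logprobs.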

@louisgv
Contributor

louisgv commented Jul 14, 2024

@0xTomDaniel thanks for the flag! Just pushed a fix to properly forward logprobs. The fix should be up ~10 mins after this message.

@0xTomDaniel
Author

> @0xTomDaniel thanks for the flag! Just pushed a fix to properly forward logprobs. The fix should be up ~10 mins after this message.

Thanks a bunch!
