better error on prompt len exceeded in rllm #81

@mmoskal

Description

The following program fails with stop_reason: "failed" but gives no indication why (the prompt exceeds the model's context length).

```python
import pyaici.server as aici

async def main():
    # The repeated fixed-token prompt exceeds the model's context length.
    await aici.FixedTokens("Some long text that will serve as a prompt for the model to generate more text. This is a test of the AICI server." * 500)
    await aici.gen_text(max_tokens=250, store_var="function", stop_at="```")

aici.start(main())
```
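A hedged sketch of the kind of check that could surface a descriptive error instead of a bare stop_reason: "failed". The names here (`check_prompt_len`, `PromptTooLongError`) are hypothetical, not the actual rLLM API:

```python
# Hypothetical sketch only: the names below are illustrative and do not
# correspond to actual rLLM/AICI internals.

class PromptTooLongError(ValueError):
    """Raised when the prompt does not fit in the model's context window."""

    def __init__(self, prompt_tokens: int, max_context: int):
        super().__init__(
            f"prompt length {prompt_tokens} tokens exceeds model context "
            f"window of {max_context} tokens"
        )
        self.prompt_tokens = prompt_tokens
        self.max_context = max_context


def check_prompt_len(prompt_tokens: int, max_context: int) -> None:
    # Reject oversized prompts up front with an explanatory message,
    # rather than failing later with an opaque stop reason.
    if prompt_tokens > max_context:
        raise PromptTooLongError(prompt_tokens, max_context)
```

Propagating such an error message into the stop reason (or a separate error field) would make failures like the repro above immediately diagnosable.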

Labels: bug (Something isn't working), rLLM