This repository has been archived by the owner on Jun 21, 2024. It is now read-only.

SteamSHP #15

Open
@conceptofmind

Description

import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration


def SteamSHP(input_query: str):
    # Use the GPU when one is available, otherwise fall back to CPU.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    tokenizer = AutoTokenizer.from_pretrained("stanfordnlp/SteamSHP-flan-t5-large")
    model = T5ForConditionalGeneration.from_pretrained(
        "stanfordnlp/SteamSHP-flan-t5-large"
    ).to(device)
    # Tokenize the query, generate a single token, and decode it back to text.
    x = tokenizer([input_query], return_tensors="pt").input_ids.to(device)
    y = model.generate(x, max_new_tokens=1)
    output = tokenizer.batch_decode(y, skip_special_tokens=True)
    return output
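
For reference, a minimal usage sketch. The prompt layout below (a POST followed by RESPONSE A and RESPONSE B, ending with "Which response is better? RESPONSE") follows the SteamSHP model card; the post and responses themselves are hypothetical, and the model is expected to emit a single token, "A" or "B", naming the preferred response:

# Hypothetical example; the POST / RESPONSE A / RESPONSE B layout is assumed
# from the SteamSHP model card, and the content is made up for illustration.
prompt = (
    "POST: How do I keep my sourdough starter alive while I'm away for two weeks?\n\n"
    " RESPONSE A: Feed it, then refrigerate it; it will keep for a couple of weeks.\n\n"
    " RESPONSE B: Just throw it out and buy a new one when you get back.\n\n"
    " Which response is better? RESPONSE"
)
print(SteamSHP(prompt))  # e.g. ['A'] if the model prefers response A

Note that the tokenizer and model are reloaded on every call; caching them at module level (or passing them in as arguments) would avoid that overhead for repeated queries.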
