Description
My code:
from vllm import LLM, SamplingParams

llm_model = LLM(model=model_path)
...
...
sampling_params = SamplingParams(
    temperature=temperature,
    top_p=top_p,
    top_k=top_k,
    max_tokens=max_new_tokens,
    frequency_penalty=repetition_penalty,  # the repetition penalty value is passed to frequency_penalty here
    ignore_eos=True,  # keep generating past the EOS token
)
outputs = llm_model.generate([prompt], sampling_params)
My parameters:
{
  "Temperature": 0.01,
  "Top-k": 70,
  "Top-p": 0.85,
  "Max new tokens": 4096,
  "Repetition Penalty": 1.2
}
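
For reference, here is a minimal self-contained sketch of the same setup with the values filled in. The model path and prompt are placeholders, not the ones from the actual run. Note that in recent vLLM versions `frequency_penalty` and `repetition_penalty` are two different `SamplingParams` fields, and `ignore_eos=True` makes generation continue past the EOS token up to `max_tokens`:

```python
from vllm import LLM, SamplingParams

# Placeholder values for a self-contained run; the real model path and prompt
# from the snippet above are not shown here.
model_path = "/path/to/model"
prompt = "Write a short sentence about penguins."

llm_model = LLM(model=model_path)

sampling_params = SamplingParams(
    temperature=0.01,
    top_p=0.85,
    top_k=70,
    max_tokens=4096,
    # In recent vLLM versions these are two separate fields:
    # frequency_penalty is an additive penalty (default 0.0),
    # repetition_penalty is a multiplicative penalty (default 1.0).
    repetition_penalty=1.2,
    # ignore_eos=True (as in the snippet above) makes generation continue
    # past the EOS token until max_tokens is reached.
    ignore_eos=True,
)

outputs = llm_model.generate([prompt], sampling_params)
print(outputs[0].outputs[0].text)
```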
Does anyone know where the problem is?