update for CB #714


Open
wants to merge 6 commits into main

Conversation

@ArthurZucker ArthurZucker commented May 9, 2025

Adds the necessary changes to call generate with continuous batching (CB).
Linked PR: huggingface/transformers#38085
This works:

from lighteval.logging.evaluation_tracker import EvaluationTracker
from lighteval.pipeline import Pipeline, PipelineParameters, ParallelismManager
from lighteval.models.transformers.transformers_model import TransformersModel
import torch
from transformers import AutoModelForCausalLM, GenerationConfig

BENCHMARKS = "lighteval|gsm8k|0|0"

evaluation_tracker = EvaluationTracker(output_dir="./results")
pipeline_params = PipelineParameters(
    use_chat_template=True, launcher_type=ParallelismManager.NONE, max_samples=None
)

# Load the model with the paged SDPA attention backend required for continuous batching.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.2-3b-Instruct",
    attn_implementation="sdpa_paged",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Configure generation parameters, including the paged KV-cache size (num_blocks x block_size).
generation_config = GenerationConfig(
    max_new_tokens=10,
    eos_token_id=model.config.eos_token_id,
    pad_token_id=model.config.pad_token_id,
    num_blocks=2048,
    block_size=256,
)
model.generation_config = generation_config
model = TransformersModel.from_model(model)
pipeline = Pipeline(
    model=model,
    pipeline_parameters=pipeline_params,
    evaluation_tracker=evaluation_tracker,
    tasks=BENCHMARKS,
)

pipeline.evaluate()
results = pipeline.get_results()["results"]
print(results)
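For reference, the paged KV-cache settings in the GenerationConfig above imply a fixed token budget shared across all in-flight sequences. A quick sanity check of that arithmetic (just the two values from the snippet, nothing transformers-specific):

```python
# Paged KV-cache capacity implied by the GenerationConfig above.
num_blocks = 2048   # number of cache blocks
block_size = 256    # token slots per block

# Total token slots available across all concurrent sequences.
total_slots = num_blocks * block_size
print(total_slots)  # 524288
```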

@ArthurZucker commented May 9, 2025

{'lighteval:gsm8k:0': defaultdict(<class 'float'>, {'extractive_match': 0.00530705079605762, 'extractive_match_stderr': 0.0020013057209480414}), 'all': {'extractive_match': 0.00530705079605762, 'extractive_match_stderr': 0.0020013057209480414}}

@ArthurZucker

I only generated 10 tokens, which explains why the score is so low.
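For context, continuous batching (the CB in the PR title) refills free batch slots from the waiting queue as soon as any sequence finishes, instead of waiting for the whole static batch to drain. A toy scheduler sketch of that idea, independent of the actual transformers implementation (all names here are illustrative):

```python
from collections import deque

def continuous_batching(requests, max_batch, steps_needed):
    """Toy continuous-batching loop: each request needs a given number of
    decode steps; finished slots are refilled immediately from the queue."""
    queue = deque(requests)
    active = {}            # request id -> remaining decode steps
    finished_order = []
    total_steps = 0
    while queue or active:
        # Refill free slots from the waiting queue (the key CB idea).
        while queue and len(active) < max_batch:
            rid = queue.popleft()
            active[rid] = steps_needed[rid]
        # One decode step for every active sequence.
        total_steps += 1
        for rid in list(active):
            active[rid] -= 1
            if active[rid] == 0:
                del active[rid]
                finished_order.append(rid)
    return finished_order, total_steps

# With a static batch of size 2, {a, b} would run 3 steps, then {c} 2 more
# (5 total); continuous batching slots c in as soon as b finishes.
order, steps = continuous_batching(
    requests=["a", "b", "c"], max_batch=2,
    steps_needed={"a": 3, "b": 1, "c": 2},
)
print(order, steps)  # ['b', 'a', 'c'] 3
```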

Successfully merging this pull request may close these issues.

[FT] Continuous batching for transformers
3 participants