
The minimum GPU resources needed to fine-tune the 32B model? #103

Open
@JaydencoolCC

Description

Hi!
Could you let me know the minimum GPU resources needed to fine-tune the 32B model?
I tried to train the 32B model on 8 L40 GPUs, but the process was interrupted during model loading; it looks like the model was never actually loaded onto the GPUs.
Training the 3B model worked fine, though.
I would like to know whether I can train the 32B model with 8 L40 GPUs (8*40G).
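
For context, here is the rough back-of-envelope estimate I have been using to reason about memory (just a sketch, assuming full fine-tuning in bf16 with Adam and fully sharded optimizer/gradient/parameter states, ZeRO-3/FSDP style; the exact numbers will depend on the training framework, sequence length, and activation checkpointing):

```python
# Rough per-GPU memory estimate for full fine-tuning (a sketch; actual usage
# depends on the framework, activations, sequence length, etc.).

def full_finetune_memory_gb(params_b: float, num_gpus: int) -> float:
    """Estimate per-GPU memory (GB) for bf16 full fine-tuning with Adam,
    assuming states are fully sharded across GPUs (ZeRO-3 / FSDP).
    Per-parameter cost: 2 bytes (bf16 weights) + 2 bytes (bf16 grads)
    + 12 bytes (fp32 master weights + Adam m/v) = 16 bytes."""
    bytes_per_param = 2 + 2 + 12
    total_gb = params_b * 1e9 * bytes_per_param / 1e9
    return total_gb / num_gpus

for size in (3, 32):
    print(f"{size}B model on 8 GPUs: ~{full_finetune_memory_gb(size, 8):.0f} GB/GPU "
          "(before activations and buffers)")
# 3B  -> ~6 GB/GPU, which fits easily on an L40
# 32B -> ~64 GB/GPU, which already exceeds the 40-48 GB of a single L40
```

If this estimate is roughly right, it would explain why the 3B run succeeds while the 32B run fails before training starts.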
Thanks!

