
attn_bias not aligned & some questions regarding float16 #468

Closed
@MM-IR

Description


Hi,

  1. When playing with MPT-7B models and tensor_parallel_size=2, I frequently hit the "attn_bias not aligned" error. How can I alleviate this issue?

  2. Also, I noticed that your default model loading scripts load the float16 weights. For a fair evaluation, is it necessary to switch to float32? (A sketch of the loading call I mean follows this list.)
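
For reference, here is a minimal sketch of the loading call I have in mind. I am assuming the vLLM Python API here, with mosaicml/mpt-7b standing in as the Hugging Face checkpoint ID; adjust both to your setup:

```python
from vllm import LLM, SamplingParams

# tensor_parallel_size=2 shards the model across two GPUs;
# dtype="float32" overrides the default half-precision loading.
llm = LLM(
    model="mosaicml/mpt-7b",  # assumed checkpoint; swap in your own
    tensor_parallel_size=2,
    dtype="float32",
    trust_remote_code=True,   # MPT ships custom modeling code on the Hub
)

params = SamplingParams(temperature=0.0, max_tokens=64)
outputs = llm.generate(["Hello, my name is"], params)
print(outputs[0].outputs[0].text)
```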

Thanks very much in advance!
