
intel-extension-for-pytorch vs. intel-extension-for-transformers #451

Open
@NaamaVian

Description


Describe the issue

Hello,

I noticed that intel-extension-for-transformers exists, even though intel-extension-for-pytorch also supports deploying LLMs. What is the difference between the two, and which is better suited for deploying Llama-2-70B?

Thanks!
