
[Feature Request]: Add support for Mixtral-8x7B-Instruct-v0.1-q4 #77

@mchaliadzinau


Problem Description

Could you please add the following model to WebLLM Chat: Mixtral-8x7B-Instruct-v0.1-q4?

Solution Description

From my understanding, the missing piece is a custom model library, right?

I can try to do it myself and prepare a PR, if that is feasible for someone with no background in the field; unfortunately, I'm not sure whether it is even possible on an Intel MacBook.
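To frame what "custom model library" means here, a minimal sketch of registering one in WebLLM, assuming an `AppConfig` with a `model_list` entry; the URLs are placeholders, and the weights would need MLC conversion plus a compiled WebGPU wasm library:

```typescript
// Hypothetical sketch: registering a custom model library in WebLLM.
// The URLs below are placeholders, not real artifact locations.
import { CreateMLCEngine, type AppConfig } from "@mlc-ai/web-llm";

const appConfig: AppConfig = {
  model_list: [
    {
      // MLC-converted weights (placeholder URL)
      model: "https://huggingface.co/<user>/Mixtral-8x7B-Instruct-v0.1-q4f16_1-MLC",
      model_id: "Mixtral-8x7B-Instruct-v0.1-q4f16_1-MLC",
      // Compiled WebGPU model library (placeholder URL)
      model_lib: "https://<host>/Mixtral-8x7B-Instruct-v0.1-q4f16_1-webgpu.wasm",
    },
  ],
};

// Requires a WebGPU-capable browser; this runs in-browser, not in Node.
const engine = await CreateMLCEngine(
  "Mixtral-8x7B-Instruct-v0.1-q4f16_1-MLC",
  { appConfig },
);
```

This is a configuration sketch under those assumptions, not a tested setup; the hard part remains producing the converted weights and the wasm library for this model size.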

Alternatives Considered

WebLLM (WebGPU) seems to be the only viable option for running it on an Intel Mac, since mlc-ai is not an option: mlc-ai/mlc-llm#3078.

Additional Context

No response

Metadata

Assignees

No one assigned

Labels

enhancement (New feature or request)
