Prerequisites
- I am running the latest code. Mention the version if possible as well.
- I carefully followed the README.md.
- I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- I reviewed the Discussions, and have a new and useful enhancement to share.
Feature Description
Please support Falcon Mamba 7B from TII (Technology Innovation Institute, UAE).
Motivation
Support for as many model architectures as possible is helpful.
My acid test for whether a model will run is to try making a quant with the "GGUF-my-repo" Hugging Face space.
Admittedly the model is hot off the presses, but it ought to convert at least in theory; currently it doesn't.
Error converting to fp16:
INFO:hf-to-gguf:Loading model: falcon-mamba-7b
ERROR:hf-to-gguf:Model FalconMambaForCausalLM is not supported
Possible Implementation
TII announces the model and describes its architecture here: https://falconllm.tii.ae/tii-releases-first-sslm-with-falcon-mamba-7b.html
Generic support for any functional Mamba or Mamba 2 model would be great, but this one is slightly modified relative to the base Mamba architecture.
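The error above comes from the conversion script not recognizing the `FalconMambaForCausalLM` architecture string from the model's config.json. As a rough sketch of the kind of change involved: HF-to-GGUF converters typically dispatch on that string via a registry of converter classes, so the fix may amount to registering the new name against a (possibly subclassed) Mamba converter. The toy registry below only illustrates that pattern; the names `register`, `MambaModel`, and `get_model_class` are assumptions for illustration, not llama.cpp's actual API.

```python
# Toy illustration of architecture-name dispatch in an HF->GGUF
# converter; NOT llama.cpp's actual code.
_MODEL_CLASSES: dict[str, type] = {}

def register(*names):
    """Map one or more architecture strings to a converter class."""
    def wrap(cls):
        for name in names:
            _MODEL_CLASSES[name] = cls
        return cls
    return wrap

@register("MambaForCausalLM")
class MambaModel:
    def convert(self) -> str:
        return "mamba tensors mapped to GGUF"

# Hypothetical: if Falcon Mamba's tensor layout really is close enough
# to base Mamba, support could start as a thin subclass registered
# under the new architecture string (any real differences, e.g. extra
# norms, would go here).
@register("FalconMambaForCausalLM")
class FalconMambaModel(MambaModel):
    pass

def get_model_class(arch: str) -> type:
    try:
        return _MODEL_CLASSES[arch]
    except KeyError:
        # Mirrors the "Model ... is not supported" failure mode above.
        raise SystemExit(f"ERROR:hf-to-gguf:Model {arch} is not supported")

print(get_model_class("FalconMambaForCausalLM").__name__)  # → FalconMambaModel
```

Whether a plain subclass suffices depends on how far Falcon Mamba's layer layout diverges from base Mamba.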