
Conversation

@mapmeld (Contributor) commented on Sep 21, 2024

Summary

This PR adds support for Mamba models (by default mamba-2.8b), which are an alternative to attention-based LLMs.

Changes include:

  • adding Mamba to the models/ directory, a MambaEngine to the engines/ directory, and updates to the config files
  • upgrading the Transformers dependency to v4.39, the earliest release that includes MambaForCausalLM (a rough loading sketch follows this list)
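
For orientation, here is a minimal sketch of what the new engine is assumed to do under the hood with the upgraded dependency. This is not the PR's actual MambaEngine code, and the checkpoint id `state-spaces/mamba-2.8b-hf` is an assumption standing in for the default mamba-2.8b model:

```python
# Hedged sketch: loading Mamba through Transformers >= 4.39, roughly what the
# new MambaEngine is assumed to wrap. The checkpoint id below is an assumption
# for the PR's default mamba-2.8b model.
from transformers import AutoTokenizer, MambaForCausalLM

checkpoint = "state-spaces/mamba-2.8b-hf"  # assumed HF id for mamba-2.8b
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = MambaForCausalLM.from_pretrained(checkpoint)

inputs = tokenizer("Mamba is a state space model that", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```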

Notes:

  • if upgrading transformers should be avoided, the Mamba model implementation could instead be vendored into the models/ directory
  • Mamba does support PEFT / adapters, so that could be added later (a hedged sketch follows these notes)
  • the default could be a smaller Mamba model, or a selection of Mamba model sizes could be offered
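
To make the PEFT note concrete, here is a hedged sketch of attaching LoRA adapters to a Mamba checkpoint. The target module names are an assumption about Mamba's projection layers, and none of this is part of this PR:

```python
# Hedged sketch, not part of this PR: LoRA on a Mamba checkpoint via peft.
# target_modules are an assumption about Mamba mixer projection layer names.
from peft import LoraConfig, get_peft_model
from transformers import MambaForCausalLM

base = MambaForCausalLM.from_pretrained("state-spaces/mamba-2.8b-hf")  # assumed id
lora_cfg = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["in_proj", "x_proj", "out_proj"],  # assumed layer names
    task_type="CAUSAL_LM",
)
peft_model = get_peft_model(base, lora_cfg)
peft_model.print_trainable_parameters()
```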

Checklist

Demonstrated inference in this notebook: https://colab.research.google.com/drive/1-i4xmsyppWBdwR1qt6QN21m0uiXu0guM?usp=sharing

  • Mamba is loadable with model = BaseModel.create("mamba")
  • Model generates text with model.generate(...)
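
For reference, the checklist steps above combine into the following minimal usage sketch; the `xturing.models` import path and the `texts=` argument to `generate` are assumptions about the library's usual interface and may differ slightly:

```python
# Hedged usage sketch mirroring the checklist; import path and generate()
# signature are assumptions about xTuring's interface.
from xturing.models import BaseModel

model = BaseModel.create("mamba")  # model key added by this PR
outputs = model.generate(texts=["What is a state space model?"])
print(outputs)
```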

Commits

- load via MambaForCausalLM
- upgrade Transformers
- add mamba to yamls
@MarcosRiveraMartinez merged commit 570a0d6 into stochasticai:main on Sep 23, 2024 (1 check passed).
@MarcosRiveraMartinez (Contributor) commented:

@mapmeld Thanks for your contribution!

@mapmeld deleted the mamba branch on September 23, 2024 at 17:44.
glennko pushed a commit referencing this pull request on Sep 28, 2025: "Add Mamba to available LLMs".