Before training supervised fine-tuned (SFT) models, try out a couple of pre-trained models.

Options:
* [BERT](https://huggingface.co/docs/transformers/model_doc/bert)
* [DistilBERT](https://huggingface.co/docs/transformers/model_doc/distilbert)
* [RoBERTa](https://huggingface.co/docs/transformers/model_doc/roberta)
* [LLaMA](https://huggingface.co/docs/transformers/model_doc/llama)
* ...

Notes:
* consider the size of the LLMs (larger models need more memory and compute to run and fine-tune)
* prefer checkpoints with a head for text-classification tasks
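As a minimal sketch of trying a pre-trained model before any fine-tuning, the Hugging Face `pipeline` API can load a text-classification checkpoint in a few lines. The checkpoint name below (`distilbert-base-uncased-finetuned-sst-2-english`) is one common public DistilBERT sentiment model chosen for illustration; any of the options listed above with a classification head would work the same way.

```python
# Sketch: evaluate a pre-trained text-classification model out of the box,
# before investing in supervised fine-tuning (SFT).
from transformers import pipeline

# A small, fast DistilBERT checkpoint already fine-tuned on SST-2 sentiment.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

result = classifier("This model works surprisingly well out of the box.")
print(result)  # a list with one dict containing 'label' and 'score'
```

Running a handful of representative examples through such a baseline gives a reference point for whether SFT is worth the cost, and how much it actually improves over the pre-trained model.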