Fine-tuning Mistral 7B on Google Colab
Updated Feb 20, 2024 - Jupyter Notebook
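The notebook itself is not reproduced in this listing, but a common way to fit a 7B model into Colab's GPU memory for fine-tuning is 4-bit QLoRA. The sketch below is a minimal, assumed setup using the Hugging Face transformers, peft, and bitsandbytes libraries; the LoRA hyperparameters and quantization settings are illustrative placeholders, not the notebook's actual configuration.

```python
# Minimal QLoRA-style setup sketch for Mistral-7B on a single Colab GPU.
# Assumes transformers, peft, and bitsandbytes are installed; hyperparameters are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_model = "mistralai/Mistral-7B-v0.1"

# Load the base model in 4-bit so it fits in roughly 15 GB of GPU memory.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

# Attach small LoRA adapters; only these weights are trained (training loop omitted).
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```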
LocalPrompt is an AI-powered tool designed to refine and optimize AI prompts, helping users run locally hosted AI models like Mistral-7B for privacy and efficiency. Ideal for developers seeking to run LLMs locally without external APIs.
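As a rough illustration of the kind of local, API-free inference LocalPrompt targets, the sketch below loads an instruction-tuned Mistral-7B checkpoint with Hugging Face transformers and generates a refined prompt. The model id, prompt, and generation settings are assumptions for illustration, not LocalPrompt's actual code.

```python
# Sketch of fully local inference with a Mistral-7B instruct checkpoint (no external API calls).
# The model id and prompt are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Mistral instruct models expect a chat template; apply_chat_template builds the [INST] wrapper.
messages = [{"role": "user", "content": "Rewrite this prompt to be more specific: 'Summarize the report.'"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200, do_sample=False)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```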
This repository provides a merged LoRA fine-tuned version of Mistral-7B, specialized for health insurance Q&A and domain-specific chat tasks. The LoRA weights are already integrated into the base model, making it usable as a standalone model for instruction-following and conversational use cases.
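Because the LoRA weights are already merged into the base model, the checkpoint loads like any ordinary causal language model, with no peft adapter step. The sketch below assumes a hypothetical repository id and a Mistral-style [INST] prompt; substitute the real model id published by this repository.

```python
# Sketch of using a merged LoRA checkpoint as a standalone model.
# "your-org/mistral-7b-health-insurance" is a placeholder, not the actual repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/mistral-7b-health-insurance"  # placeholder id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Merged adapters mean standard generation works directly on domain-specific questions.
prompt = "[INST] Does a standard health insurance plan cover annual physical exams? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```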