Add Jupyter notebook example for Bitnet.cpp #81

Open: wants to merge 1 commit into base: main
207 changes: 207 additions & 0 deletions Bitnet_Jupyter_example.ipynb
@@ -0,0 +1,207 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"\n",
"\n",
"<span style=\"font-size: 35px;\">Setting Up and Running BitNet On a Notebook: A Step-by-Step</span>\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<span style=\"font-size: 24px;color:brown\">Go to the terminal and follow these steps:</span>\n",
"\n",
"\n",
"\n",
"1. **Clone the Repository** \n",
" ```sh\n",
" git clone --recursive https://github.com/microsoft/BitNet.git\n",
" cd BitNet\n",
" ```\n",
"\n",
"2. **Step 1.2: Create a New Conda Environment**\n",
" ```sh\n",
" conda create -n bitnet-cpp python=3.9 -y\n",
" conda activate bitnet-cpp\n",
" ```\n",
"\n",
"3. **Install the dependencies**\n",
" ```sh\n",
" pip install -r requirements.txt\n",
" ```\n",
"\n",
"4. **Aditionally install ipykernel for Jupyter Notebook** \n",
" ```sh\n",
" pip install ipykernel \n",
" ```\n",
"\n",
"5. **Download the model from Huggingface**\n",
" ```sh\n",
" huggingface-cli download HF1BitLLM/Llama3-8B-1.58-100B-tokens --local-dir models/Llama3-8B-1.58-100B-tokens**\n",
" ```\n",
"\n",
"6. **Quantize and prepare the model**\n",
" ```sh\n",
" python setup_env.py -md models/Llama3-8B-1.58-100B-tokens -q i2_s\n",
" ```\n",
"\n",
"\n",
"7. **Other installations if required** \n",
" ```sh\n",
" conda install -n bitnet-cpp -c conda-forge cmake\n",
" sudo apt install cmake build-essential clang (for linux)\n",
" brew install cmake (macOS)\n",
" sudo apt install clang \n",
" ```\n",
"\n",
"8. **Check Clang version and are compatible**\n",
" ```sh\n",
" clang --version\n",
" clang++ --version\n",
" ```\n",
"\n",
"9. **Manually add them to your path**\n",
" ```sh\n",
" export CC=/usr/bin/clang\n",
" export CXX=/usr/bin/clang++\n",
" ```\n",
"\n",
"10. **Install libstdc++-12-dev if missing**\n",
" ```sh\n",
" sudo apt-get install libstdc++-12-dev\n",
" ```\n",
"\n",
"\n",
"\n",
"\n",
"\n"
]
},
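{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Optional (not part of the upstream BitNet instructions):* the next cell is a quick sanity check that the build tools from steps 7-10 are actually on your PATH before you try to build.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check: confirm the toolchain from steps 7-10 is installed.\n",
"# Assumption: clang, clang++ and cmake should be discoverable on PATH.\n",
"import shutil\n",
"\n",
"for tool in ('clang', 'clang++', 'cmake'):\n",
"    location = shutil.which(tool)\n",
"    print(tool, '->', location if location else 'NOT FOUND (see steps 7-10 above)')\n"
]
},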
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"# Import Necessary Libraries\n",
"import os\n",
"import subprocess\n",
"import platform\n"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"# Set Up Paths\n",
"'''Make sure you have the correct paths to the model and the executable. In Jupyter Notebook, we'll define these as variables:'''\n",
"\n",
"model_path = \"models/Llama3-8B-1.58-100B-tokens/ggml-model-i2_s.gguf\"\n",
"\n",
"# Determine the executable path\n",
"if platform.system() == \"Windows\":\n",
" main_path = os.path.join(\"build\", \"bin\", \"Release\", \"llama-cli.exe\")\n",
" if not os.path.exists(main_path):\n",
" main_path = os.path.join(\"build\", \"bin\", \"llama-cli\")\n",
"else:\n",
" main_path = os.path.join(\"build\", \"bin\", \"llama-cli\")\n"
]
},
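{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Optional (not in the original example):* before running inference, confirm that both the quantized model and the compiled `llama-cli` binary exist at the paths defined above. This assumes the notebook is running from the BitNet repository root.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check: both files must exist before inference can work.\n",
"for label, path in [('model', model_path), ('executable', main_path)]:\n",
"    if os.path.exists(path):\n",
"        print(label, 'found:', path)\n",
"    else:\n",
"        print(label, 'MISSING:', path, '- re-run the setup steps above')\n"
]
},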
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"# Create a Function to Run Inference\n",
"'''This function replicates what run_inference.py does, but can be executed directly in the notebook''' \n",
"\n",
"def run_inference(model, prompt, n_predict=6, threads=2, ctx_size=2048, temperature=0.8):\n",
" \"\"\"Run inference with the specified arguments\"\"\"\n",
" command = [\n",
" f'{main_path}',\n",
" '-m', model,\n",
" '-n', str(n_predict),\n",
" '-t', str(threads),\n",
" '-p', prompt,\n",
" '-ngl', '0',\n",
" '-c', str(ctx_size),\n",
" '--temp', str(temperature),\n",
" \"-b\", \"1\"\n",
" ]\n",
" try:\n",
" result = subprocess.run(command, check=True, text=True, capture_output=True)\n",
" print(result.stdout)\n",
" except subprocess.CalledProcessError as e:\n",
" print(f\"Error occurred: {e}\")\n"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Daniel went back to the garden. Mary travelled to the kitchen. Sandra journeyed to the hallway. John went to the bedroom. Mary went back to the garden. Where is Mary?\n",
"Answer: Mary is in the garden.\n",
"\n",
"\n",
"\n"
]
}
],
"source": [
"# Run Inference\n",
"prompt_text = \"Daniel went back to the garden. Mary travelled to the kitchen. Sandra journeyed to the hallway. John went to the bedroom. Mary went back to the garden. Where is Mary?\\nAnswer:\"\n",
"run_inference(model_path, prompt_text, n_predict=6, temperature=0.6)\n"
]
},
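{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Optional extension (not part of the original example):* the same helper can be looped over several prompts. Note that each call spawns a fresh `llama-cli` process, so there is no shared context between prompts.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Run a small batch of prompts through the helper defined above.\n",
"prompts = [\n",
"    'The capital of France is',\n",
"    'Sandra journeyed to the office. Where is Sandra?\\nAnswer:',\n",
"]\n",
"for p in prompts:\n",
"    print('### PROMPT:', p)\n",
"    run_inference(model_path, p, n_predict=8, temperature=0.6)\n"
]
},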
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "bitnet-cpp",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.20"
}
},
"nbformat": 4,
"nbformat_minor": 2
}