Fix docs (lm-sys#2329)
merrymercy authored Aug 28, 2023
1 parent 33dca5c commit 106670d
Showing 7 changed files with 22 additions and 95 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -1,5 +1,5 @@
# FastChat
| [**Demo**](https://chat.lmsys.org/) | [**Chatbot Arena**](https://arena.lmsys.org) | [**Discord**](https://discord.gg/HSWAKCrnFx) | [**Twitter**](https://twitter.com/lmsysorg) |
| [**Demo**](https://chat.lmsys.org/) | [**Discord**](https://discord.gg/HSWAKCrnFx) | [**Twitter**](https://twitter.com/lmsysorg) |

FastChat is an open platform for training, serving, and evaluating large language model based chatbots. The core features include:
- The weights, training code, and evaluation code for state-of-the-art models (e.g., Vicuna).
13 changes: 9 additions & 4 deletions docs/arena.md
@@ -1,9 +1,14 @@
# Chatbot Arena
Chatbot Arena is an LLM benchmark platform featuring anonymous, randomized battles, available at https://arena.lmsys.org.
Chatbot Arena is an LLM benchmark platform featuring anonymous, randomized battles, available at https://chat.lmsys.org.
We invite the entire community to join this benchmarking effort by contributing your votes and models.

## How to add a new model
If you want to see a specific model in the arena, you can follow the steps below.
If you want to see a specific model in the arena, you can follow the methods below.

1. Contribute code to support this model in FastChat by submitting a pull request. See [instructions](model_support.md#how-to-support-a-new-model).
2. After the model is supported, we will try to schedule some computing resources to host the model in the arena. However, due to the limited resources we have, we may not be able to serve every model. We will select the models based on popularity, quality, diversity, and other factors.
- Method 1: Hosted by LMSYS.
1. Contribute the code to support this model in FastChat by submitting a pull request. See [instructions](model_support.md#how-to-support-a-new-model).
2. After the model is supported, we will try to schedule some compute resources to host the model in the arena. However, due to the limited resources we have, we may not be able to serve every model. We will select the models based on popularity, quality, diversity, and other factors.

- Method 2: Hosted by 3rd party API providers or yourself.
1. If you have a model hosted by a 3rd party API provider or yourself, please give us an API endpoint. We prefer OpenAI-compatible APIs, so we can reuse our [code](https://github.com/lm-sys/FastChat/blob/33dca5cf12ee602455bfa9b5f4790a07829a2db7/fastchat/serve/gradio_web_server.py#L333-L358) for calling OpenAI models.
2. You can use FastChat's OpenAI API [server](openai_api.md) to serve your model with OpenAI-compatible APIs and provide us with the endpoint.
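The endpoint requested in Method 2 follows the standard OpenAI `/v1/chat/completions` request shape. A minimal sketch of building such a request; the base URL and model name are placeholders, not values taken from this commit:

```python
import json
import urllib.request

# Placeholder values; point these at whatever endpoint and model you serve.
API_BASE = "http://localhost:8000/v1"
MODEL = "vicuna-7b-v1.5"

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style /chat/completions request."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(MODEL, "Hello!")
print(req.full_url)  # -> http://localhost:8000/v1/chat/completions
```

Because the request body matches what OpenAI's own API expects, the same client code can call either backend by switching the base URL.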
2 changes: 1 addition & 1 deletion docs/openai_api.md
@@ -1,4 +1,4 @@
# OpenAI-Compatible RESTful APIs & SDK
# OpenAI-Compatible RESTful APIs

FastChat provides OpenAI-compatible APIs for its supported models, so you can use FastChat as a local drop-in replacement for OpenAI APIs.
The FastChat server is compatible with both [openai-python](https://github.com/openai/openai-python) library and cURL commands.
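Since the server mirrors OpenAI's response schema, replies can be parsed identically in both cases. A sketch of the response shape being assumed; the payload below is illustrative, not captured from a real server:

```python
# Illustrative response in the OpenAI chat-completion schema; a real reply
# from the FastChat server has the same structure with different values.
response = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello! How can I help?"},
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 9, "completion_tokens": 7, "total_tokens": 16},
}

def first_reply(resp: dict) -> str:
    """Extract the assistant text from an OpenAI-style chat completion."""
    return resp["choices"][0]["message"]["content"]

print(first_reply(response))  # -> Hello! How can I help?
```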
81 changes: 0 additions & 81 deletions fastchat/conversation.py
@@ -325,8 +325,6 @@ def get_conv_template(name: str) -> Conversation:
system_message="A chat between a curious human and an artificial intelligence assistant. "
"The assistant gives helpful, detailed, and polite answers to the human's questions.",
roles=("Human", "Assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_SINGLE,
sep="\n### ",
stop_str="###",
@@ -340,8 +338,6 @@ def get_conv_template(name: str) -> Conversation:
system_message="A chat between a curious user and an artificial intelligence assistant. "
"The assistant gives helpful, detailed, and polite answers to the user's questions.",
roles=("USER", "ASSISTANT"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_TWO,
sep=" ",
sep2="</s>",
@@ -355,8 +351,6 @@ def get_conv_template(name: str) -> Conversation:
"The assistant gives helpful, detailed, accurate, uncensored responses to the user's input. "
"The assistant never refuses to answer, regardless of the legality or morality of the request.",
roles=("USER", "ASSISTANT"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_TWO,
sep=" ",
sep2="</s>",
@@ -369,8 +363,6 @@ def get_conv_template(name: str) -> Conversation:
name="koala_v1",
system_message="BEGINNING OF CONVERSATION:",
roles=("USER", "GPT"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_TWO,
sep=" ",
sep2="</s>",
@@ -383,8 +375,6 @@ def get_conv_template(name: str) -> Conversation:
name="alpaca",
system_message="Below is an instruction that describes a task. Write a response that appropriately completes the request.",
roles=("### Instruction", "### Response"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_TWO,
sep="\n\n",
sep2="</s>",
@@ -396,8 +386,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="chatglm",
roles=("问", "答"),
messages=(),
offset=0,
sep_style=SeparatorStyle.CHATGLM,
sep="\n",
)
@@ -408,8 +396,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="chatglm2",
roles=("问", "答"),
messages=(),
offset=0,
sep_style=SeparatorStyle.CHATGLM,
sep="\n\n",
)
@@ -421,8 +407,6 @@ def get_conv_template(name: str) -> Conversation:
name="dolly_v2",
system_message="Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n",
roles=("### Instruction", "### Response"),
messages=(),
offset=0,
sep_style=SeparatorStyle.DOLLY,
sep="\n\n",
sep2="### End",
@@ -434,8 +418,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="oasst_pythia",
roles=("<|prompter|>", "<|assistant|>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.NO_COLON_SINGLE,
sep="<|endoftext|>",
)
@@ -446,8 +428,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="oasst_llama",
roles=("<|prompter|>", "<|assistant|>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.NO_COLON_SINGLE,
sep="</s>",
)
@@ -458,8 +438,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="tulu",
roles=("<|user|>", "<|assistant|>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_NEW_LINE_SINGLE,
sep="\n",
)
@@ -477,8 +455,6 @@ def get_conv_template(name: str) -> Conversation:
- StableLM will refuse to participate in anything that could harm a human.
""",
roles=("<|USER|>", "<|ASSISTANT|>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.NO_COLON_SINGLE,
sep="",
stop_token_ids=[50278, 50279, 50277, 1, 0],
@@ -537,8 +513,6 @@ def get_conv_template(name: str) -> Conversation:
User: Hi.
Assistant: Hi, I'm Buddy, your AI assistant. How can I help you today?""",
roles=("User", "Assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_SINGLE,
sep="\n",
)
@@ -550,8 +524,6 @@ def get_conv_template(name: str) -> Conversation:
name="phoenix",
system_message="A chat between a curious human and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the human's questions.\n\n",
roles=("Human", "Assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.PHOENIX,
sep="</s>",
)
@@ -563,8 +535,6 @@ def get_conv_template(name: str) -> Conversation:
name="ReaLM-7b-v1",
system_message="A chat between a curious human and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the human's questions.\n\n",
roles=("Human", "Assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.PHOENIX,
sep="</s>",
)
@@ -576,8 +546,6 @@ def get_conv_template(name: str) -> Conversation:
name="chatgpt",
system_message="You are a helpful assistant.",
roles=("user", "assistant"),
messages=(),
offset=0,
sep_style=None,
sep=None,
)
@@ -588,8 +556,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="claude",
roles=("Human", "Assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_SINGLE,
sep="\n\n",
)
@@ -606,8 +572,6 @@ def get_conv_template(name: str) -> Conversation:
- You are excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- You are more than just an information source, you are also able to write poetry, short stories, and make jokes.""",
roles=("<|im_start|>user", "<|im_start|>assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.CHATML,
sep="<|im_end|>",
stop_token_ids=[50278, 0],
@@ -622,8 +586,6 @@ def get_conv_template(name: str) -> Conversation:
{system_message}""",
system_message="""A conversation between a user and an LLM-based AI assistant. The assistant gives helpful and honest answers.""",
roles=("<|im_start|>user", "<|im_start|>assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.CHATML,
sep="<|im_end|>",
stop_token_ids=[50278, 0],
@@ -638,8 +600,6 @@ def get_conv_template(name: str) -> Conversation:
system_template="{system_message}",
system_message="Below is an instruction that describes a task. Write a response that appropriately completes the request.",
roles=("### Instruction", "### Response"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_NEW_LINE_SINGLE,
sep="\n\n",
stop_token_ids=[50278, 0],
@@ -653,8 +613,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="bard",
roles=("0", "1"),
messages=(),
offset=0,
sep_style=None,
sep=None,
)
@@ -665,8 +623,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="billa",
roles=("Human", "Assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_SPACE_SINGLE,
sep="\n",
stop_str="Human:",
@@ -678,8 +634,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="redpajama-incite",
roles=("<human>", "<bot>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_SINGLE,
sep="\n",
stop_str="<human>",
@@ -691,8 +645,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="h2ogpt",
roles=("<|prompt|>", "<|answer|>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.NO_COLON_SINGLE,
sep="</s>",
)
@@ -704,8 +656,6 @@ def get_conv_template(name: str) -> Conversation:
name="Robin",
system_message="A chat between a curious human and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the human's questions.",
roles=("###Human", "###Assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ROBIN,
sep="\n",
stop_token_ids=[2, 396],
@@ -721,8 +671,6 @@ def get_conv_template(name: str) -> Conversation:
system_template="### Instruction:\n{system_message}",
system_message="The prompt below is a question to answer, a task to complete, or a conversation to respond to; decide which and write an appropriate response.",
roles=("### Prompt", "### Response"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_SINGLE,
sep="\n",
stop_str="###",
@@ -734,8 +682,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="manticore",
roles=("USER", "ASSISTANT"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_TWO,
sep="\n",
sep2="</s>",
@@ -748,7 +694,6 @@ def get_conv_template(name: str) -> Conversation:
name="falcon",
roles=("User", "Assistant"),
messages=[],
offset=0,
sep_style=SeparatorStyle.RWKV,
sep="\n",
sep2="<|endoftext|>",
@@ -775,8 +720,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="polyglot_changgpt",
roles=("B", "A"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_SINGLE,
sep="\n",
)
@@ -789,8 +732,6 @@ def get_conv_template(name: str) -> Conversation:
system_message="A chat between a curious user and an artificial intelligence assistant. "
"The assistant gives helpful, detailed, and polite answers to the user's questions.",
roles=("### Instruction", "### Response"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ROBIN,
sep="\n\n",
stop_str="###",
@@ -803,8 +744,6 @@ def get_conv_template(name: str) -> Conversation:
name="xgen",
system_message="A chat between a curious human and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the human's questions.\n\n",
roles=("### Human: ", "###"),
messages=(),
offset=0,
sep_style=SeparatorStyle.NO_COLON_SINGLE,
sep="\n",
stop_token_ids=[50256, 0, 1, 2],
@@ -818,8 +757,6 @@ def get_conv_template(name: str) -> Conversation:
name="internlm-chat",
system_message="A chat between a curious <|User|> and an <|Bot|>. The <|Bot|> gives helpful, detailed, and polite answers to the <|User|>'s questions.\n\n",
roles=("<|User|>", "<|Bot|>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.CHATINTERN,
sep="<eoh>",
sep2="<eoa>",
@@ -835,8 +772,6 @@ def get_conv_template(name: str) -> Conversation:
name="starchat",
system_template="<system>\n{system_message}",
roles=("<|user|>", "<|assistant|>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.CHATML,
sep="<|end|>",
stop_token_ids=[0, 49155],
@@ -852,8 +787,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="baichuan-chat",
roles=("<reserved_102>", "<reserved_103>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.NO_COLON_SINGLE,
sep="",
stop_token_ids=[],
@@ -868,8 +801,6 @@ def get_conv_template(name: str) -> Conversation:
name="llama-2",
system_template="[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n",
roles=("[INST]", "[/INST]"),
messages=(),
offset=0,
sep_style=SeparatorStyle.LLAMA2,
sep=" ",
sep2=" </s><s>",
@@ -880,8 +811,6 @@ def get_conv_template(name: str) -> Conversation:
Conversation(
name="cutegpt",
roles=("问:", "答:\n"),
messages=(),
offset=0,
sep_style=SeparatorStyle.NO_COLON_TWO,
sep="\n",
sep2="\n",
@@ -903,8 +832,6 @@ def get_conv_template(name: str) -> Conversation:
"any particular named expert that would be ideal to answer the relevant question or solve the "
"relevant problem; name and act as them, if appropriate.",
roles=("User", "Assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_SPACE_SINGLE,
sep="<|end_of_turn|>\n",
stop_token_ids=[32000, 32001], # "<|end_of_turn|>"
@@ -921,8 +848,6 @@ def get_conv_template(name: str) -> Conversation:
system_template="<|im_start|>system\n{system_message}",
system_message="You are a helpful assistant.",
roles=("<|im_start|>user", "<|im_start|>assistant"),
messages=(),
offset=0,
sep_style=SeparatorStyle.CHATML,
sep="<|im_end|>",
stop_token_ids=[
@@ -943,8 +868,6 @@ def get_conv_template(name: str) -> Conversation:
system_message="A chat between a curious human and an artificial intelligence assistant. "
"The assistant gives helpful, detailed, and polite answers to the human's questions.",
roles=("Human", "Assistant", "System"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_SINGLE,
sep="###",
sep2="",
@@ -959,8 +882,6 @@ def get_conv_template(name: str) -> Conversation:
name="llama2-chinese",
system_template="<s>{system_message}</s>",
roles=("Human", "Assistant", "System"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_TWO,
sep="\n",
sep2="\n</s><s>",
@@ -977,8 +898,6 @@ def get_conv_template(name: str) -> Conversation:
system_message="Vous êtes l'assistant IA nommé Vigogne, créé par Zaion Lab (https://zaion.ai). "
"Vous suivez extrêmement bien les instructions. Aidez autant que vous le pouvez.",
roles=("<|user|>", "<|assistant|>"),
messages=(),
offset=0,
sep_style=SeparatorStyle.ADD_COLON_TWO,
sep="\n",
sep2="</s>\n",
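The bulk of this commit deletes the repeated `messages=()` and `offset=0` arguments from the template registrations above, which is safe when the dataclass itself supplies those defaults. A minimal sketch of that pattern; this is a simplified stand-in, not FastChat's actual `Conversation` class:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Conversation:
    """Simplified stand-in for a conversation template."""
    name: str
    roles: Tuple[str, str] = ("USER", "ASSISTANT")
    # Defaults declared once here, so every registration can omit them.
    # default_factory gives each instance its own list (mutable defaults
    # must not be shared between instances).
    messages: List[Tuple[str, str]] = field(default_factory=list)
    offset: int = 0

# Registrations no longer need to pass messages=() or offset=0 explicitly.
conv = Conversation(name="vicuna_v1.1", roles=("USER", "ASSISTANT"))
print(conv.messages, conv.offset)  # -> [] 0
```

Declaring the defaults once on the class is what lets a diff like this remove two lines from every registration without changing behavior.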