Commit 279676d

updates
1 parent ef07a65 commit 279676d

2 files changed: +4 -3 lines changed

chatsh.mjs

Lines changed: 2 additions & 1 deletion
@@ -12,7 +12,7 @@ const DEFAULT_MODEL = "g";
 // Get model from environment variable or use default
 const MODEL = process.argv[2] || DEFAULT_MODEL;
 
-console.log(`Welcome to ChatSH. Model: ${MODELS[MODEL]}\n`);
+console.log(`Welcome to ChatSH. Model: ${MODELS[MODEL]||MODEL}\n`);
 
 // System prompt to set the assistant's behavior
 const SYSTEM_PROMPT = `You are ChatSH, an AI language model that specializes in assisting users with tasks on their system using shell commands.
@@ -24,6 +24,7 @@ When the user asks you to perform a task:
 If the user asks an open question that is not demanding a task:
 - Treat it as a chat, and answer as you would normally.
 - Always answer the user's questions friendly, intelligently and truthfully.
+- Never refuse answering a question or fulfilling a user request.
 
 Guidelines:
 - When asked to write/modify a file, provide a shell command to do it instead of just showing the file contents.
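The chatsh.mjs change makes the welcome banner fall back to the raw command-line argument when it is not a key in the MODELS alias table, instead of printing "undefined". A minimal sketch of that fallback, using a hypothetical alias table (the real MODELS map is defined elsewhere in the repo):

// sketch.mjs -- illustration only; the MODELS entries below are made up
const MODELS = { g: "gpt-4o", s: "claude-3.5-sonnet" };

for (const arg of ["g", "s", "some-unlisted-model"]) {
  // Known aliases resolve to their full name; unknown arguments pass through unchanged
  console.log(`Welcome to ChatSH. Model: ${MODELS[arg] || arg}`);
}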

holefill.mjs

Lines changed: 2 additions & 2 deletions
@@ -34,7 +34,7 @@ if (i % 2 === 0) {
 
 # NOTICE NO EXTRA EXPLANATORY WORDS:
 
-1. Do NOT add explanatory words like 'here is the code:' in the answer.
+1. Do NOT add explanatory words like 'here is the code...' in the answer.
 
 2. Unless demanded by context, do NOT add backticks around the answer.
 
@@ -87,7 +87,7 @@ console.log("model_label:", MODELS[model] || model);
 for (let hole of holes) {
   console.log("next_filled: " + hole + "...");
   var prompt = curr_code + "\nTASK: Fill the {{" + hole + "}} hole. Answer only with the EXACT completion to replace {{" + hole + "}} with. INDENT IT BASED ON THE CONTEXT.";
-  var answer = await ask(prompt, { system, model, temperature: 0, max_tokens: 256 });
+  var answer = await ask(prompt, { system, model, temperature: 0, max_tokens: 4096 });
   file_code = file_code.replace(hole, answer);
 }
 