
server : (embeddings) using same format for "input" and "content" #10872

Merged · 4 commits · Dec 18, 2024

Conversation

@ngxson (Collaborator) commented Dec 17, 2024

Supersedes #10866

"input" and "content" now use the same format. Test cases were also added.

@ngxson ngxson requested a review from ggerganov December 17, 2024 20:37
} else {
res_error(res, format_error_response("\"input\" or \"content\" must be provided", ERROR_TYPE_INVALID_REQUEST));
return;
}

std::vector<llama_tokens> tokenized_prompts = tokenize_input_prompts(ctx_server.ctx, prompt, true, true);
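The C++ snippet above is the fall-through error path when neither field is present. As a minimal sketch of the dispatch this PR unifies (field names are from the PR; the function and its placement are hypothetical, the real logic lives in the C++ server handler):

```python
def extract_prompt(body: dict):
    """Accept either OpenAI-style "input" or llama.cpp-style "content".

    After this PR both fields are parsed with the same rules, so either
    one can carry the prompt. Sketch only, not the server's actual code.
    """
    if "input" in body:
        return body["input"]
    if "content" in body:
        return body["content"]
    raise ValueError('"input" or "content" must be provided')
```

For example, `extract_prompt({"input": "hello"})` and `extract_prompt({"content": "hello"})` yield the same prompt, while an empty body is rejected with the error message shown in the snippet above.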
@ngxson (Collaborator, Author) commented:
@ggerganov I changed add_special to true here, because I remember that embedding models need the BOS token. Not sure why it was removed at some point, maybe due to human error (my error 👀?) during a recent refactoring.
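For illustration, `add_special = true` roughly corresponds to prepending the BOS id during tokenization. A toy sketch of that behavior (the tokenizer and the BOS id value are made up, this is not llama.cpp's API):

```python
BOS_ID = 1  # hypothetical BOS token id, for illustration only

def tokenize(text: str, add_special: bool) -> list[int]:
    # Toy tokenizer: one "token" per whitespace-separated word,
    # with made-up ids >= 2 so they never collide with BOS_ID.
    ids = [len(word) + 2 for word in text.split()]
    # add_special=True prepends BOS, mirroring what embedding models expect.
    return ([BOS_ID] + ids) if add_special else ids
```

With `add_special=False` the BOS token is silently missing from the sequence, which is exactly the kind of regression the change above guards against.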

@ggerganov ggerganov mentioned this pull request Dec 17, 2024
@github-actions github-actions bot added examples python python script changes server labels Dec 17, 2024
@ggerganov ggerganov merged commit 4682887 into ggerganov:master Dec 18, 2024
50 checks passed
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Dec 20, 2024
…erganov#10872)

* server : (embeddings) using same format for "input" and "content"

* fix test case

* handle empty input case

* fix test