[BUG] post_process_function of painless script does not work while creating externally hosted models using http #1990
Comments
I think this issue could be fixed in 2.12. But before making that promise, I want to do some testing to confirm first.
Here are the detailed steps, @ylwu-amzn:
Step: Registered the model and deployed the model
Step: Ran predict and it worked successfully
Step: Created a RAG pipeline (see the sketch after this comment)
Step: Ran a simple search with RAG and it failed
So that is the reason I then used it.
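For readers unfamiliar with the setup, the "RAG pipeline" step above refers to an OpenSearch search pipeline with the retrieval_augmented_generation response processor. The request below is a hedged reconstruction for illustration only; the pipeline name, model id, context fields, and prompts are placeholders, not the values from this report.

```json
PUT /_search/pipeline/rag_pipeline
{
  "response_processors": [
    {
      "retrieval_augmented_generation": {
        "tag": "rag-demo",
        "description": "Illustrative RAG pipeline, not the reporter's exact config",
        "model_id": "<remote model id>",
        "context_field_list": ["text"],
        "system_prompt": "You are a helpful assistant.",
        "user_instructions": "Answer the question using the retrieved context."
      }
    }
  ]
}
```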
@ramda1234786 Thanks, can you share examples of the model's raw output and the expected output?
Hi @ylwu-amzn, with the predict API I am getting this response:
The expected result is that without using it, so I have used the
I have tried this below one, but it is not working.
@ylwu-amzn If you try to use Painless to convert a JSON payload to another JSON, Java somehow doesn't seem to recognize that output as JSON.
Got it, @ramda1234786, can you try this:
I think that is mostly caused by a wrong Painless script. Do you have an example?
@ylwu-amzn ^^
Is it really
The way I am testing is calling OpenAI and just rewriting the response.
Well, that didn't work either.
@austintlee, can you share the connector config, the model's raw output without the post-process, and the expected output you want with the post-process? I can help do some testing to check whether that's a bug or not.
@ramda1234786, have you tested this post process function? #1990 (comment)
Hi @ylwu-amzn, I tried the script mentioned here.
Something worked here; I am getting a response like this:
But the expectation is to get this. The Painless script is really a pain.
Hi @ylwu-amzn, even though I am not getting the expected output from the _predict API, my RAG search is working now and I am getting the response. So this is a great breakthrough, but I am not sure why this extra stuff appeared.
@ramda1234786 That's caused by an escaping issue; you can find the escape method in my PR #2075.
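For others hitting the same escaping problem: assuming the escape helper referenced above (PR #2075) is available in the connector's post-process script context, a post_process_function along the lines of the sketch below should emit valid JSON even when generated_text contains quotes or newlines. This is a reconstruction based on the thread, not the exact script the reporter used.

```json
"post_process_function": "\n def text = escape(params['response'][0].generated_text);\n return \"{\\\"completion\\\":\\\"\" + text + \"\\\"}\";\n"
```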
Hi @ylwu-amzn, I tested this and it worked perfectly. Thanks for this, and we can close this.
What is the bug?
The post_process_function of a Painless script does not work while creating externally hosted models using http.
This shows up in the predict API.
How can one reproduce the bug?
Take any text2text generation inference API endpoint model from Hugging Face.
Create the http request and add the below mentioned post_process_function, but it does not work as expected. The HF models come with a response like this:
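(The exact payload from the report is not shown here; the snippet below only illustrates the typical shape returned by the Hugging Face text2text-generation inference API, with a placeholder value.)

```json
[
  {
    "generated_text": "<model output text>"
  }
]
```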
So we need to translate this using the post_process_function:
"post_process_function": "\n return params['response'][0].generated_text; \n"
I have this, and I want to convert it to this below using the post process function. Below is the post_process_function:
"post_process_function": "\n def json = \"{\" +\n \"\\\"completion\\\":\\\"\" + params['response'][0].generated_text + \"\\\" }\";\n return json;\n "
But it never works with the _predict API.
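For reference, the failing test was a predict call along the lines of the sketch below; the model id and the input parameter name are placeholders, since the actual parameter mapping depends on the connector's request_body.

```json
POST /_plugins/_ml/models/<model_id>/_predict
{
  "parameters": {
    "inputs": "<prompt text>"
  }
}
```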