Pandas is required for the TAPAS tokenizer #446

Open
abdullawagih1 opened this issue Nov 18, 2023 · 0 comments


I get the following error when I use the table-question-answering task with "google/tapas-base-finetuned-wtq" deployed as a SageMaker endpoint:
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (400) from primary with message "{
"code": 400,
"type": "InternalServerException",
"message": "Pandas is required for the TAPAS tokenizer."
}
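For context, the message comes from the transformers library itself: the table-question-answering pipeline converts the incoming dict table into a pandas DataFrame before handing it to the TAPAS tokenizer, so the error suggests that pandas is not installed in the container serving the endpoint. For reference, here is a minimal local sketch of the same table and query going through the pipeline (it only runs in an environment where pandas is installed):

from transformers import pipeline

# Requires pandas: the pipeline turns the dict table into a pandas DataFrame
# before passing it to the TAPAS tokenizer.
tqa = pipeline("table-question-answering", model="google/tapas-base-finetuned-wtq")

table = {
    "Repository": ["Transformers", "Datasets", "Tokenizers"],
    "Stars": ["36542", "4512", "3934"],
    "Contributors": ["651", "77", "34"],
    "Programming language": ["Python", "Python", "Rust, Python and NodeJS"]
}
print(tqa(table=table, query="How many stars does the transformers repository have?"))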

The error appears when I pass the data to the predictor endpoint as a table, like this:
predictor.predict({
    "inputs": {
        "query": "How many stars does the transformers repository have?",
        "table": {
            "Repository": ["Transformers", "Datasets", "Tokenizers"],
            "Stars": ["36542", "4512", "3934"],
            "Contributors": ["651", "77", "34"],
            "Programming language": [
                "Python",
                "Python",
                "Rust, Python and NodeJS"
            ]
        }
    }
})

How can I solve this issue?
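One possible direction, since the error points to pandas simply being missing from the serving container: if the model is packaged as a model.tar.gz, pandas can be listed in a code/requirements.txt inside the archive, which the Hugging Face Inference Toolkit should install when the endpoint starts. A minimal sketch, assuming the hypothetical S3 path, IAM role, and container versions below:

# model.tar.gz layout (hypothetical):
#   config.json, pytorch_model.bin, tokenizer files, ...
#   code/requirements.txt   <- containing a single line: pandas
from sagemaker.huggingface import HuggingFaceModel

huggingface_model = HuggingFaceModel(
    model_data="s3://<your-bucket>/tapas/model.tar.gz",  # hypothetical S3 path
    role="<your-sagemaker-execution-role>",              # hypothetical IAM role
    transformers_version="4.26",                         # illustrative versions
    pytorch_version="1.13",
    py_version="py39",
)

predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
)

If the model is deployed straight from the Hub via HF_MODEL_ID instead, the same requirements.txt can usually be supplied through the SDK's source_dir/entry_point arguments, since the SDK repacks that directory into code/ inside the archive.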
