Bug: Dataset not found in US #1343
I'm getting an error while running the BigQuery kfp component to export some data from BQ to GCS. It says:

google.api_core.exceptions.NotFound: 404 Not found: xxx was not found in location US

which is correct, because the dataset is in EU, but I couldn't find any way to change the default behaviour of looking for datasets in the US. In our regular codebase we specify the location when initialising the BigQuery client, but the code here just uses the default: https://github.com/danicat/pipelines/blob/master/component_sdk/python/kfp_component/google/bigquery/_query.py

It is a one-line change in the code, but I don't know how to rebuild the ml-pipeline container. The developer guide doesn't mention how to do it (though it does explain how to build containers for other components of the solution).

Please help! :)
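For illustration, a minimal sketch of the client-side pattern described above, using the google-cloud-bigquery library (the project, dataset, and table names are placeholders):

```python
# Minimal sketch: pinning the BigQuery client to the dataset's region.
# Without `location`, jobs are looked up in the default (US) region and
# fail with the 404 shown above when the dataset lives in the EU.
from google.cloud import bigquery

client = bigquery.Client(project='my-project', location='EU')
job = client.query('SELECT COUNT(*) FROM `my-project.my_dataset.my_table`')
print(list(job.result()))
```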
Comments

Hmm. I see that there is the […] We happily accept Pull Requests. Counter-intuitively, the Dockerfile and the build_image script are here: https://github.com/danicat/pipelines/tree/master/components/gcp/container
@danicat, have you tried the dataset_location parameter of the component? Please share your client code if it still fails. Thanks.
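For reference, a sketch of invoking the component with dataset_location set. The component URL and the remaining argument names are assumptions based on the repository layout, not a confirmed interface:

```python
# Sketch: load the GCP BigQuery query component and pass dataset_location.
from kfp import components, dsl

# Assumed location of the component spec in the repo.
bigquery_query_op = components.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/master/'
    'components/gcp/bigquery/query/component.yaml')

@dsl.pipeline(name='bq-export-eu')
def bq_export_pipeline():
    bigquery_query_op(
        query='SELECT * FROM `my-project.my_dataset.my_table`',
        project_id='my-project',
        output_gcs_path='gs://my-bucket/exports/data.csv',
        dataset_location='EU')  # the parameter suggested above
```

Compiling and running this pipeline should then execute the query job against the EU dataset, assuming the parameter is wired through to the client as discussed below.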
Assuming that this is resolved. Please re-open if not.
Definitely not solved, just didn't have time to look into this.
@hongye-sun that parameter isn't used for client instantiation. It's documented as a dataset output location. But the existence of that parameter is why I say it would be a trivial change.
I see. I didn't notice that there is a default location setting in the Client. I will set it by reusing the dataset_location. |
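If it helps, a rough sketch of what reusing dataset_location for the client default might look like inside _query.py. The function signature here is hypothetical, based only on the parameters mentioned in this thread; the real component takes more arguments:

```python
from google.cloud import bigquery

def query(query, project_id, dataset_location='US', output_gcs_path=None):
    # Hypothetical signature for illustration only.
    # Reusing dataset_location when constructing the client makes every
    # job it issues default to the dataset's region rather than US.
    client = bigquery.Client(project=project_id, location=dataset_location)
    return client.query(query).result()
```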