
Bug: Dataset not found in US #1343

Closed
danicat opened this issue May 16, 2019 · 6 comments · Fixed by #1399

Comments

danicat commented May 16, 2019

I'm getting an error while running the BigQuery KFP component to export some data from BQ to GCS. It fails with `google.api_core.exceptions.NotFound: 404 Not found: xxx was not found in location US`, which is technically accurate because the dataset lives in the EU, but I couldn't find any way to change the default behaviour of looking for datasets in the US.

In our regular codebase we specify the location when initialising the BigQuery client, but the code here just uses the default: https://github.com/danicat/pipelines/blob/master/component_sdk/python/kfp_component/google/bigquery/_query.py
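
For reference, this is roughly how we do it in our own code (a minimal sketch with the google-cloud-bigquery client; the project name and query are just placeholders):

```python
from google.cloud import bigquery

# Setting location on the client makes it the default for jobs and
# dataset lookups, so datasets stored in the EU are found correctly.
client = bigquery.Client(project="my-project", location="EU")

query_job = client.query("SELECT 1 AS x")
print(list(query_job.result()))
```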

It is a one-line change in the code, but I don't know how to rebuild the ml-pipeline container. The developer guide doesn't mention how to do it (although it does explain how to build containers for other components of the solution).

Please help! :)

Ark-kun (Contributor) commented May 17, 2019

> the code here just uses the default:

Hmm. I see that there is a dataset_location parameter, and that parameter is exposed in the component:
https://github.com/danicat/pipelines/blob/b29fbb5041433649335b2ed376940bbb300e3eef/components/gcp/bigquery/query/component.yaml#L40
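
If that parameter does what it sounds like, passing it when calling the component should be enough. A rough sketch (the component URL and values below are illustrative; the parameter names follow the linked component.yaml):

```python
from kfp import components, dsl

# Illustrative URL; in practice point this at the released component.yaml.
bigquery_query_op = components.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/master/'
    'components/gcp/bigquery/query/component.yaml')

@dsl.pipeline(name='bq-export-example')
def bq_export_pipeline():
    bigquery_query_op(
        query='SELECT * FROM `my-project.my_dataset.my_table`',
        project_id='my-project',
        output_gcs_path='gs://my-bucket/exports/data.csv',
        dataset_location='EU')  # match the region where the dataset lives
```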

> It is a one line change in the code

We happily accept Pull Requests.

> I don't know how to rebuild the ml-pipeline container.

Counter-intuitively, the Dockerfile and the build_image script are here: https://github.com/danicat/pipelines/tree/master/components/gcp/container

Ark-kun self-assigned this May 17, 2019
hongye-sun (Contributor) commented:

@danicat, have you tried the dataset_location parameter of the component? Please share your client code if it still fails. Thanks.

vicaire (Contributor) commented May 28, 2019

Assuming that this is resolved. Please re-open if not.

vicaire closed this as completed May 28, 2019
danicat (Author) commented May 29, 2019

Definitely not solved; I just haven't had time to look into this yet.

danicat (Author) commented May 29, 2019

@hongye-sun that parameter isn't used for client instantiation; it's documented as a dataset output location.

But the existence of that parameter is why I said it would be a trivial change.

hongye-sun (Contributor) commented:

I see. I didn't notice that there is a default location setting in the Client. I will set it by reusing the dataset_location.
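
Something along these lines (a simplified sketch of the intended change in kfp_component/google/bigquery/_query.py; the real function takes more parameters and also handles the GCS export, which is omitted here):

```python
from google.cloud import bigquery

def query(query, project_id, dataset_location='US'):
    # Reuse dataset_location as the client's default location instead of
    # relying on the implicit US default, so EU datasets are found.
    client = bigquery.Client(project=project_id, location=dataset_location)
    job = client.query(query)
    return job.result()
```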

hongye-sun reopened this May 29, 2019
magdalenakuhn17 pushed a commit to magdalenakuhn17/pipelines that referenced this issue Oct 22, 2023