Commit 8766ac6

test: migrate e2e presubmit tests to bigframes-load-testing project (#160)
BEGIN_COMMIT_OVERRIDE
fix: migrate e2e tests to bigframes-load-testing project
END_COMMIT_OVERRIDE

Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

- [ ] Make sure to open an issue as a [bug/issue](https://togithub.com/googleapis/python-bigquery-dataframes/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)

Fixes internal issue 307809767 🦕
1 parent b02fc2c commit 8766ac6

23 files changed (+1130 -640 lines)

.kokoro/continuous/e2e.cfg

Lines changed: 10 additions & 0 deletions
@@ -5,3 +5,13 @@ env_vars: {
   key: "NOX_SESSION"
   value: "unit_prerelease system_prerelease system_noextras e2e notebook samples"
 }
+
+env_vars: {
+  key: "GOOGLE_CLOUD_PROJECT"
+  value: "bigframes-load-testing"
+}
+
+env_vars: {
+  key: "BIGFRAMES_TEST_MODEL_VERTEX_ENDPOINT"
+  value: "https://us-central1-aiplatform.googleapis.com/v1/projects/272725758477/locations/us-central1/endpoints/590545496255234048"
+}
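The two new `env_vars` blocks above follow Kokoro's simple key/value textproto shape. As an illustration (not part of the build tooling), such blocks could be read with a minimal sketch like this:

```python
import re


def parse_env_vars(cfg_text: str) -> dict:
    """Extract key/value pairs from Kokoro-style ``env_vars: { ... }`` blocks.

    A minimal sketch for the simple quoted-string form shown above; it is
    not a full textproto parser (no escapes, nesting, or repeated fields).
    """
    pairs = {}
    for block in re.findall(r"env_vars:\s*\{([^}]*)\}", cfg_text):
        key = re.search(r'key:\s*"([^"]*)"', block)
        value = re.search(r'value:\s*"([^"]*)"', block)
        if key and value:
            pairs[key.group(1)] = value.group(1)
    return pairs


cfg = '''
env_vars: {
  key: "GOOGLE_CLOUD_PROJECT"
  value: "bigframes-load-testing"
}
'''
print(parse_env_vars(cfg))  # {'GOOGLE_CLOUD_PROJECT': 'bigframes-load-testing'}
```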

.kokoro/presubmit/e2e.cfg

Lines changed: 10 additions & 0 deletions
@@ -5,3 +5,13 @@ env_vars: {
   key: "NOX_SESSION"
   value: "unit_prerelease system_prerelease system_noextras e2e notebook samples"
 }
+
+env_vars: {
+  key: "GOOGLE_CLOUD_PROJECT"
+  value: "bigframes-load-testing"
+}
+
+env_vars: {
+  key: "BIGFRAMES_TEST_MODEL_VERTEX_ENDPOINT"
+  value: "https://us-central1-aiplatform.googleapis.com/v1/projects/272725758477/locations/us-central1/endpoints/590545496255234048"
+}

CONTRIBUTING.rst

Lines changed: 38 additions & 1 deletion
@@ -155,7 +155,44 @@ Running System Tests
    auth settings and change some configuration in your project to
    run all the tests.
 
-- System tests will be run against an actual project. You should use local credentials from gcloud when possible. See `Best practices for application authentication <https://cloud.google.com/docs/authentication/best-practices-applications#local_development_and_testing_with_the>`__. Some tests require a service account. For those tests see `Authenticating as a service account <https://cloud.google.com/docs/authentication/production>`__.
+- System tests will be run against an actual project. A project can be set in
+  the environment variable ``$GOOGLE_CLOUD_PROJECT``. If it is not set, the
+  project configured in the `Google Cloud CLI <https://cloud.google.com/sdk/gcloud/reference/config/get>`__
+  takes effect; it can be inspected via ``gcloud config get project``
+  or set via ``gcloud config set project <project-name>``. The following roles
+  carry the permissions to run the system tests in the project:
+
+  - `BigQuery User <https://cloud.google.com/bigquery/docs/access-control#bigquery.user>`__
+    to create test datasets and run BigQuery jobs in the project.
+
+  - `BigQuery Connection Admin <https://cloud.google.com/bigquery/docs/access-control#bigquery.connectionAdmin>`__
+    to use BigQuery connections in the project.
+
+  - `BigQuery Data Editor <https://cloud.google.com/bigquery/docs/access-control#bigquery.dataEditor>`__
+    to create BigQuery remote functions in the project.
+
+  - `Browser <https://cloud.google.com/resource-manager/docs/access-control-proj#browser>`__
+    to get the current IAM policy for the service accounts of the BigQuery connections in the project.
+
+  - `Cloud Functions Developer <https://cloud.google.com/functions/docs/reference/iam/roles#cloudfunctions.developer>`__
+    to create cloud functions to support BigQuery DataFrames remote functions.
+
+  - `Service Account User <https://cloud.google.com/iam/docs/service-account-permissions#user-role>`__
+    to use the project's service accounts.
+
+  - `Vertex AI User <https://cloud.google.com/vertex-ai/docs/general/access-control#aiplatform.user>`__
+    to use BigQuery DataFrames' ML integration with Vertex AI.
+
+- You can run the script ``scripts/setup-project-for-testing.sh <project-id> [<principal>]``
+  to set up a project for running system tests and optionally grant the necessary
+  IAM roles to a principal (user/group/service-account). You need the following
+  IAM permissions to run the setup script successfully:
+
+  - ``serviceusage.services.enable``
+  - ``bigquery.connections.create``
+  - ``resourcemanager.projects.setIamPolicy``
+
+- You should use local credentials from gcloud when possible. See `Best practices for application authentication <https://cloud.google.com/docs/authentication/best-practices-applications#local_development_and_testing_with_the>`__. Some tests require a service account. For those tests see `Authenticating as a service account <https://cloud.google.com/docs/authentication/production>`__.
 
 *************
 Test Coverage
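The project-resolution order documented above (the ``$GOOGLE_CLOUD_PROJECT`` environment variable first, then the gcloud CLI's configured project) can be sketched as follows; this helper is illustrative and not part of bigframes:

```python
import os
import subprocess
from typing import Optional


def resolve_test_project() -> Optional[str]:
    """Pick the project for system tests, mirroring the docs above:
    ``$GOOGLE_CLOUD_PROJECT`` wins; otherwise fall back to the project
    configured in the Google Cloud CLI (``gcloud config get project``).
    """
    project = os.environ.get("GOOGLE_CLOUD_PROJECT")
    if project:
        return project
    try:
        result = subprocess.run(
            ["gcloud", "config", "get", "project"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip() or None
    except (OSError, subprocess.CalledProcessError):
        # gcloud not installed, or no project configured
        return None
```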

bigframes/remote_function.py

Lines changed: 17 additions & 7 deletions
@@ -411,13 +411,23 @@ def create_cloud_function(self, def_, cf_name, package_requirements=None):
         create_function_request.function = function
 
         # Create the cloud function and wait for it to be ready to use
-        operation = self._cloud_functions_client.create_function(
-            request=create_function_request
-        )
-        operation.result()
-
-        # Cleanup
-        os.remove(archive_path)
+        try:
+            operation = self._cloud_functions_client.create_function(
+                request=create_function_request
+            )
+            operation.result()
+
+            # Cleanup
+            os.remove(archive_path)
+        except google.api_core.exceptions.AlreadyExists:
+            # If a cloud function with the same name already exists, let's
+            # update it
+            update_function_request = functions_v2.UpdateFunctionRequest()
+            update_function_request.function = function
+            operation = self._cloud_functions_client.update_function(
+                request=update_function_request
+            )
+            operation.result()
 
         # Fetch the endpoint of the just created function
         endpoint = self.get_cloud_function_endpoint(cf_name)
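The try/except in this diff is a create-or-update idempotency pattern: attempt creation, and on a name collision fall back to updating the existing resource, so a rerun of the tests does not fail. A generic sketch with a stubbed client (the real code catches `google.api_core.exceptions.AlreadyExists` from the `functions_v2` client; the stand-ins below are illustrative):

```python
class AlreadyExists(Exception):
    """Stand-in for google.api_core.exceptions.AlreadyExists."""


def create_or_update_function(client, request):
    """Create the resource; if one with the same name exists, update it.

    Mirrors the control flow in the diff above, with ``client`` reduced
    to any object exposing create_function/update_function.
    """
    try:
        return client.create_function(request)
    except AlreadyExists:
        return client.update_function(request)


class FakeClient:
    """Illustrative stub: raises AlreadyExists when the function exists."""

    def __init__(self, exists: bool):
        self.exists = exists

    def create_function(self, request):
        if self.exists:
            raise AlreadyExists(request)
        return "created"

    def update_function(self, request):
        return "updated"


print(create_or_update_function(FakeClient(exists=True), "fn"))   # updated
print(create_or_update_function(FakeClient(exists=False), "fn"))  # created
```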

notebooks/generative_ai/large_language_models.ipynb

Lines changed: 3 additions & 3 deletions
@@ -2,7 +2,7 @@
  "cells": [
   {
    "cell_type": "code",
-   "execution_count": 1,
+   "execution_count": null,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -22,12 +22,12 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 2,
+   "execution_count": null,
    "metadata": {},
    "outputs": [],
    "source": [
     "session = bigframes.pandas.get_global_session()\n",
-    "connection = \"bigframes-dev.us.bigframes-ml\""
+    "connection = f\"{session.bqclient.project}.us.bigframes-default-connection\""
    ]
   },
   {
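The notebook change replaces a hard-coded project with one derived from the session, so the connection id works in any test project. A small sketch of the f-string construction (the defaults come from the diff above; the helper itself is illustrative):

```python
def default_connection_id(
    project: str,
    location: str = "us",
    name: str = "bigframes-default-connection",
) -> str:
    """Build the fully qualified BigQuery connection id
    ``<project>.<location>.<name>``, as the updated notebook cell does
    with ``session.bqclient.project``.
    """
    return f"{project}.{location}.{name}"


print(default_connection_id("bigframes-load-testing"))
# bigframes-load-testing.us.bigframes-default-connection
```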
