chore: sync v3 with master branch (#880)

* chore: protect v3.x.x branch (#816)

* chore: protect v3.x.x branch

In preparation for breaking changes.

* force pattern to be a string

* simplify branch name

* fix: no longer raise a warning in `to_dataframe` if `max_results` set (#815)

That warning should only be raised when a BigQuery Storage client is
explicitly passed to RowIterator methods while max_results is also set.
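
A minimal sketch of the fixed behavior (client setup and table ID are
hypothetical):

```python
from google.cloud import bigquery

client = bigquery.Client()
rows = client.list_rows("my-project.my_dataset.my_table", max_results=100)

# No warning: max_results alone simply falls back to the REST API.
df = rows.to_dataframe()

# Still warns: an explicitly passed BQ Storage client cannot be used
# when max_results is set, so the caller is told about it.
# df = rows.to_dataframe(bqstorage_client=bqstorage_client)
```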

* feat: Update proto definitions for bigquery/v2 to support new proto fields for BQML. (#817)

PiperOrigin-RevId: 387137741

Source-Link: googleapis/googleapis@8962c92

Source-Link: googleapis/googleapis-gen@102f1b4

* chore: release 2.23.0 (#819)

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>

* chore(deps): update dependency google-cloud-bigquery to v2.23.0 (#820)

* fix: `insert_rows()` accepts float column values as strings again (#824)
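
A sketch of the restored behavior (table ID and schema are hypothetical;
`errors` is empty on success):

```python
from google.cloud import bigquery

client = bigquery.Client()
table = client.get_table("my-project.my_dataset.my_table")  # fetches schema

# Float columns accept string values again, e.g. from CSV-ish sources.
errors = client.insert_rows(table, [{"float64_col": "1.25"}])
assert not errors
```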

* chore: release 2.23.1 (#825)

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>

* chore: add second protection rule for v3 branch (#828)

* chore(deps): update dependency google-cloud-bigquery to v2.23.1 (#827)

* test: retry getting rows after streaming them in `test_insert_rows_from_dataframe` (#832)

* chore(deps): update dependency pyarrow to v5 (#834)

* chore(deps): update dependency google-cloud-bigquery-storage to v2.6.2 (#795)

* deps: expand pyarrow pins to support 5.x releases (#833)

* chore: release 2.23.2 (#835)

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>

* chore(deps): update dependency google-auth-oauthlib to v0.4.5 (#839)

* chore(deps): update dependency google-cloud-bigquery to v2.23.2 (#838)

* chore(deps): update dependency google-cloud-testutils to v1 (#845)

* chore: require CODEOWNER review and up to date branches (#846)

These two lines bring the rules on this repo in line with the defaults:

https://github.com/googleapis/repo-automation-bots/blob/63c858e539e1f4d9bb8ea66e12f9c0a0de5fef55/packages/sync-repo-settings/src/required-checks.json#L40-L50

* chore: add api-bigquery as a samples owner (#852)

* fix: increase default retry deadline to 10 minutes (#859)

The backend API has a timeout of 4 minutes, so the previous default of
2 minutes left no room for retries in some cases.
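
A sketch of overriding the new default via `google-api-core`'s retry API
(the dataset ID is hypothetical):

```python
from google.cloud import bigquery
from google.cloud.bigquery.retry import DEFAULT_RETRY

client = bigquery.Client()

# DEFAULT_RETRY now carries a 600-second deadline; individual calls may
# still shorten or extend it.
short_retry = DEFAULT_RETRY.with_deadline(120)
dataset = client.get_dataset("my_dataset", retry=short_retry)
```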

Fixes #853  🦕

* process: add yoshi-python to samples CODEOWNERS (#858)

Closes #857.

* chore: release 2.23.3 (#860)

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Tim Swast <swast@google.com>

* chore(deps): update dependency google-cloud-bigquery to v2.23.3 (#866)

This PR contains the following updates:

| Package | Change |
|---|---|
| [google-cloud-bigquery](https://togithub.com/googleapis/python-bigquery) | `==2.23.2` -> `==2.23.3` |

[Compare Source](https://togithub.com/googleapis/python-bigquery/compare/v2.23.2...v2.23.3)

* feat: add support for transaction statistics (#849)

* feat: add support for transaction statistics

* Hoist transaction_info into base job class

* Add versionadded directive to new property and class

* Include new class in docs reference
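
A sketch of reading the new statistics after a multi-statement transaction
(dataset and table are hypothetical):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Script jobs that run a transaction now expose TransactionInfo.
job = client.query(
    """
    BEGIN TRANSACTION;
    DELETE FROM my_dataset.inventory WHERE quantity = 0;
    COMMIT TRANSACTION;
    """
)
job.result()

if job.transaction_info is not None:
    print("Transaction ID:", job.transaction_info.transaction_id)
```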

* chore(deps): update dependency google-cloud-bigquery-storage to v2.6.3 (#863)

This PR contains the following updates:

| Package | Change |
|---|---|
| [google-cloud-bigquery-storage](https://togithub.com/googleapis/python-bigquery-storage) | `==2.6.2` -> `==2.6.3` |

[Compare Source](https://togithub.com/googleapis/python-bigquery-storage/compare/v2.6.2...v2.6.3)

* chore: fix INSTALL_LIBRARY_FROM_SOURCE in noxfile.py (#869)

Source-Link: googleapis/synthtool@6252f2c
Post-Processor: gcr.io/repo-automation-bots/owlbot-python:latest@sha256:50e35228649c47b6ca82aa0be3ff9eb2afce51c82b66c4a03fe4afeb5ff6c0fc

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>

* feat: make the same `Table*` instances equal to each other (#867)

* feat: make the same Table instances equal to each other

* Table equality should ignore metadata differences

* Compare instances through tableReference property

* Make Table instances hashable

* Make Table* classes interchangeable

If these classes reference the same table, they are now considered equal.
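
A sketch of the new semantics (project, dataset, and table IDs are
hypothetical):

```python
from google.cloud import bigquery

table_a = bigquery.Table("my-project.my_dataset.my_table")
table_b = bigquery.Table("my-project.my_dataset.my_table")
table_b.description = "Metadata differences are ignored."

assert table_a == table_b              # same tableReference
assert hash(table_a) == hash(table_b)  # hashable, usable as dict keys

# TableReference instances pointing at the same table compare equal too.
ref = bigquery.TableReference.from_string("my-project.my_dataset.my_table")
assert table_a == ref
```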

* feat: support `ScalarQueryParameterType` for `type_` argument in `ScalarQueryParameter` constructor (#850)

Follow-up to https://github.com/googleapis/python-bigquery/pull/840/files#r679880582
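
A sketch of the added flexibility in the constructor's `type_` argument:

```python
import datetime

from google.cloud import bigquery

# type_ now accepts a ScalarQueryParameterType instance as well as a
# type name string such as "TIMESTAMP".
param = bigquery.ScalarQueryParameter(
    "ts_value",
    bigquery.ScalarQueryParameterType("TIMESTAMP"),
    datetime.datetime(2016, 12, 7, 8, 0, tzinfo=datetime.timezone.utc),
)
```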

* feat: retry failed query jobs in `result()` (#837)

Fixes #539  🦕

Previously, we only retried failed API requests. Now, we retry failed jobs (according to the predicate of the `Retry` object passed to `job.result()`).
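
A sketch of the new behavior, assuming the `job_retry` keyword this change
introduces (the table ID is hypothetical):

```python
from google.cloud import bigquery
from google.cloud.bigquery.retry import DEFAULT_JOB_RETRY

client = bigquery.Client()

# Jobs whose *results* fail with a retryable reason (e.g. rate limits)
# are re-issued, not just the underlying API requests.
job = client.query(
    "SELECT COUNT(*) FROM `my_dataset.my_table`",
    job_retry=DEFAULT_JOB_RETRY.with_deadline(600),
)
rows = job.result()
```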

* fix: make unicode characters work well in `load_table_from_json` (#865)

Co-authored-by: Tim Swast <swast@google.com>
Co-authored-by: Tres Seaver <tseaver@palladion.com>
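
A sketch of the fixed round-trip (the table ID is hypothetical):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Non-ASCII characters now arrive intact instead of being mangled.
rows = [{"string_col": "こんにちは"}, {"string_col": "Grüß Gott"}]
job = client.load_table_from_json(rows, "my-project.my_dataset.my_table")
job.result()
```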

* chore: release 2.24.0 (#868)

---
## [2.24.0](https://www.github.com/googleapis/python-bigquery/compare/v2.23.3...v2.24.0) (2021-08-11)


### Features

* add support for transaction statistics ([#849](https://www.github.com/googleapis/python-bigquery/issues/849)) ([7f7b1a8](https://www.github.com/googleapis/python-bigquery/commit/7f7b1a808d50558772a0deb534ca654da65d629e))
* make the same `Table*` instances equal to each other ([#867](https://www.github.com/googleapis/python-bigquery/issues/867)) ([c1a3d44](https://www.github.com/googleapis/python-bigquery/commit/c1a3d4435739a21d25aa154145e36d3a7c42eeb6))
* retry failed query jobs in `result()` ([#837](https://www.github.com/googleapis/python-bigquery/issues/837)) ([519d99c](https://www.github.com/googleapis/python-bigquery/commit/519d99c20e7d1101f76981f3de036fdf3c7a4ecc))
* support `ScalarQueryParameterType` for `type_` argument in `ScalarQueryParameter` constructor ([#850](https://www.github.com/googleapis/python-bigquery/issues/850)) ([93d15e2](https://www.github.com/googleapis/python-bigquery/commit/93d15e2e5405c2cc6d158c4e5737361344193dbc))


### Bug Fixes

* make unicode characters work well in load_table_from_json ([#865](https://www.github.com/googleapis/python-bigquery/issues/865)) ([ad9c802](https://www.github.com/googleapis/python-bigquery/commit/ad9c8026f0e667f13dd754279f9dc40d06f4fa78))
---

* chore(deps): update dependency google-cloud-bigquery to v2.24.0 (#873)

* test: refactor `list_rows` tests and add test for scalars (#829)

* test: refactor `list_rows` tests and add test for scalars

* fix JSON formatting

* add TODO for INTERVAL Arrow support

* format tests

* chore: drop mention of Python 2.7 from templates (#877)

Source-Link: googleapis/synthtool@facee4c
Post-Processor: gcr.io/repo-automation-bots/owlbot-python:latest@sha256:9743664022bd63a8084be67f144898314c7ca12f0a03e422ac17c733c129d803

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>

* fix: remove pytz dependency and require pyarrow>=3.0.0 (#875)

* fix: remove pytz dependency

* 🦉 Updates from OwlBot

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* fix(deps): require pyarrow>=3.0.0

* remove version check for pyarrow

* require pyarrow 3.0 in pandas extra

* remove _BIGNUMERIC_SUPPORT references from tests

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Dina Graves Portman <dinagraves@google.com>
Co-authored-by: Tim Swast <swast@google.com>

* Update google/cloud/bigquery/table.py

* tests: avoid INTERVAL columns in pandas tests

Co-authored-by: Peter Lamut <plamut@users.noreply.github.com>
Co-authored-by: gcf-owl-bot[bot] <78513119+gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: WhiteSource Renovate <bot@renovateapp.com>
Co-authored-by: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com>
Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Jim Fulton <jim@jimfulton.info>
Co-authored-by: Grimmer <grimmer0125@gmail.com>
Co-authored-by: Tres Seaver <tseaver@palladion.com>
Co-authored-by: Dina Graves Portman <dinagraves@google.com>
11 people authored Aug 16, 2021
1 parent dcd78c7 commit 60e73fe
Showing 17 changed files with 268 additions and 124 deletions.
5 changes: 3 additions & 2 deletions docs/snippets.py
@@ -359,7 +359,6 @@ def test_update_table_expiration(client, to_delete):

# [START bigquery_update_table_expiration]
import datetime
import pytz

# from google.cloud import bigquery
# client = bigquery.Client()
@@ -371,7 +370,9 @@ def test_update_table_expiration(client, to_delete):
assert table.expires is None

# set table to expire 5 days from now
expiration = datetime.datetime.now(pytz.utc) + datetime.timedelta(days=5)
expiration = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(
days=5
)
table.expires = expiration
table = client.update_table(table, ["expires"]) # API request

4 changes: 1 addition & 3 deletions google/cloud/bigquery/table.py
@@ -20,7 +20,6 @@
import datetime
import functools
import operator
import pytz
import typing
from typing import Any, Dict, Iterable, Iterator, Optional, Tuple
import warnings
@@ -1727,7 +1726,6 @@ def to_arrow(
.. versionadded:: 1.17.0
"""
self._maybe_warn_max_results(bqstorage_client)

if not self._validate_bqstorage(bqstorage_client, create_bqstorage_client):
create_bqstorage_client = False
bqstorage_client = None
@@ -1946,7 +1944,7 @@ def to_dataframe(
# Pandas, we set the timestamp_as_object parameter to True, if necessary.
types_to_check = {
pyarrow.timestamp("us"),
pyarrow.timestamp("us", tz=pytz.UTC),
pyarrow.timestamp("us", tz=datetime.timezone.utc),
}

for column in record_batch:
3 changes: 1 addition & 2 deletions samples/client_query_w_timestamp_params.py
@@ -18,7 +18,6 @@ def client_query_w_timestamp_params():
# [START bigquery_query_params_timestamps]
import datetime

import pytz
from google.cloud import bigquery

# Construct a BigQuery client object.
@@ -30,7 +29,7 @@ def client_query_w_timestamp_params():
bigquery.ScalarQueryParameter(
"ts_value",
"TIMESTAMP",
datetime.datetime(2016, 12, 7, 8, 0, tzinfo=pytz.UTC),
datetime.datetime(2016, 12, 7, 8, 0, tzinfo=datetime.timezone.utc),
)
]
)
6 changes: 3 additions & 3 deletions samples/geography/noxfile.py
@@ -39,7 +39,7 @@

TEST_CONFIG = {
# You can opt out from the test for specific Python versions.
"ignored_versions": ["2.7"],
"ignored_versions": [],
# Old samples are opted out of enforcing Python type hints
# All new samples should feature them
"enforce_type_hints": False,
@@ -86,8 +86,8 @@ def get_pytest_env_vars() -> Dict[str, str]:


# DO NOT EDIT - automatically generated.
# All versions used to tested samples.
ALL_VERSIONS = ["2.7", "3.6", "3.7", "3.8", "3.9"]
# All versions used to test samples.
ALL_VERSIONS = ["3.6", "3.7", "3.8", "3.9"]

# Any default versions that should be ignored.
IGNORED_VERSIONS = TEST_CONFIG["ignored_versions"]
6 changes: 3 additions & 3 deletions samples/snippets/noxfile.py
@@ -39,7 +39,7 @@

TEST_CONFIG = {
# You can opt out from the test for specific Python versions.
"ignored_versions": ["2.7"],
"ignored_versions": [],
# Old samples are opted out of enforcing Python type hints
# All new samples should feature them
"enforce_type_hints": False,
@@ -86,8 +86,8 @@ def get_pytest_env_vars() -> Dict[str, str]:


# DO NOT EDIT - automatically generated.
# All versions used to tested samples.
ALL_VERSIONS = ["2.7", "3.6", "3.7", "3.8", "3.9"]
# All versions used to test samples.
ALL_VERSIONS = ["3.6", "3.7", "3.8", "3.9"]

# Any default versions that should be ignored.
IGNORED_VERSIONS = TEST_CONFIG["ignored_versions"]
2 changes: 1 addition & 1 deletion scripts/readme-gen/templates/install_deps.tmpl.rst
@@ -12,7 +12,7 @@ Install Dependencies
.. _Python Development Environment Setup Guide:
https://cloud.google.com/python/setup

#. Create a virtualenv. Samples are compatible with Python 2.7 and 3.4+.
#. Create a virtualenv. Samples are compatible with Python 3.6+.

.. code-block:: bash
4 changes: 2 additions & 2 deletions tests/data/scalars.jsonl
@@ -1,2 +1,2 @@
{"bool_col": true, "bytes_col": "abcd", "date_col": "2021-07-21", "datetime_col": "2021-07-21 11:39:45", "geography_col": "POINT(-122.0838511 37.3860517)", "int64_col": "123456789", "numeric_col": "1.23456789", "bignumeric_col": "10.111213141516171819", "float64_col": "1.25", "string_col": "Hello, World", "time_col": "11:41:43.07616", "timestamp_col": "2021-07-21T17:43:43.945289Z"}
{"bool_col": null, "bytes_col": null, "date_col": null, "datetime_col": null, "geography_col": null, "int64_col": null, "numeric_col": null, "bignumeric_col": null, "float64_col": null, "string_col": null, "time_col": null, "timestamp_col": null}
{"bool_col": true, "bytes_col": "SGVsbG8sIFdvcmxkIQ==", "date_col": "2021-07-21", "datetime_col": "2021-07-21 11:39:45", "geography_col": "POINT(-122.0838511 37.3860517)", "int64_col": "123456789", "interval_col": "P7Y11M9DT4H15M37.123456S", "numeric_col": "1.23456789", "bignumeric_col": "10.111213141516171819", "float64_col": "1.25", "rowindex": 0, "string_col": "Hello, World!", "time_col": "11:41:43.07616", "timestamp_col": "2021-07-21T17:43:43.945289Z"}
{"bool_col": null, "bytes_col": null, "date_col": null, "datetime_col": null, "geography_col": null, "int64_col": null, "interval_col": null, "numeric_col": null, "bignumeric_col": null, "float64_col": null, "rowindex": 1, "string_col": null, "time_col": null, "timestamp_col": null}
10 changes: 5 additions & 5 deletions tests/data/scalars_extreme.jsonl
@@ -1,5 +1,5 @@
{"bool_col": true, "bytes_col": "DQo=\n", "date_col": "9999-12-31", "datetime_col": "9999-12-31 23:59:59.999999", "geography_col": "POINT(-135.0000 90.0000)", "int64_col": "9223372036854775807", "numeric_col": "9.9999999999999999999999999999999999999E+28", "bignumeric_col": "9.999999999999999999999999999999999999999999999999999999999999999999999999999E+37", "float64_col": "+inf", "string_col": "Hello, World", "time_col": "23:59:59.99999", "timestamp_col": "9999-12-31T23:59:59.999999Z"}
{"bool_col": false, "bytes_col": "8J+Zgw==\n", "date_col": "0001-01-01", "datetime_col": "0001-01-01 00:00:00", "geography_col": "POINT(45.0000 -90.0000)", "int64_col": "-9223372036854775808", "numeric_col": "-9.9999999999999999999999999999999999999E+28", "bignumeric_col": "-9.999999999999999999999999999999999999999999999999999999999999999999999999999E+37", "float64_col": "-inf", "string_col": "Hello, World", "time_col": "00:00:00", "timestamp_col": "0001-01-01T00:00:00.000000Z"}
{"bool_col": true, "bytes_col": "AA==\n", "date_col": "1900-01-01", "datetime_col": "1900-01-01 00:00:00", "geography_col": "POINT(-180.0000 0.0000)", "int64_col": "-1", "numeric_col": "0.000000001", "bignumeric_col": "-0.00000000000000000000000000000000000001", "float64_col": "nan", "string_col": "こんにちは", "time_col": "00:00:00.000001", "timestamp_col": "1900-01-01T00:00:00.000000Z"}
{"bool_col": false, "bytes_col": "", "date_col": "1970-01-01", "datetime_col": "1970-01-01 00:00:00", "geography_col": "POINT(0 0)", "int64_col": "0", "numeric_col": "0.0", "bignumeric_col": "0.0", "float64_col": 0.0, "string_col": "", "time_col": "12:00:00", "timestamp_col": "1970-01-01T00:00:00.000000Z"}
{"bool_col": null, "bytes_col": null, "date_col": null, "datetime_col": null, "geography_col": null, "int64_col": null, "numeric_col": null, "bignumeric_col": null, "float64_col": null, "string_col": null, "time_col": null, "timestamp_col": null}
{"bool_col": true, "bytes_col": "DQo=\n", "date_col": "9999-12-31", "datetime_col": "9999-12-31 23:59:59.999999", "geography_col": "POINT(-135.0000 90.0000)", "int64_col": "9223372036854775807", "interval_col": "P-10000Y0M-3660000DT-87840000H0M0S", "numeric_col": "9.9999999999999999999999999999999999999E+28", "bignumeric_col": "9.999999999999999999999999999999999999999999999999999999999999999999999999999E+37", "float64_col": "+inf", "rowindex": 0, "string_col": "Hello, World", "time_col": "23:59:59.999999", "timestamp_col": "9999-12-31T23:59:59.999999Z"}
{"bool_col": false, "bytes_col": "8J+Zgw==\n", "date_col": "0001-01-01", "datetime_col": "0001-01-01 00:00:00", "geography_col": "POINT(45.0000 -90.0000)", "int64_col": "-9223372036854775808", "interval_col": "P10000Y0M3660000DT87840000H0M0S", "numeric_col": "-9.9999999999999999999999999999999999999E+28", "bignumeric_col": "-9.999999999999999999999999999999999999999999999999999999999999999999999999999E+37", "float64_col": "-inf", "rowindex": 1, "string_col": "Hello, World", "time_col": "00:00:00", "timestamp_col": "0001-01-01T00:00:00.000000Z"}
{"bool_col": true, "bytes_col": "AA==\n", "date_col": "1900-01-01", "datetime_col": "1900-01-01 00:00:00", "geography_col": "POINT(-180.0000 0.0000)", "int64_col": "-1", "interval_col": "P0Y0M0DT0H0M0.000001S", "numeric_col": "0.000000001", "bignumeric_col": "-0.00000000000000000000000000000000000001", "float64_col": "nan", "rowindex": 2, "string_col": "こんにちは", "time_col": "00:00:00.000001", "timestamp_col": "1900-01-01T00:00:00.000000Z"}
{"bool_col": false, "bytes_col": "", "date_col": "1970-01-01", "datetime_col": "1970-01-01 00:00:00", "geography_col": "POINT(0 0)", "int64_col": "0", "interval_col": "P0Y0M0DT0H0M0S", "numeric_col": "0.0", "bignumeric_col": "0.0", "float64_col": 0.0, "rowindex": 3, "string_col": "", "time_col": "12:00:00", "timestamp_col": "1970-01-01T00:00:00.000000Z"}
{"bool_col": null, "bytes_col": null, "date_col": null, "datetime_col": null, "geography_col": null, "int64_col": null, "interval_col": null, "numeric_col": null, "bignumeric_col": null, "float64_col": null, "rowindex": 4, "string_col": null, "time_col": null, "timestamp_col": null}
54 changes: 32 additions & 22 deletions tests/data/scalars_schema.json
@@ -1,33 +1,33 @@
[
{
"mode": "NULLABLE",
"name": "timestamp_col",
"type": "TIMESTAMP"
"name": "bool_col",
"type": "BOOLEAN"
},
{
"mode": "NULLABLE",
"name": "time_col",
"type": "TIME"
"name": "bignumeric_col",
"type": "BIGNUMERIC"
},
{
"mode": "NULLABLE",
"name": "float64_col",
"type": "FLOAT"
"name": "bytes_col",
"type": "BYTES"
},
{
"mode": "NULLABLE",
"name": "datetime_col",
"type": "DATETIME"
"name": "date_col",
"type": "DATE"
},
{
"mode": "NULLABLE",
"name": "bignumeric_col",
"type": "BIGNUMERIC"
"name": "datetime_col",
"type": "DATETIME"
},
{
"mode": "NULLABLE",
"name": "numeric_col",
"type": "NUMERIC"
"name": "float64_col",
"type": "FLOAT"
},
{
"mode": "NULLABLE",
@@ -36,27 +36,37 @@
},
{
"mode": "NULLABLE",
"name": "date_col",
"type": "DATE"
"name": "int64_col",
"type": "INTEGER"
},
{
"mode": "NULLABLE",
"name": "string_col",
"type": "STRING"
"name": "interval_col",
"type": "INTERVAL"
},
{
"mode": "NULLABLE",
"name": "bool_col",
"type": "BOOLEAN"
"name": "numeric_col",
"type": "NUMERIC"
},
{
"mode": "REQUIRED",
"name": "rowindex",
"type": "INTEGER"
},
{
"mode": "NULLABLE",
"name": "bytes_col",
"type": "BYTES"
"name": "string_col",
"type": "STRING"
},
{
"mode": "NULLABLE",
"name": "int64_col",
"type": "INTEGER"
"name": "time_col",
"type": "TIME"
},
{
"mode": "NULLABLE",
"name": "timestamp_col",
"type": "TIMESTAMP"
}
]
35 changes: 29 additions & 6 deletions tests/system/test_arrow.py
@@ -14,9 +14,14 @@

"""System tests for Arrow connector."""

from typing import Optional

import pyarrow
import pytest

from google.cloud import bigquery
from google.cloud.bigquery import enums


@pytest.mark.parametrize(
("max_results", "scalars_table_name"),
@@ -28,17 +33,35 @@
),
)
def test_list_rows_nullable_scalars_dtypes(
bigquery_client,
scalars_table,
scalars_extreme_table,
max_results,
scalars_table_name,
bigquery_client: bigquery.Client,
scalars_table: str,
scalars_extreme_table: str,
max_results: Optional[int],
scalars_table_name: str,
):
table_id = scalars_table
if scalars_table_name == "scalars_extreme_table":
table_id = scalars_extreme_table

# TODO(GH#836): Avoid INTERVAL columns until they are supported by the
# BigQuery Storage API and pyarrow.
schema = [
bigquery.SchemaField("bool_col", enums.SqlTypeNames.BOOLEAN),
bigquery.SchemaField("bignumeric_col", enums.SqlTypeNames.BIGNUMERIC),
bigquery.SchemaField("bytes_col", enums.SqlTypeNames.BYTES),
bigquery.SchemaField("date_col", enums.SqlTypeNames.DATE),
bigquery.SchemaField("datetime_col", enums.SqlTypeNames.DATETIME),
bigquery.SchemaField("float64_col", enums.SqlTypeNames.FLOAT64),
bigquery.SchemaField("geography_col", enums.SqlTypeNames.GEOGRAPHY),
bigquery.SchemaField("int64_col", enums.SqlTypeNames.INT64),
bigquery.SchemaField("numeric_col", enums.SqlTypeNames.NUMERIC),
bigquery.SchemaField("string_col", enums.SqlTypeNames.STRING),
bigquery.SchemaField("time_col", enums.SqlTypeNames.TIME),
bigquery.SchemaField("timestamp_col", enums.SqlTypeNames.TIMESTAMP),
]

arrow_table = bigquery_client.list_rows(
table_id, max_results=max_results,
table_id, max_results=max_results, selected_fields=schema,
).to_arrow()

schema = arrow_table.schema
53 changes: 5 additions & 48 deletions tests/system/test_client.py
@@ -1962,6 +1962,11 @@ def test_query_w_query_params(self):
"expected": {"friends": [phred_name, bharney_name]},
"query_parameters": [with_friends_param],
},
{
"sql": "SELECT @bignum_param",
"expected": bignum,
"query_parameters": [bignum_param],
},
]

for example in examples:
@@ -2406,54 +2411,6 @@ def test_nested_table_to_arrow(self):
self.assertTrue(pyarrow.types.is_list(record_col[1].type))
self.assertTrue(pyarrow.types.is_int64(record_col[1].type.value_type))

def test_list_rows_empty_table(self):
from google.cloud.bigquery.table import RowIterator

dataset_id = _make_dataset_id("empty_table")
dataset = self.temp_dataset(dataset_id)
table_ref = dataset.table("empty_table")
table = Config.CLIENT.create_table(bigquery.Table(table_ref))

# It's a bit silly to list rows for an empty table, but this does
# happen as the result of a DDL query from an IPython magic command.
rows = Config.CLIENT.list_rows(table)
self.assertIsInstance(rows, RowIterator)
self.assertEqual(tuple(rows), ())

def test_list_rows_page_size(self):
from google.cloud.bigquery.job import SourceFormat
from google.cloud.bigquery.job import WriteDisposition

num_items = 7
page_size = 3
num_pages, num_last_page = divmod(num_items, page_size)

SF = bigquery.SchemaField
schema = [SF("string_col", "STRING", mode="NULLABLE")]
to_insert = [{"string_col": "item%d" % i} for i in range(num_items)]
rows = [json.dumps(row) for row in to_insert]
body = io.BytesIO("{}\n".format("\n".join(rows)).encode("ascii"))

table_id = "test_table"
dataset = self.temp_dataset(_make_dataset_id("nested_df"))
table = dataset.table(table_id)
self.to_delete.insert(0, table)
job_config = bigquery.LoadJobConfig()
job_config.write_disposition = WriteDisposition.WRITE_TRUNCATE
job_config.source_format = SourceFormat.NEWLINE_DELIMITED_JSON
job_config.schema = schema
# Load a table using a local JSON file from memory.
Config.CLIENT.load_table_from_file(body, table, job_config=job_config).result()

df = Config.CLIENT.list_rows(table, selected_fields=schema, page_size=page_size)
pages = df.pages

for i in range(num_pages):
page = next(pages)
self.assertEqual(page.num_items, page_size)
page = next(pages)
self.assertEqual(page.num_items, num_last_page)

def temp_dataset(self, dataset_id, location=None):
project = Config.CLIENT.project
dataset_ref = bigquery.DatasetReference(project, dataset_id)
(The remaining 6 changed files are not shown.)