
fix(bigquery): write pandas datetime[ns] columns to BigQuery TIMESTAMP columns #10028

Conversation

@tswast (Contributor) commented Dec 30, 2019

Also:

  • Enable TIMESTAMP and DATETIME unit tests for _pandas_helpers.
  • Add more data types to load dataframe sample.

Fixes #9996 🦕
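
The "more data types" bullet can be illustrated with a small pandas frame mixing the column kinds such a load-dataframe sample exercises (a sketch only; the column names here are hypothetical, and the actual columns in the PR's sample may differ). Note the timestamp column is made timezone-aware so it maps to a BigQuery TIMESTAMP rather than DATETIME:

```python
import pandas as pd

# Hypothetical frame covering several pandas dtypes that map to BigQuery
# types: object -> STRING, int64 -> INT64, float64 -> FLOAT64,
# bool -> BOOL, datetime64[ns, UTC] -> TIMESTAMP.
df = pd.DataFrame(
    {
        "title": ["Alpha", "Beta"],
        "release_year": [1999, 2003],
        "rating": [8.5, 7.2],
        "is_sequel": [False, True],
        "released_at": pd.to_datetime(
            ["1999-03-31 00:00:00", "2003-05-15 00:00:00"]
        ).tz_localize("UTC"),  # tz-aware -> BigQuery TIMESTAMP
    }
)
print(df.dtypes)
```

A frame like this would then be passed to `Client.load_table_from_dataframe`, which is what this fix teaches to serialize `datetime64[ns]` data correctly.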

@googlebot googlebot added the cla: yes This human has signed the Contributor License Agreement. label Dec 30, 2019
@tswast tswast changed the title doc(bigquery): add more data types to load dataframe sample fix(bigquery): interpret datetime columns from pandas dataframe as nanoseconds Jan 2, 2020
@tswast tswast force-pushed the issue9996-load_table_from_dataframe-sample-tests branch from e10f755 to ee48df8 Compare January 2, 2020 20:32
fix(bigquery): write pandas datetime[ns] columns to BigQuery TIMESTAMP columns

Also:

* Enable TIMESTAMP and DATETIME unit tests for `_pandas_helpers`.
* Add more data types to load dataframe sample.
@tswast tswast force-pushed the issue9996-load_table_from_dataframe-sample-tests branch from ee48df8 to 6965558 Compare January 2, 2020 23:13
@tswast tswast changed the title fix(bigquery): interpret datetime columns from pandas dataframe as nanoseconds fix(bigquery): write pandas datetime[ns] columns to BigQuery TIMESTAMP columns Jan 2, 2020
@tswast tswast marked this pull request as ready for review January 2, 2020 23:18
@tswast tswast requested review from a team and plamut January 2, 2020 23:18
Pandas doesn't automatically convert datetime objects to UTC time, so
show how to do this in the code sample.
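
The commit note above can be sketched in pandas (an illustrative sketch, not the exact sample code from this PR): a naive `datetime64[ns]` column carries no timezone, so it must be localized to UTC before BigQuery can treat the values as a UTC TIMESTAMP.

```python
import pandas as pd

# A naive datetime64[ns] column: no timezone information attached.
df = pd.DataFrame(
    {"launched": pd.to_datetime(["2019-12-30 12:00:00", "2020-01-02 23:13:00"])}
)
assert str(df["launched"].dtype) == "datetime64[ns]"  # naive

# Pandas does not convert to UTC automatically; localize explicitly so the
# values map to a BigQuery TIMESTAMP column.
df["launched"] = df["launched"].dt.tz_localize("UTC")
print(df["launched"].dtype)  # datetime64[ns, UTC]
```

If the source timestamps are in a local zone rather than UTC, `tz_localize` with that zone followed by `tz_convert("UTC")` gives the same tz-aware result.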
@tswast (Contributor, Author) commented Jan 3, 2020

The failure is in the docs build session and appears to be a temporary flake fetching the gax docs inventory:

sphinx.errors.SphinxWarning: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://gax-python.readthedocs.org/en/latest/objects.inv' not fetchable due to <class 'requests.exceptions.ConnectionError'>: HTTPSConnectionPool(host='gax-python.readthedocs.org', port=443): Max retries exceeded with url: /en/latest/objects.inv (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fdef467d050>: Failed to establish a new connection: [Errno 110] Connection timed out'))

@plamut plamut added the kokoro:force-run Add this label to force Kokoro to re-run the tests. label Jan 9, 2020
@yoshi-kokoro yoshi-kokoro removed the kokoro:force-run Add this label to force Kokoro to re-run the tests. label Jan 9, 2020
@plamut (Contributor) left a comment


We need to fix an undefined name that causes the snippets and lint checks to fail. Seems good otherwise.

@tswast tswast requested a review from plamut January 10, 2020 15:38
@plamut (Contributor) left a comment


LGTM

Labels
cla: yes This human has signed the Contributor License Agreement.
Development

Successfully merging this pull request may close these issues.

BigQuery: DATETIME columns invalid when uploaded with load_table_from_dataframe
4 participants