
Commit

Merge branch 'yanhuil/trace_id' of https://github.com/GoogleCloudPlatform/google-cloud-python into yanhuil/trace_id
liyanhui1228 committed Jun 2, 2017
2 parents 14c2545 + 2889cfc commit 53738d7
Showing 154 changed files with 687 additions and 526 deletions.
2 changes: 1 addition & 1 deletion README.rst
@@ -123,7 +123,7 @@ Check out the `Authentication section`_ in our documentation to learn more.
You may also find the `authentication document`_ shared by all the
``google-cloud-*`` libraries to be helpful.

.. _Authentication section: https://google-cloud-python.readthedocs.io/en/latest/google-cloud-auth.html
.. _Authentication section: https://google-cloud-python.readthedocs.io/en/latest/core/auth.html
.. _authentication document: https://github.com/GoogleCloudPlatform/gcloud-common/tree/master/authentication

Contributing
14 changes: 14 additions & 0 deletions bigquery/google/cloud/bigquery/query.py
@@ -226,6 +226,20 @@ def total_bytes_processed(self):
        if total_bytes_processed is not None:
            return int(total_bytes_processed)

    @property
    def num_dml_affected_rows(self):
        """Total number of rows affected by a DML query.

        See:
        https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query#numDmlAffectedRows

        :rtype: int, or ``NoneType``
        :returns: Count generated on the server (None until set by the server).
        """
        num_dml_affected_rows = self._properties.get('numDmlAffectedRows')
        if num_dml_affected_rows is not None:
            return int(num_dml_affected_rows)
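The new property follows the library's convention for 64-bit integer fields: the BigQuery REST API serializes them as JSON strings, so the getter coerces through `int()` and implicitly returns `None` until the server sets the field. A self-contained sketch of the same pattern (the class below is an illustrative stand-in, not the real `QueryResults`):

```python
class QueryResultsStub:
    """Illustrative stand-in for the real QueryResults class."""

    def __init__(self, properties):
        # Raw JSON resource as returned by the BigQuery REST API.
        self._properties = properties

    @property
    def num_dml_affected_rows(self):
        # int64 fields arrive as JSON strings; coerce explicitly.
        value = self._properties.get('numDmlAffectedRows')
        if value is not None:
            return int(value)
        # Implicitly returns None while the field is unset.


print(QueryResultsStub({'numDmlAffectedRows': '123'}).num_dml_affected_rows)  # 123
print(QueryResultsStub({}).num_dml_affected_rows)  # None
```

The same coercion appears in `total_bytes_processed` above; keeping it in the getter lets `_properties` hold the raw server response untouched.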

    @property
    def rows(self):
        """Query results.
30 changes: 30 additions & 0 deletions bigquery/tests/unit/test_query.py
@@ -70,6 +70,7 @@ def _makeResource(self, complete=False):
        ]
        resource['pageToken'] = self.TOKEN
        resource['totalBytesProcessed'] = 100000
        resource['numDmlAffectedRows'] = 123
        resource['cacheHit'] = False

        return resource
@@ -124,10 +125,12 @@ def _verifyResourceProperties(self, query, resource):
        self.assertEqual(query.complete, resource.get('jobComplete'))
        self.assertEqual(query.errors, resource.get('errors'))
        self.assertEqual(query.page_token, resource.get('pageToken'))

        if 'totalRows' in resource:
            self.assertEqual(query.total_rows, int(resource['totalRows']))
        else:
            self.assertIsNone(query.total_rows)

        if 'totalBytesProcessed' in resource:
            self.assertEqual(query.total_bytes_processed,
                             int(resource['totalBytesProcessed']))
@@ -139,6 +142,12 @@ def _verifyResourceProperties(self, query, resource):
        else:
            self.assertIsNone(query.name)

        if 'numDmlAffectedRows' in resource:
            self.assertEqual(query.num_dml_affected_rows,
                             int(resource['numDmlAffectedRows']))
        else:
            self.assertIsNone(query.num_dml_affected_rows)

        self._verify_udf_resources(query, resource)
        self._verifyQueryParameters(query, resource)
        self._verifySchema(query, resource)
@@ -371,6 +380,27 @@ def test_total_bytes_processed_present_string(self):
        query._set_properties(resource)
        self.assertEqual(query.total_bytes_processed, TOTAL_BYTES_PROCESSED)

    def test_num_dml_affected_rows_missing(self):
        client = _Client(self.PROJECT)
        query = self._make_one(self.QUERY, client)
        self.assertIsNone(query.num_dml_affected_rows)

    def test_num_dml_affected_rows_present_integer(self):
        DML_AFFECTED_ROWS = 123456
        client = _Client(self.PROJECT)
        query = self._make_one(self.QUERY, client)
        resource = {'numDmlAffectedRows': DML_AFFECTED_ROWS}
        query._set_properties(resource)
        self.assertEqual(query.num_dml_affected_rows, DML_AFFECTED_ROWS)

    def test_num_dml_affected_rows_present_string(self):
        DML_AFFECTED_ROWS = 123456
        client = _Client(self.PROJECT)
        query = self._make_one(self.QUERY, client)
        resource = {'numDmlAffectedRows': str(DML_AFFECTED_ROWS)}
        query._set_properties(resource)
        self.assertEqual(query.num_dml_affected_rows, DML_AFFECTED_ROWS)

    def test_schema(self):
        client = _Client(self.PROJECT)
        query = self._make_one(self.QUERY, client)
4 changes: 2 additions & 2 deletions docs/bigquery-client.rst → docs/bigquery/client.rst
@@ -1,5 +1,5 @@
BigQuery Client
===============
Client
======

.. automodule:: google.cloud.bigquery.client
   :members:
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
57 changes: 34 additions & 23 deletions docs/bigquery-usage.rst → docs/bigquery/usage.rst
@@ -1,5 +1,16 @@
Using the API
=============
BigQuery
========

.. toctree::
   :maxdepth: 2
   :hidden:

   client
   dataset
   job
   query
   schema
   table

Authentication / Configuration
------------------------------
@@ -71,31 +82,31 @@ Dataset operations

List datasets for the client's project:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START client_list_datasets]
:end-before: [END client_list_datasets]

Create a new dataset for the client's project:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START dataset_create]
:end-before: [END dataset_create]

Check for the existence of a dataset:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START dataset_exists]
:end-before: [END dataset_exists]

Refresh metadata for a dataset (to pick up changes made by another client):

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START dataset_reload]
:end-before: [END dataset_reload]

Patch metadata for a dataset:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START dataset_patch]
:end-before: [END dataset_patch]

@@ -114,7 +125,7 @@ Replace the ACL for a dataset, and update all writeable fields:
Delete a dataset:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START dataset_delete]
:end-before: [END dataset_delete]

@@ -124,61 +135,61 @@ Tables

Tables exist within datasets. List tables for the dataset:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START dataset_list_tables]
:end-before: [END dataset_list_tables]

Create a table:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START table_create]
:end-before: [END table_create]

Check for the existence of a table:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START table_exists]
:end-before: [END table_exists]

Refresh metadata for a table (to pick up changes made by another client):

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START table_reload]
:end-before: [END table_reload]

Patch specific properties for a table:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START table_patch]
:end-before: [END table_patch]

Update all writable metadata for a table

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START table_update]
:end-before: [END table_update]

Get rows from a table's data:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START table_fetch_data]
:end-before: [END table_fetch_data]

Insert rows into a table's data:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START table_insert_data]
:end-before: [END table_insert_data]

Upload table data from a file:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START table_upload_from_file]
:end-before: [END table_upload_from_file]

Delete a table:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START table_delete]
:end-before: [END table_delete]

@@ -195,7 +206,7 @@ Jobs describe actions performed on data in BigQuery tables:

List jobs for a project:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START client_list_jobs]
:end-before: [END client_list_jobs]

@@ -205,29 +216,29 @@ Querying data (synchronous)

Run a query which can be expected to complete within bounded time:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START client_run_sync_query]
:end-before: [END client_run_sync_query]

Run a query using a named query parameter:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START client_run_sync_query_w_param]
:end-before: [END client_run_sync_query_w_param]

If the rows returned by the query do not fit into the initial response,
then we need to fetch the remaining rows via
:meth:`~google.cloud.bigquery.query.QueryResults.fetch_data`:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START client_run_sync_query_paged]
:end-before: [END client_run_sync_query_paged]

If the query takes longer than the timeout allowed, ``query.complete``
will be ``False``. In that case, we need to poll the associated job until
it is done, and then fetch the results:

.. literalinclude:: bigquery_snippets.py
.. literalinclude:: snippets.py
:start-after: [START client_run_sync_query_timeout]
:end-before: [END client_run_sync_query_timeout]

@@ -23,7 +23,7 @@ Configuration
-------------

- For an overview of authentication in ``google-cloud-python``,
see :doc:`google-cloud-auth`.
see :doc:`/core/auth`.

- In addition to any authentication configuration, you can also set the
:envvar:`GOOGLE_CLOUD_PROJECT` environment variable for the Google Cloud Console
@@ -84,7 +84,7 @@ After a :class:`Client <google.cloud.bigtable.client.Client>`, the next highest-
object is an :class:`Instance <google.cloud.bigtable.instance.Instance>`. You'll need
one before you can interact with tables or data.

Head next to learn about the :doc:`bigtable-instance-api`.
Head next to learn about the :doc:`instance-api`.

.. _Instance Admin: https://github.com/GoogleCloudPlatform/cloud-bigtable-client/tree/master/bigtable-protos/src/main/proto/google/bigtable/admin/instance/v1
.. _Table Admin: https://github.com/GoogleCloudPlatform/cloud-bigtable-client/tree/master/bigtable-protos/src/main/proto/google/bigtable/admin/table/v1
File renamed without changes.
File renamed without changes.
File renamed without changes.
2 changes: 1 addition & 1 deletion docs/bigtable-data-api.rst → docs/bigtable/data-api.rst
@@ -7,7 +7,7 @@ column families, you are ready to store and retrieve data.
Cells vs. Columns vs. Column Families
+++++++++++++++++++++++++++++++++++++

* As explained in the :doc:`table overview <bigtable-table-api>`, tables can
* As explained in the :doc:`table overview <table-api>`, tables can
have many column families.
* As described below, a table can also have many rows which are
specified by row keys.
@@ -123,7 +123,7 @@ Now we go down the hierarchy from
:class:`Instance <google.cloud.bigtable.instance.Instance>` to a
:class:`Table <google.cloud.bigtable.table.Table>`.

Head next to learn about the :doc:`bigtable-table-api`.
Head next to learn about the :doc:`table-api`.

.. _Instance Admin API: https://cloud.google.com/bigtable/docs/creating-instance
.. _CreateInstance: https://github.com/GoogleCloudPlatform/cloud-bigtable-client/blob/2aae624081f652427052fb652d3ae43d8ac5bf5a/bigtable-protos/src/main/proto/google/bigtable/admin/instance/v1/bigtable_instance_service.proto#L66-L68
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
4 changes: 2 additions & 2 deletions docs/bigtable-table-api.rst → docs/bigtable/table-api.rst
@@ -98,7 +98,7 @@ or similar):
This rule helps the backend determine when and how to clean up old cells
in the column family.

See :doc:`bigtable-column-family` for more information about
See :doc:`column-family` for more information about
:class:`GarbageCollectionRule <google.cloud.bigtable.column_family.GarbageCollectionRule>`
and related classes.

@@ -141,7 +141,7 @@ Now we go down the final step of the hierarchy from
:class:`Row <google.cloud.bigtable.row.Row>` as well as streaming
data directly via a :class:`Table <google.cloud.bigtable.table.Table>`.

Head next to learn about the :doc:`bigtable-data-api`.
Head next to learn about the :doc:`data-api`.

.. _ListTables: https://github.com/GoogleCloudPlatform/cloud-bigtable-client/blob/2aae624081f652427052fb652d3ae43d8ac5bf5a/bigtable-protos/src/main/proto/google/bigtable/admin/table/v1/bigtable_table_service.proto#L40-L42
.. _CreateTable: https://github.com/GoogleCloudPlatform/cloud-bigtable-client/blob/2aae624081f652427052fb652d3ae43d8ac5bf5a/bigtable-protos/src/main/proto/google/bigtable/admin/table/v1/bigtable_table_service.proto#L35-L37
File renamed without changes.
23 changes: 20 additions & 3 deletions docs/bigtable-usage.rst → docs/bigtable/usage.rst
@@ -1,13 +1,30 @@
Using the API
=============
Bigtable
========

.. toctree::
   :maxdepth: 2
   :hidden:

   client-intro
   client
   cluster
   instance
   instance-api
   table
   table-api
   column-family
   row
   row-data
   row-filters
   data-api

API requests are sent to the `Google Cloud Bigtable`_ API via RPC over HTTP/2.
In order to support this, we'll rely on `gRPC`_. We are working with the gRPC
team to rapidly make the install story more user-friendly.

Get started by learning about the
:class:`Client <google.cloud.bigtable.client.Client>` on the
:doc:`bigtable-client-intro` page.
:doc:`client-intro` page.

In the hierarchy of API concepts

11 changes: 8 additions & 3 deletions docs/google-cloud-auth.rst → docs/core/auth.rst
@@ -87,14 +87,19 @@ However, you may want to be explicit because
from different projects

In these situations, you can create an explicit
:class:`~google.auth.credentials.Credentials` object suited to your
environment. After creation, you can pass it directly to a
:class:`Client <google.cloud.client.Client>`:
:class:`~google.auth.credentials.Credentials` object suited to your environment.
After creation, you can pass it directly to a :class:`Client <google.cloud.client.Client>`:

.. code:: python

    client = Client(credentials=credentials)

.. tip::

    To create a credentials object, follow the `google-auth-guide`_.

.. _google-auth-guide: https://google-auth.readthedocs.io/en/latest/user-guide.html#service-account-private-key-files


Google App Engine Environment
-----------------------------

2 changes: 1 addition & 1 deletion docs/google-cloud-config.rst → docs/core/config.rst
@@ -43,7 +43,7 @@ Authentication
==============

The authentication credentials can be implicitly determined from the
environment or directly. See :doc:`google-cloud-auth`.
environment or directly. See :doc:`/core/auth`.

Logging in via ``gcloud beta auth application-default login`` will
automatically configure a JSON key file with your default project ID and