Current plan of attack:
- Remove `--plugin=protoc-gen-grpc=$(GRPC_PLUGIN)` from "helper" modules that don't use gRPC (e.g. `google.bigtable.v1.bigtable_data`) (this way sub-packages that don't need gRPC can use them) (Updating Makefile to only use gRPC when needed. #1354)
- Ditch shared generated modules in favor of those included with `protobuf` and `googleapis-common-protos` (Use googleapis common protos #1353)
- Move shared (i.e. non-Bigtable-specific) modules from `gcloud.bigtable._generated` to `gcloud._generated` (this way all sub-packages can have access to shared protobuf message classes) and then update the `bigtable` imports to accommodate this change
- Pull out `_get_pb_property_value` in `gcloud.bigtable.cluster` into core (`gcloud._helpers`) and separate out a `_has_field` piece, since `pb_message.HasField` doesn't work on non-message fields in `proto3` (see the `_has_field` sketch after this list) (Moving _get_pb_property_value from bigtable into core. #1329)
- Update `Makefile` to reflect the changes above (i.e. make sure it can be run without changing the code) (See Bringing auto-gen import re-writing up to date with files. #1316)
- Update `Makefile` to automatically add the `_foo.proto` files to the repo (See Making pb2 auto-gen script also copy over .proto files. #1317)
- Fold `Batch.add_auto_id_entity` into `Batch.put` (the `auto_id_entity` mutation is removed in favor of just using `insert` with a partial key) (see Removing datastore Batch.add_auto_id_entity. #1296)
- Loosen the `Batch` implementation's reliance on the `Mutation` message class (it is used in a `LookupRequest` and their combined structure changes in `v1beta3`) (See Renaming datastore Batch.mutation to mutations. #1306, Puting helpers in datastore Batch for getting new mutations. #1319, Using protobuf CommitRequest in datastore Connection.commit. #1341)
- Remove use of `serializable` in `Client.transaction` and `Transaction` (it is no longer an option in `v1beta3`, so we will curtail its use in `v1beta2` before the switch) (See Removing serializable option from datastore Transaction. #1294)
- Manually partition the `_datastore_v1_pb2.py` module into 3 modules that just import portions of the namespace. These modules (`datastore_pb2.py`, `entity_pb2.py`, `query_pb2.py`) will be organized according to membership in the new proto definitions for `v1beta3`; a sketch of such a shim module appears after this list. (See Replacing datastore pb uses with entity shim. #1297, Replacing datastore pb uses with query shim. #1299, Replacing datastore pb uses with "datastore" shim. #1301, Creating _generated pb directory for datastore #1328)
- Switch imports over from directly using `_datastore_v1_pb2.py` to using our "shim" imports to mock the structure of `v1beta3`. (See Replacing datastore pb uses with entity shim. #1297, Replacing datastore pb uses with query shim. #1299, Replacing datastore pb uses with "datastore" shim. #1301, Creating _generated pb directory for datastore #1328)
- Make sure `API_BASE_URL` for the `Connection` class is used (rather than a parent's version) (See Explicitly using API_BASE_URL from current connection in datastore. #1293)
- Change `Client.dataset_id` to `Client.project` (in advance of the rename to `PartitionId.project_id` in `v1beta3`) (Replace dataset id with project in datastore #1330)
- Change the return type of `Connection.commit` to be a tuple of `index_updates` and `mutation_results` (they were previously on the same result object but are being split apart in `v1beta3`; see the commit-response sketch after this list) (See Reducing datastore commit reliance on structure of response. #1314)
- Replace uses of `HasField` for non-message values with `_has_field` (see above) (Moving _get_pb_property_value from bigtable into core. #1329)
- Remove the `v1beta2` generated `pb2` file and the old `.proto` definition (can also remove `_datastore_v1_pb2.py` from the `pylintrc_default` ignored files) and delete our "shim" modules (Upgrading Makefile to generate datastore v1beta3. #1355, Upgrading Makefile to generate datastore v1beta3. #1428)
- Update `Makefile` to incorporate protobuf definitions for `v1beta3` (I did this in a side project; Use googleapis common protos #1353, Updating Makefile to only use gRPC when needed. #1354, Upgrading Makefile to generate datastore v1beta3. #1355, Upgrading Makefile to generate datastore v1beta3. #1428)
- Rewrite our "shim" imports to use the actual `datastore._generated` files for `v1beta3`.
- Switch `API_BASE_URL` from `https://www.googleapis.com` to `https://datastore.googleapis.com` and drop `https://www.googleapis.com/auth/userinfo.email` from the scope list (Updating datastore URI template for v1beta3. #1339, Updating datastore URI template for v1beta3. #1406)
- Accommodate renamed / retyped / removed fields:
  - `CommitRequest.mutation --> CommitRequest.mutations` (Updating CommitRequest, Mutation and helpers for v1beta3. #1461)
  - `LookupRequest.key --> LookupRequest.keys` (Handling datastore renames key -> keys #1358, Handling datastore renames key -> keys #1456)
  - `AllocateIdsResponse.key --> AllocateIdsResponse.keys` (Handling datastore renames key -> keys #1358, Handling datastore renames key -> keys #1456)
  - `AllocateIdsRequest.key --> AllocateIdsRequest.keys` (Handling datastore renames key -> keys #1358, Handling datastore renames key -> keys #1456)
  - `Key.path_element --> Key.path` (Renaming path_element->path in Key. #1360, Renaming path_element->path in Key. #1457)
  - `Entity.property --> Entity.properties` (Upgrading Entity.property to properties map in datastore. #1458)
  - `Query.group_by --> Query.distinct_on` (Handling datastore renames on Query and QueryResultBatch. #1357, Handling datastore renames on Query and QueryResultBatch. #1455)
  - `Query.limit --> Query.limit.value` (Handling datastore renames on Query and QueryResultBatch. #1357, Handling datastore renames on Query and QueryResultBatch. #1455)
  - `QueryResultBatch.entity_result --> QueryResultBatch.entity_results` (Handling datastore renames on Query and QueryResultBatch. #1357, Handling datastore renames on Query and QueryResultBatch. #1455)
  - `PartitionId.namespace --> PartitionId.namespace_id` (Handle datastore renames on PartitionId #1359, Handle datastore renames on PartitionId #1452)
  - `PartitionId.dataset_id --> PartitionId.project_id` (Handle datastore renames on PartitionId #1359, Handle datastore renames on PartitionId #1452)
  - `Value.indexed --> Value.exclude_from_indexes` (Rename Value.indexed->exclude_from_indexes. #1365, Rename Value.indexed->exclude_from_indexes. #1453)
  - `Value.list_value --> Value.array_value` (Upgrading list_value -> array_value for v1beta3. #1460)
  - `Value.null_value` added (Adding support for null and geo point values in v1beta3. #1464)
  - `Value.timestamp_microseconds_value --> Value.timestamp_value` (with type change) (Moving _pb_timestamp_to_datetime into core. #1361, Upgrading timestamp_microseconds_value to timestamp_value. #1459)
  - `Value.geo_point_value` added (Adding support for null and geo point values in v1beta3. #1464)
  - `Value.blob_key_value` removed
  - `CompositeFilter.filter --> CompositeFilter.filters` (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
  - `CompositeFilter.operation --> CompositeFilter.op` (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
  - `PropertyFilter.operation --> PropertyFilter.op` (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
  - `CompositeFilter.AND --> CompositeFilter.OPERATOR_UNSPECIFIED` (as default value) (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
  - `ReadOptions.DEFAULT --> ReadOptions.READ_CONSISTENCY_UNSPECIFIED` (as default value) (Handling datastore renames on CompositeFilter and PropertyFilter. #1356, Handling datastore renames on CompositeFilter and PropertyFilter. #1454)
- Deal with the fact that `Entity.property` is now a `map` called `Entity.properties` (somewhat different semantics; see the properties sketch after this list) (Adding helpers for interacting with properties in Entity protobuf. #1340)
- Tear out dataset (project) prefix code (Removing hacks that avoid using project ID in key protos. #1466)
- Re-vamp environment variable usage (`dataset_id` no longer needed?) (Removing custom dataset ID environment variable. #1465)
- Remove use of `isolation_level` in `Connection.begin_transaction` (Removing use of isolation level in datastore. #1343, Removing use of isolation level in datastore. #1407)
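
As referenced above, `pb_message.HasField` cannot be called on scalar (non-message) fields in proto3. A minimal sketch of the `_has_field` helper, assuming it ends up in `gcloud._helpers`; this illustrates the idea rather than the exact implementation:

```python
def _has_field(pb, field_name):
    """Check whether ``field_name`` is set on ``pb``.

    proto3 only supports ``HasField`` for message (and oneof) fields,
    so for scalar fields we fall back to comparing the current value
    against the field's default.
    """
    try:
        return pb.HasField(field_name)
    except ValueError:
        # Scalar proto3 field: treat "set" as "different from the default".
        default = pb.DESCRIPTOR.fields_by_name[field_name].default_value
        return getattr(pb, field_name) != default
```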
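
The "shim" modules mentioned above only need to re-export slices of the monolithic v1beta2 module under the layout that v1beta3 will use. A rough sketch of what an `entity_pb2.py` shim might contain (the module path and the exact class list here are illustrative):

```python
# Hypothetical shim, e.g. gcloud/datastore/_generated/entity_pb2.py:
# re-export only the entity-related messages from the monolithic
# v1beta2 module so callers can already import them the way the
# v1beta3 generated code will be laid out.
from gcloud.datastore._datastore_v1_pb2 import (
    Entity,
    Key,
    PartitionId,
    Property,
    Value,
)
```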
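
For the `Connection.commit` change, the response just needs to be unpacked before it is returned, so callers stop depending on the response layout. A hedged sketch, assuming the commit response exposes `index_updates` and a repeated `mutation_results` field (the helper name is illustrative):

```python
def _parse_commit_response(commit_response_pb):
    """Split a commit response into ``(index_updates, mutation_results)``.

    Returning a plain tuple keeps callers from poking at the response
    object, whose structure changes between v1beta2 and v1beta3.
    """
    return (
        commit_response_pb.index_updates,
        list(commit_response_pb.mutation_results),
    )
```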
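
The `Entity.property` / `Entity.properties` change is more than a rename: v1beta2 uses a repeated `Property` message with explicit `name` and `value` fields, while v1beta3 uses a protobuf map keyed by the property name. A sketch of the difference (the helper names are illustrative):

```python
def _set_string_property_v1beta2(entity_pb, name, value):
    # v1beta2: ``Entity.property`` is a repeated ``Property {name, value}``.
    prop = entity_pb.property.add()
    prop.name = name
    prop.value.string_value = value


def _set_string_property_v1beta3(entity_pb, name, value):
    # v1beta3: ``Entity.properties`` is a ``map<string, Value>``, so the
    # property name is the map key and there is no ``Property`` wrapper.
    entity_pb.properties[name].string_value = value
```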
New Features:
- Add `GqlQuery` support (Missing GQL support #304); see the sketch after this list
- `QueryResultBatch.skipped_cursor` added (only set when `skipped_results != 0`)
- `EntityResult.cursor` added (set by the backend when the entity result is part of a `RunQueryResponse.batch.entity_results` response)
- Parse protobuf errors
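
A hedged sketch of what `GqlQuery` support could build on once the v1beta3 modules are generated, assuming they are importable as `gcloud.datastore._generated.datastore_pb2` and that `RunQueryRequest` carries a `gql_query` field with `query_string` and `named_bindings` (the project ID is a placeholder and the transport call is omitted):

```python
from gcloud.datastore._generated import datastore_pb2

request = datastore_pb2.RunQueryRequest()
request.project_id = 'my-project'  # placeholder
gql = request.gql_query
gql.query_string = 'SELECT * FROM Person WHERE name = @name'
gql.named_bindings['name'].value.string_value = 'Alice'
# Sending ``request`` over the datastore Connection is elided here.
```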