
Storing an embedded entity ignores exclude_from_indexes from the embedded entity #1206

Closed
igama opened this issue Oct 28, 2015 · 11 comments
Labels: api: datastore (Issues related to the Datastore API.)

@igama

igama commented Oct 28, 2015

I'm trying to have an embedded entity with a field that is larger than 1500 bytes. I add that field to the exclude_from_indexes key of the embedded entity, but when I try to save the parent entity it tells me the field is bigger than 1500 bytes. If I save the embedded entity independently, it works.

Is exclude_from_indexes ignored on embedded entities?

from gcloud import datastore

# projectID, my_id and another_id are placeholders.
client = datastore.Client(dataset_id=projectID)
record_key = client.key('Record', my_id)
record_entity = datastore.Entity(record_key)

embedded_key = client.key('Data', another_id)
embedded_entity = datastore.Entity(key=embedded_key, exclude_from_indexes=('big_field',))
embedded_entity['field1'] = '1234'
embedded_entity['big_field'] = 'large string bigger than 1500 bytes'  # in practice, a value over 1500 bytes

record_entity['RandomFieldName'] = embedded_entity

client.put(record_entity)
# Error: gcloud.exceptions.BadRequest: 400 The value of property "big_field" is longer than 1500 bytes.

client.put(embedded_entity)
# No error

@dhermes
Contributor

dhermes commented Oct 28, 2015

@pcostell What's the expected behavior here?

@pcostell
Contributor

This should succeed. I am able to get this to work successfully directly from the API (in a commit):

{
 "mode": "NON_TRANSACTIONAL",
 "mutation": {
  "insertAutoId": [
   {
    "properties": {
     "embedded": {
      "entityValue": {
       "properties": {
        "long_prop": {
         "stringValue": "a string longer than 1500 bytes", # I just used 'a'*1501
         "indexed": false
        }
       }
      }
     }
    },
    "key": {
     "path": [
      {
       "kind": "ModelWithEmbedded"
      }
     ]
    }
   }
  ]
 }
}

Gives me a 200.

@pcostell
Contributor

It looks like information about exclude_from_indexes is not propagated in embedded entities: https://github.com/GoogleCloudPlatform/gcloud-python/blob/c8d231ba7462e2f35d689b526a70cef4712c565a/gcloud/datastore/batch.py#L282
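
In other words, the serializer has to recurse into entity-valued properties and carry each nested entity's own exclude_from_indexes along, instead of only honoring the top-level entity's exclusions. A simplified, self-contained sketch of that idea (not the actual gcloud-python code; FakeEntity and entity_to_value_dict are illustrative stand-ins, and plain dicts stand in for the protobuf messages):

class FakeEntity(object):
    """Stand-in for datastore.Entity in this sketch."""
    def __init__(self, props, exclude_from_indexes=()):
        self.props = props
        self.exclude_from_indexes = exclude_from_indexes

def entity_to_value_dict(entity):
    """Recursively serialize a FakeEntity into a protobuf-like dict."""
    properties = {}
    for name, value in entity.props.items():
        if isinstance(value, FakeEntity):
            # Recurse, carrying the nested entity's own exclusions along.
            prop = {'entityValue': entity_to_value_dict(value)}
        else:
            prop = {'stringValue': value}
        if name in entity.exclude_from_indexes:
            prop['indexed'] = False
        properties[name] = prop
    return {'properties': properties}

embedded = FakeEntity({'field1': '1234', 'big_field': 'a' * 1501},
                      exclude_from_indexes=('big_field',))
record = FakeEntity({'RandomFieldName': embedded})
print(entity_to_value_dict(record))
# The nested 'big_field' property now carries 'indexed': False, which is
# exactly what the real serialization was dropping.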

@dhermes
Copy link
Contributor

dhermes commented Oct 28, 2015

Thanks @pcostell!

@igama
Author

igama commented Oct 29, 2015

@pcostell thanks for debugging this.

dhermes added the api: datastore label on Oct 29, 2015
dhermes added commits to dhermes/google-cloud-python that referenced this issue on Nov 24, Dec 17, and Dec 19, 2015, each with the same message:

To do this, added entity_to_protobuf method that could be used
recursively. Also solves googleapis#1206 since recursively serializing
nested entities to protobuf was being done incorrectly.

Fixes googleapis#1065. Fixes googleapis#1206.
@dhermes
Contributor

dhermes commented Dec 21, 2015

@igama This has been fixed:

>>> from gcloud import datastore
>>> from gcloud.environment_vars import TESTS_DATASET
>>> from gcloud.datastore import client
>>> client.DATASET = TESTS_DATASET
>>> CLIENT = datastore.Client()
>>> record_key = CLIENT.key('Record', 1234)
>>> record_entity = datastore.Entity(record_key)
>>> embedded_key = CLIENT.key('Data', 5678)
>>> embedded_entity = datastore.Entity(key=embedded_key, exclude_from_indexes=('big_field',))
>>> embedded_entity['field1'] = '1234'
>>> embedded_entity['big_field'] = 'large string bigger than 1500bytes ' * 50
>>> len(embedded_entity['big_field'])
1750
>>> record_entity['RandomFieldName'] = embedded_entity
>>> CLIENT.put(record_entity)
>>> CLIENT.put(embedded_entity)
>>> 
>>> CLIENT.delete(record_entity.key)
>>> CLIENT.delete(embedded_entity.key)

@pcostell
Contributor

Note: this should probably use . to qualify the name (it matches a proposed field mask design):

exclude_from_indexes=('RandomFieldName.big_field',)

Otherwise, there is no way to distinguish excluding a nested field from excluding a non-nested one.
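
(Illustration only, not an API that gcloud-python accepts: the dotted form below is the parent-level spelling proposed in this comment, while the plain form is what the library actually uses, declared on the nested entity as in the snippets above and below.)

# Hypothetical, proposed field-mask-style spelling, set on the parent entity
# (not something gcloud-python implements):
proposed_exclusions = ('RandomFieldName.big_field',)

# Spelling the library actually uses, declared on the nested entity itself:
actual_exclusions = ('big_field',)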

@dhermes
Contributor

dhermes commented Dec 21, 2015

@pcostell I'm not sure what you mean re: "there is no way"; see the snippet above. The way is to set the exclusions on the nested Entity, and we then propagate them into the protobuf we create. Maybe you mean it would be useful to differentiate between properties in queries?

As for the name used, see the reported snippet at the top of this issue.

@pcostell
Contributor

Ah, sorry, I definitely read record_entity instead of embedded_entity at the embedded entity's creation. Please ignore my comment :-)

@dhermes
Contributor

dhermes commented Dec 21, 2015

No worries. Thanks for taking the time to read the notification.

@igama
Author

igama commented Dec 30, 2015

@dhermes Thank you, I will try it in the next few days.
