Commit

Merge pull request #560 from dhermes/fix-544
Changing Key noun to be Blob in storage package.
dhermes committed Jan 27, 2015
2 parents 86acc4a + 4570c43 commit 3e89521
Showing 21 changed files with 828 additions and 824 deletions.
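The rename this commit applies is almost entirely mechanical, so the diff below can be summarized as a token-level rewrite of identifiers. The `RENAMES` table and `migrate` helper here are a hypothetical sketch for illustration only; no such tool ships with the commit:

```python
import re

# Hypothetical rename table inferred from the diff below: old Key-based
# identifiers on the left, their Blob successors on the right.
RENAMES = {
    "get_key": "get_blob",
    "new_key": "new_blob",
    "get_all_keys": "get_all_blobs",
    "Key": "Blob",
    "key": "blob",
}

# Match whole identifiers only, longest alternatives first, so that
# e.g. "get_key" is rewritten before the bare "key" rule can apply.
_PATTERN = re.compile(
    r"\b(%s)\b" % "|".join(sorted(RENAMES, key=len, reverse=True)))


def migrate(source):
    """Rewrite old Key-based identifiers in ``source`` to Blob names."""
    return _PATTERN.sub(lambda match: RENAMES[match.group(1)], source)


print(migrate("key = bucket.get_key('file.txt')"))
# -> blob = bucket.get_blob('file.txt')
```

A sketch like this is why the changeset is so symmetric (828 additions vs. 824 deletions): nearly every hunk is a one-for-one line substitution.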
6 changes: 3 additions & 3 deletions README.rst
@@ -95,9 +95,9 @@ to Cloud Storage using this Client Library.
import gcloud.storage
bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-id')
# Then do other things...
- key = bucket.get_key('/remote/path/to/file.txt')
- print key.get_contents_as_string()
- key.set_contents_from_string('New contents!')
+ blob = bucket.get_blob('/remote/path/to/file.txt')
+ print blob.get_contents_as_string()
+ blob.set_contents_from_string('New contents!')
bucket.upload_file('/remote/path/storage.txt', '/local/path.txt')
Contributing
67 changes: 33 additions & 34 deletions docs/_components/storage-getting-started.rst
@@ -4,7 +4,7 @@ Getting started with Cloud Storage
This tutorial focuses on using ``gcloud`` to access
Google Cloud Storage.
We'll go through the basic concepts,
- how to operate on buckets and keys,
+ how to operate on buckets and blobs,
and how to handle access control,
among other things.

@@ -114,32 +114,31 @@ so if you want to group data into "directories",
you can do that.

The fundamental container for a file in Cloud Storage
- is called an Object,
- however ``gcloud`` uses the term ``Key``
- to avoid confusion between ``object`` and ``Object``.
+ is called an Object, however ``gcloud`` uses the term ``Blob``
+ to avoid confusion with the Python built-in ``object``.

If you want to set some data,
- you just create a ``Key`` inside your bucket
- and store your data inside the key::
+ you just create a ``Blob`` inside your bucket
+ and store your data inside the blob::

- >>> key = bucket.new_key('greeting.txt')
- >>> key.set_contents_from_string('Hello world!')
+ >>> blob = bucket.new_blob('greeting.txt')
+ >>> blob.set_contents_from_string('Hello world!')

- :func:`new_key <gcloud.storage.bucket.Bucket.new_key>`
- creates a :class:`Key <gcloud.storage.key.Key>` object locally
+ :func:`new_blob <gcloud.storage.bucket.Bucket.new_blob>`
+ creates a :class:`Blob <gcloud.storage.blob.Blob>` object locally
and
- :func:`set_contents_from_string <gcloud.storage.key.Key.set_contents_from_string>`
- allows you to put a string into the key.
+ :func:`set_contents_from_string <gcloud.storage.blob.Blob.set_contents_from_string>`
+ allows you to put a string into the blob.

Now we can test if it worked::

- >>> key = bucket.get_key('greeting.txt')
- >>> print key.get_contents_as_string()
+ >>> blob = bucket.get_blob('greeting.txt')
+ >>> print blob.get_contents_as_string()
Hello world!

What if you want to save the contents to a file?

- >>> key.get_contents_to_filename('greetings.txt')
+ >>> blob.get_contents_to_filename('greetings.txt')

Then you can look at the file in a terminal::

@@ -149,32 +148,32 @@
And what about when you're not dealing with text?
That's pretty simple too::

- >>> key = bucket.new_key('kitten.jpg')
- >>> key.set_contents_from_filename('kitten.jpg')
+ >>> blob = bucket.new_blob('kitten.jpg')
+ >>> blob.set_contents_from_filename('kitten.jpg')

And to test whether it worked?

- >>> key = bucket.get_key('kitten.jpg')
- >>> key.get_contents_to_filename('kitten2.jpg')
+ >>> blob = bucket.get_blob('kitten.jpg')
+ >>> blob.get_contents_to_filename('kitten2.jpg')

and check if they are the same in a terminal::

$ diff kitten.jpg kitten2.jpg

Notice that we're using
- :func:`get_key <gcloud.storage.bucket.Bucket.get_key>`
- to retrieve a key we know exists remotely.
- If the key doesn't exist, it will return ``None``.
+ :func:`get_blob <gcloud.storage.bucket.Bucket.get_blob>`
+ to retrieve a blob we know exists remotely.
+ If the blob doesn't exist, it will return ``None``.

- .. note:: ``get_key`` is **not** retrieving the entire object's data.
+ .. note:: ``get_blob`` is **not** retrieving the entire object's data.

- If you want to "get-or-create" the key
+ If you want to "get-or-create" the blob
(that is, overwrite it if it already exists),
- you can use :func:`new_key <gcloud.storage.bucket.Bucket.new_key>`.
- However, keep in mind, the key is not created
+ you can use :func:`new_blob <gcloud.storage.bucket.Bucket.new_blob>`.
+ However, keep in mind, the blob is not created
until you store some data inside of it.

- If you want to check whether a key exists,
+ If you want to check whether a blob exists,
you can use the ``in`` operator in Python::

>>> print 'kitten.jpg' in bucket
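The ``in`` check documented in this hunk maps naturally onto Python's ``__contains__`` protocol, with membership delegating to a remote lookup. A minimal stand-in class (an assumed illustration, not the real ``gcloud.storage.bucket.Bucket``) might wire it up like this:

```python
class FakeBucket(object):
    """Toy stand-in for a storage bucket (not the real gcloud class)."""

    def __init__(self, blob_names):
        self._blobs = set(blob_names)

    def get_blob(self, blob_name):
        # Pretend remote lookup: the name itself for a hit, None for a miss.
        return blob_name if blob_name in self._blobs else None

    def __contains__(self, blob_name):
        # 'name in bucket' is true exactly when the blob exists remotely.
        return self.get_blob(blob_name) is not None


bucket = FakeBucket(['kitten.jpg'])
print('kitten.jpg' in bucket)   # True
print('missing.txt' in bucket)  # False
```

The key point mirrored here is that membership and retrieval share one code path, so ``in`` and ``get_blob`` can never disagree.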
@@ -191,17 +190,17 @@ to retrieve the bucket object::

>>> bucket = connection.get_bucket('my-bucket')

- If you want to get all the keys in the bucket,
+ If you want to get all the blobs in the bucket,
you can use
- :func:`get_all_keys <gcloud.storage.bucket.Bucket.get_all_keys>`::
+ :func:`get_all_blobs <gcloud.storage.bucket.Bucket.get_all_blobs>`::

- >>> keys = bucket.get_all_keys()
+ >>> blobs = bucket.get_all_blobs()

- However, if you're looking to iterate through the keys,
+ However, if you're looking to iterate through the blobs,
you can use the bucket itself as an iterator::

- >>> for key in bucket:
- ... print key
+ >>> for blob in bucket:
+ ... print blob
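Using the bucket itself as an iterator, as documented here, relies on the ``__iter__`` protocol. A toy sketch under the same caveat as before (a stub for illustration, not the real class):

```python
class FakeBucket(object):
    """Stub bucket (an illustration, not gcloud's Bucket) that yields its
    blobs when iterated, mirroring the documented idiom."""

    def __init__(self, blob_names):
        self._blob_names = list(blob_names)

    def get_all_blobs(self):
        # Eager form: one list holding every blob.
        return list(self._blob_names)

    def __iter__(self):
        # Lazy form: 'for blob in bucket' walks the blobs one at a time.
        return iter(self._blob_names)


bucket = FakeBucket(['greeting.txt', 'kitten.jpg'])
print([blob for blob in bucket])  # ['greeting.txt', 'kitten.jpg']
```

The design choice being documented is the usual eager-vs-lazy trade-off: ``get_all_blobs`` materializes everything at once, while iteration can fetch incrementally.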

Deleting a bucket
-----------------
@@ -234,7 +233,7 @@ Managing access control
-----------------------

Cloud storage provides fine-grained access control
- for both buckets and keys.
+ for both buckets and blobs.
`gcloud` tries to simplify access control
by working with entities and "grants".
On any ACL,
18 changes: 9 additions & 9 deletions docs/_components/storage-quickstart.rst
@@ -53,22 +53,22 @@ and instantiating the demo connection::
>>> connection = demo.get_connection()

Once you have the connection,
- you can create buckets and keys::
+ you can create buckets and blobs::

>>> connection.get_all_buckets()
[<Bucket: ...>, ...]
>>> bucket = connection.create_bucket('my-new-bucket')
>>> print bucket
<Bucket: my-new-bucket>
- >>> key = bucket.new_key('my-test-file.txt')
- >>> print key
- <Key: my-new-bucket, my-test-file.txt>
- >>> key = key.set_contents_from_string('this is test content!')
- >>> print key.get_contents_as_string()
+ >>> blob = bucket.new_blob('my-test-file.txt')
+ >>> print blob
+ <Blob: my-new-bucket, my-test-file.txt>
+ >>> blob = blob.set_contents_from_string('this is test content!')
+ >>> print blob.get_contents_as_string()
'this is test content!'
- >>> print bucket.get_all_keys()
- [<Key: my-new-bucket, my-test-file.txt>]
- >>> key.delete()
+ >>> print bucket.get_all_blobs()
+ [<Blob: my-new-bucket, my-test-file.txt>]
+ >>> blob.delete()
>>> bucket.delete()

.. note::
6 changes: 3 additions & 3 deletions docs/index.rst
@@ -10,8 +10,8 @@
datastore-transactions
datastore-batches
storage-api
+ storage-blobs
storage-buckets
- storage-keys
storage-acl


@@ -48,5 +48,5 @@ Cloud Storage
from gcloud import storage
bucket = storage.get_bucket('<your-bucket-name>', '<your-project-id>')
- key = bucket.new_key('my-test-file.txt')
- key = key.upload_contents_from_string('this is test content!')
+ blob = bucket.new_blob('my-test-file.txt')
+ blob = blob.upload_contents_from_string('this is test content!')
2 changes: 1 addition & 1 deletion docs/storage-api.rst
@@ -1,6 +1,6 @@
.. toctree::
:maxdepth: 0
- :hidden:
+ :hidden:

Storage
-------
7 changes: 7 additions & 0 deletions docs/storage-blobs.rst
@@ -0,0 +1,7 @@
+ Blobs / Objects
+ ~~~~~~~~~~~~~~~
+
+ .. automodule:: gcloud.storage.blob
+     :members:
+     :undoc-members:
+     :show-inheritance:
7 changes: 0 additions & 7 deletions docs/storage-keys.rst

This file was deleted.

8 changes: 4 additions & 4 deletions gcloud/storage/__init__.py
@@ -19,9 +19,9 @@
>>> import gcloud.storage
>>> bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-id')
>>> # Then do other things...
- >>> key = bucket.get_key('/remote/path/to/file.txt')
- >>> print key.get_contents_as_string()
- >>> key.set_contents_from_string('New contents!')
+ >>> blob = bucket.get_blob('/remote/path/to/file.txt')
+ >>> print blob.get_contents_as_string()
+ >>> blob.set_contents_from_string('New contents!')
>>> bucket.upload_file('/remote/path/storage.txt', '/local/path.txt')
The main concepts with this API are:
@@ -32,7 +32,7 @@
- :class:`gcloud.storage.bucket.Bucket` which represents a particular
bucket (akin to a mounted disk on a computer).
- - :class:`gcloud.storage.key.Key` which represents a pointer to a
+ - :class:`gcloud.storage.blob.Blob` which represents a pointer to a
particular entity in Cloud Storage (akin to a file path on a remote
machine).
"""
8 changes: 4 additions & 4 deletions gcloud/storage/_helpers.py
@@ -79,11 +79,11 @@ def batch(self):
... bucket.enable_versioning()
... bucket.disable_website()
- or for a key::
+ or for a blob::

- >>> with key.batch:
- ...     key.content_type = 'image/jpeg'
- ...     key.content_encoding = 'gzip'
+ >>> with blob.batch:
+ ...     blob.content_type = 'image/jpeg'
+ ...     blob.content_encoding = 'gzip'
Updates will be aggregated and sent as a single call to
:meth:`_patch_properties` IFF the ``with`` block exits without
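The batching idiom this docstring describes (defer property updates, then send one aggregated call when the ``with`` block exits cleanly) can be sketched with a toy context manager. Everything below, including ``FakeBlob`` and its counters, is an assumption for illustration rather than the real ``_patch_properties`` machinery:

```python
from contextlib import contextmanager


class FakeBlob(object):
    """Toy blob (not the real gcloud class) whose ``batch`` coalesces
    property updates into a single simulated backend call."""

    def __init__(self):
        self.properties = {}
        self.patch_calls = 0   # counts simulated PATCH requests
        self._pending = None   # non-None while inside a batch

    def _patch_properties(self, updates):
        # Stands in for one PATCH request against the API.
        self.patch_calls += 1
        self.properties.update(updates)

    def update(self, **updates):
        if self._pending is not None:
            self._pending.update(updates)    # defer while batching
        else:
            self._patch_properties(updates)  # immediate otherwise

    @property
    @contextmanager
    def batch(self):
        self._pending = {}
        try:
            yield self
        except Exception:
            self._pending = None  # on error, drop the updates unsent
            raise
        else:
            pending, self._pending = self._pending, None
            self._patch_properties(pending)  # one aggregated call


blob = FakeBlob()
with blob.batch:
    blob.update(content_type='image/jpeg')
    blob.update(content_encoding='gzip')
print(blob.patch_calls)  # 1
```

Note the ``else`` branch: updates are flushed only when the block exits without raising, matching the "IFF the ``with`` block exits without" wording in the docstring above.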
28 changes: 14 additions & 14 deletions gcloud/storage/acl.py
@@ -491,15 +491,15 @@ class DefaultObjectACL(BucketACL):


class ObjectACL(ACL):
-     """An ACL specifically for a key."""
+     """An ACL specifically for a Cloud Storage object / blob.

-     def __init__(self, key):
-         """
-         :type key: :class:`gcloud.storage.key.Key`
-         :param key: The key that this ACL corresponds to.
-         """
+     :type blob: :class:`gcloud.storage.blob.Blob`
+     :param blob: The blob that this ACL corresponds to.
+     """
+
+     def __init__(self, blob):
          super(ObjectACL, self).__init__()
-         self.key = key
+         self.blob = blob

def reload(self):
"""Reload the ACL data from Cloud Storage.
@@ -509,16 +509,16 @@ def reload(self):
"""
self.entities.clear()

- url_path = '%s/acl' % self.key.path
- found = self.key.connection.api_request(method='GET', path=url_path)
+ url_path = '%s/acl' % self.blob.path
+ found = self.blob.connection.api_request(method='GET', path=url_path)
self.loaded = True
for entry in found['items']:
self.add_entity(self.entity_from_dict(entry))

return self

def save(self, acl=None):
- """Save the ACL data for this key.
+ """Save the ACL data for this blob.
:type acl: :class:`gcloud.storage.acl.ACL`
:param acl: The ACL object to save. If left blank, this will
@@ -531,8 +531,8 @@ def save(self, acl=None):
save_to_backend = True

if save_to_backend:
- result = self.key.connection.api_request(
-     method='PATCH', path=self.key.path, data={'acl': list(acl)},
+ result = self.blob.connection.api_request(
+     method='PATCH', path=self.blob.path, data={'acl': list(acl)},
query_params={'projection': 'full'})
self.entities.clear()
for entry in result['acl']:
@@ -542,11 +542,11 @@ def save(self, acl=None):
return self

def clear(self):
- """Remove all ACL rules from the key.
+ """Remove all ACL rules from the blob.

Note that this won't actually remove *ALL* the rules, but it
will remove all the non-default rules. In short, you'll still
- have access to a key that you created even after you clear ACL
+ have access to a blob that you created even after you clear ACL
rules with this method.
"""
return self.save([])