Add Google Cloud Storage backend using the gcloud-python library #236

Merged
merged 28 commits into from
Apr 19, 2017

28 commits
2f6f6e7
Add Google Cloud Storage backend using the gcloud-python library
eirsyl Apr 26, 2016
85c53b5
Rename to gcloud, and import google.cloud
scjody Dec 14, 2016
b3ec8b5
Add @deconstructible to GoogleCloudStorage
scjody Dec 14, 2016
1ddc32b
Remove MIME type guessing
scjody Jan 5, 2017
8105cba
Fix name in error message
scjody Jan 5, 2017
7cb67d5
Move clean_name to utils
scjody Jan 5, 2017
159b003
Move clean_name tests to test_utils
scjody Jan 5, 2017
621391a
Use utils.clean_name()
scjody Jan 5, 2017
058972f
Remove _normalize_name
scjody Jan 5, 2017
cbe2530
Remove unused import and class variable
scjody Jan 9, 2017
189233a
Move safe_join to utils
scjody Jan 11, 2017
3829f9a
Add and use _normalize_name() like in s3boto
scjody Jan 11, 2017
e16800e
Remove unused function
scjody Jan 11, 2017
0d61513
Only create a Blob in write mode
scjody Jan 11, 2017
dfe90b5
Add tests of Google Cloud Storage
scjody Jan 11, 2017
9ad54cb
Add documentation for Google Cloud Storage
scjody Jan 11, 2017
b37a9cb
Import Storage directly
scjody Jan 11, 2017
74f3841
Use byte string for test read
scjody Jan 11, 2017
e8fc9fb
Add Google Cloud Storage authors
scjody Feb 6, 2017
737ac57
Address review comments
scjody Apr 11, 2017
44f39cd
Fix modified_time; add get_modified_time
scjody Apr 11, 2017
bb9307f
Test and fix unicode handling
scjody Apr 11, 2017
acbe31d
Address further review comments
scjody Apr 12, 2017
02e5829
Remove *args and **kwargs
scjody Apr 12, 2017
1d2e206
Add deprecation notice to 'gs' backend
scjody Apr 12, 2017
bbb203e
Print deprecation notice as warning
scjody Apr 16, 2017
598753b
Simplify ACL options and improve ACL documentation
scjody Apr 18, 2017
7ee116b
Address final PR comments
scjody Apr 19, 2017
2 changes: 2 additions & 0 deletions AUTHORS
@@ -27,6 +27,8 @@ By order of apparition, thanks:
* Michael Barrientos (S3 with Boto3)
* piglei (patches)
* Matt Braymer-Hayes (S3 with Boto3)
* Eirik Martiniussen Sylliaas (Google Cloud Storage native support)
* Jody McIntyre (Google Cloud Storage native support)

Extra thanks to Marty for adding this in Django,
you can buy his very interesting book (Pro Django).
194 changes: 194 additions & 0 deletions docs/backends/gcloud.rst
@@ -0,0 +1,194 @@
Google Cloud Storage
====================

Usage
*****

This backend provides support for Google Cloud Storage using Google's
``google-cloud`` Python library.

It's possible to access Google Cloud Storage in S3 compatibility mode
using other libraries in django-storages, but this backend is the only
one offering native support.

By default this backend will use the credentials associated with the
current instance for authentication. To override this, see the
settings below.


Settings
--------

To use gcloud set::

DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
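
For reference, the individual settings documented below can be sketched
together as a hypothetical ``settings.py`` fragment (every value here is a
placeholder, not a recommendation):

```python
# Sketch of a Django settings fragment; every value is a placeholder.
DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'

GS_BUCKET_NAME = 'my-bucket'        # required: your bucket name
GS_PROJECT_ID = 'my-project'        # optional: falls back to the environment
GS_AUTO_CREATE_BUCKET = False       # optional: create the bucket if missing
GS_FILE_OVERWRITE = True            # optional: overwrite same-named files
GS_MAX_MEMORY_SIZE = 0              # optional: 0 = never roll over to disk

# GS_CREDENTIALS can hold explicit OAuth 2 credentials, e.g. (assuming the
# google-auth package is available) built from a service account key file:
# from google.oauth2 import service_account
# GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
#     'path/to/key.json')
```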

``GS_BUCKET_NAME``

Your Google Storage bucket name, as a string.

``GS_PROJECT_ID`` (optional)

Your Google Cloud project ID. If unset, falls back to the default
inferred from the environment.

``GS_CREDENTIALS`` (optional)

The OAuth 2 credentials to use for the connection. If unset, falls
back to the default inferred from the environment.

``GS_AUTO_CREATE_BUCKET`` (optional, default is ``False``)

If True, attempt to create the bucket if it does not exist.

``GS_AUTO_CREATE_ACL`` (optional, default is ``projectPrivate``)

ACL used when creating a new bucket, from the
`list of predefined ACLs <https://cloud.google.com/storage/docs/access-control/lists#predefined-acl>`_.
(A "JSON API" ACL is preferred but an "XML API/gsutil" ACL will be
translated.)

Note that the ACL you select must still give the service account
running the gcloud backend OWNER permission on the bucket. If
you're using the default service account, this means you're restricted
to the ``projectPrivate`` ACL.

``GS_FILE_CHARSET`` (optional)

Allows overriding the character set used in filenames.

``GS_FILE_OVERWRITE`` (optional, default is ``True``)

By default files with the same name will overwrite each other. Set this to ``False`` to have extra characters appended.
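
As an illustration of the non-overwrite behaviour (this is not the
backend's actual code): when overwriting is disabled, Django's storage
layer picks an alternative name instead. Older Django versions appended
an underscore before the extension, which matches the
``django_test_.txt`` names in the examples further down:

```python
import os

# Illustration only: a simplified version of the old "append an
# underscore before the extension" alternative-name rule.
def alternative_name(name):
    root, ext = os.path.splitext(name)
    return root + '_' + ext

print(alternative_name('django_test.txt'))  # django_test_.txt
```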

``GS_MAX_MEMORY_SIZE`` (optional)

The maximum amount of memory, in bytes, a returned file can take up before
being rolled over into a temporary file on disk. The default is 0: do not
roll over.
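
This roll-over behaviour is the same idea as the standard library's
``SpooledTemporaryFile`` (an analogy, not the backend's code): data stays
in memory until it exceeds the limit, then is transparently moved to a
real temporary file on disk.

```python
from tempfile import SpooledTemporaryFile

# Buffer with a 10-byte in-memory limit.
buf = SpooledTemporaryFile(max_size=10)
buf.write(b'0123456789ABC')  # 13 bytes > max_size, so it rolls over
print(buf._rolled)           # note: _rolled is a private attribute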

Fields
------

Once you're done, ``default_storage`` will be Google Cloud Storage::

>>> from django.core.files.storage import default_storage
>>> print(default_storage.__class__)
<class 'storages.backends.gcloud.GoogleCloudStorage'>

This way, if you define a new ``FileField``, it will use Google Cloud Storage::

>>> from django.db import models
>>> class Resume(models.Model):
...     pdf = models.FileField(upload_to='pdfs')
...     photos = models.ImageField(upload_to='photos')
...
>>> resume = Resume()
>>> print(resume.pdf.storage)
<storages.backends.gcloud.GoogleCloudStorage object at ...>

Storage
-------

Standard file access options are available, and work as expected::

>>> default_storage.exists('storage_test')
False
>>> file = default_storage.open('storage_test', 'w')
>>> file.write('storage contents')
>>> file.close()

>>> default_storage.exists('storage_test')
True
>>> file = default_storage.open('storage_test', 'r')
>>> file.read()
'storage contents'
>>> file.close()

>>> default_storage.delete('storage_test')
>>> default_storage.exists('storage_test')
False

Model
-----

An object without a file has limited functionality (``MyStorage`` here is an
example model with ``normal``, ``default``, and ``random`` file fields)::

>>> obj1 = MyStorage()
>>> obj1.normal
<FieldFile: None>
>>> obj1.normal.size
Traceback (most recent call last):
...
ValueError: The 'normal' attribute has no file associated with it.

Saving a file enables full functionality::

>>> obj1.normal.save('django_test.txt', ContentFile('content'))
>>> obj1.normal
<FieldFile: tests/django_test.txt>
>>> obj1.normal.size
7
>>> obj1.normal.read()
'content'

Files can be read in a little at a time, if necessary::

>>> obj1.normal.open()
>>> obj1.normal.read(3)
'con'
>>> obj1.normal.read()
'tent'
>>> '-'.join(obj1.normal.chunks(chunk_size=2))
'co-nt-en-t'
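
The ``chunks(chunk_size=2)`` behaviour above can be sketched in plain
Python (a simplified stand-in for Django's actual ``File.chunks``, which
reads from the underlying file object rather than a string):

```python
# Simplified stand-in for File.chunks(): yield fixed-size slices.
def chunks(data, chunk_size):
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

print('-'.join(chunks('content', 2)))  # co-nt-en-t
```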

Save another file with the same name::

>>> obj2 = MyStorage()
>>> obj2.normal.save('django_test.txt', ContentFile('more content'))
>>> obj2.normal
<FieldFile: tests/django_test_.txt>
>>> obj2.normal.size
12

Push the objects into the cache to make sure they pickle properly::

>>> cache.set('obj1', obj1)
>>> cache.set('obj2', obj2)
>>> cache.get('obj2').normal
<FieldFile: tests/django_test_.txt>

Deleting an object deletes the file it uses, if there are no other objects still using that file::

>>> obj2.delete()
>>> obj2.normal.save('django_test.txt', ContentFile('more content'))
>>> obj2.normal
<FieldFile: tests/django_test_.txt>

Default values allow an object to access a single file::

>>> obj3 = MyStorage.objects.create()
>>> obj3.default
<FieldFile: tests/default.txt>
>>> obj3.default.read()
'default content'

But it shouldn't be deleted, even if there are no more objects using it::

>>> obj3.delete()
>>> obj3 = MyStorage()
>>> obj3.default.read()
'default content'

Verify the fix for #5655, making sure the directory is only determined once::

>>> obj4 = MyStorage()
>>> obj4.random.save('random_file', ContentFile('random content'))
>>> obj4.random
<FieldFile: .../random_file>

Clean up the temporary files::

>>> obj1.normal.delete()
>>> obj2.normal.delete()
>>> obj3.default.delete()
>>> obj4.random.delete()