Commits

Rich Leland committed 341c2d4

copied documentation over from wiki

Comments (0)

Files changed (12)

docs/backends/amazon-S3.rst

+Amazon S3
+=========
+
+Usage
+*****
+
+There are two backend APIs for interacting with S3. The first is the s3 backend (in storages/backends/s3.py), which is simple and based on the Amazon S3 Python library. The second is the s3boto backend (in storages/backends/s3boto.py), which is well maintained by the community and generally more robust (it supports connection pooling, among other things). s3boto requires the python-boto library.
+
+Settings
+--------
+
+``DEFAULT_FILE_STORAGE``
+
+This setting is the import path to the S3 storage class: the first part is the module path and the last part is the class name. For example, if you have example.com on your PYTHONPATH and keep your storage file at example.com/libs/storages/S3Storage.py, the resulting setting will be::
+
+    DEFAULT_FILE_STORAGE = 'libs.storages.S3Storage.S3Storage'
+
+or if you installed using setup.py::
+
+    DEFAULT_FILE_STORAGE = 'storages.backends.s3.S3Storage'
+
+If you keep the same filename as in the repository, the setting will always end with S3Storage.S3Storage.
+
+To use s3boto, this setting will be::
+
+    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
+
+``AWS_ACCESS_KEY_ID``
+
+Your Amazon Web Services access key, as a string.
+
+``AWS_SECRET_ACCESS_KEY``
+
+Your Amazon Web Services secret access key, as a string.
+
+``AWS_STORAGE_BUCKET_NAME``
+
+Your Amazon Web Services storage bucket name, as a string.
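+
+For example, with placeholder values that you would replace with your own credentials and bucket::
+
+    AWS_ACCESS_KEY_ID = 'your-access-key-id'
+    AWS_SECRET_ACCESS_KEY = 'your-secret-access-key'
+    AWS_STORAGE_BUCKET_NAME = 'your-bucket-name'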
+
+``AWS_CALLING_FORMAT`` (the subdomain calling format is hardcoded in s3boto)
+
+The way you'd like to call the Amazon Web Services API, for instance if you prefer subdomains::
+
+    from S3 import CallingFormat
+    AWS_CALLING_FORMAT = CallingFormat.SUBDOMAIN
+
+``AWS_HEADERS`` (optional)
+
+If you'd like to set headers sent with each file of the storage::
+
+    # see http://developer.yahoo.com/performance/rules.html#expires
+    AWS_HEADERS = {
+        'Expires': 'Thu, 15 Apr 2010 20:00:00 GMT',
+        'Cache-Control': 'max-age=86400',
+    }
+
+To allow ``django-admin.py collectstatic`` to automatically put your static files in your bucket, set the following in your settings.py::
+
+    STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
+
+Fields
+------
+
+Once you're done, default_storage will be the S3 storage::
+
+    >>> from django.core.files.storage import default_storage
+    >>> print default_storage.__class__
+    <class 'S3Storage.S3Storage'>
+
+The above doesn't seem to hold for Django 1.3+; instead, check the connection::
+
+    >>> from django.core.files.storage import default_storage
+    >>> print default_storage.connection
+    S3Connection:s3.amazonaws.com
+
+This way, if you define a new FileField, it will use the S3 storage::
+
+    >>> from django.db import models
+    >>> class Resume(models.Model):
+    ...     pdf = models.FileField(upload_to='pdfs')
+    ...     photos = models.ImageField(upload_to='photos')
+    ...
+    >>> resume = Resume()
+    >>> print resume.pdf.storage
+    <S3Storage.S3Storage object at ...>
+
+Tests
+*****
+
+Initialization::
+
+    >>> from django.core.files.storage import default_storage
+    >>> from django.core.files.base import ContentFile
+    >>> from django.core.cache import cache
+    >>> from models import MyStorage
+
+Storage
+-------
+
+Standard file access options are available, and work as expected::
+
+    >>> default_storage.exists('storage_test')
+    False
+    >>> file = default_storage.open('storage_test', 'w')
+    >>> file.write('storage contents')
+    >>> file.close()
+
+    >>> default_storage.exists('storage_test')
+    True
+    >>> file = default_storage.open('storage_test', 'r')
+    >>> file.read()
+    'storage contents'
+    >>> file.close()
+
+    >>> default_storage.delete('storage_test')
+    >>> default_storage.exists('storage_test')
+    False
+
+Model
+-----
+
+An object without a file has limited functionality::
+
+    >>> obj1 = MyStorage()
+    >>> obj1.normal
+    <FieldFile: None>
+    >>> obj1.normal.size
+    Traceback (most recent call last):
+    ...
+    ValueError: The 'normal' attribute has no file associated with it.
+
+Saving a file enables full functionality::
+
+    >>> obj1.normal.save('django_test.txt', ContentFile('content'))
+    >>> obj1.normal
+    <FieldFile: tests/django_test.txt>
+    >>> obj1.normal.size
+    7
+    >>> obj1.normal.read()
+    'content'
+
+Files can be read in a little at a time, if necessary::
+
+    >>> obj1.normal.open()
+    >>> obj1.normal.read(3)
+    'con'
+    >>> obj1.normal.read()
+    'tent'
+    >>> '-'.join(obj1.normal.chunks(chunk_size=2))
+    'co-nt-en-t'
+
+Save another file with the same name::
+
+    >>> obj2 = MyStorage()
+    >>> obj2.normal.save('django_test.txt', ContentFile('more content'))
+    >>> obj2.normal
+    <FieldFile: tests/django_test_.txt>
+    >>> obj2.normal.size
+    12
+
+Push the objects into the cache to make sure they pickle properly::
+
+    >>> cache.set('obj1', obj1)
+    >>> cache.set('obj2', obj2)
+    >>> cache.get('obj2').normal
+    <FieldFile: tests/django_test_.txt>
+
+Deleting an object deletes the file it uses, if there are no other objects still using that file::
+
+    >>> obj2.delete()
+    >>> obj2.normal.save('django_test.txt', ContentFile('more content'))
+    >>> obj2.normal
+    <FieldFile: tests/django_test_.txt>
+
+Default values allow an object to access a single file::
+
+    >>> obj3 = MyStorage.objects.create()
+    >>> obj3.default
+    <FieldFile: tests/default.txt>
+    >>> obj3.default.read()
+    'default content'
+
+But it shouldn't be deleted, even if there are no more objects using it::
+
+    >>> obj3.delete()
+    >>> obj3 = MyStorage()
+    >>> obj3.default.read()
+    'default content'
+
+Verify the fix for #5655, making sure the directory is only determined once::
+
+    >>> obj4 = MyStorage()
+    >>> obj4.random.save('random_file', ContentFile('random content'))
+    >>> obj4.random
+    <FieldFile: .../random_file>
+
+Clean up the temporary files::
+
+    >>> obj1.normal.delete()
+    >>> obj2.normal.delete()
+    >>> obj3.default.delete()
+    >>> obj4.random.delete()

docs/backends/couchdb.rst

+CouchDB
+=======
+
+A custom storage system for Django with CouchDB backend.
+

docs/backends/database.rst

+Database
+========
+
+The DatabaseStorage class can be used with either FileField or ImageField. It maps filenames to database blobs, so you have to use it with a separate, manually created table. The table should have a primary-key column for filenames (it's best to use the same type FileField uses: nvarchar(100)), a blob column (an image type, for example) and a size column (bigint). You can't just add a blob column to the table where the FileField is defined, since there is no way to find the required row in the save() method. The size column is also required for better performance (see the size() method).
+
+You can therefore use it with several FileFields, even ones with different "upload_to" values. In effect it implements a kind of root filesystem, where you define directories via "upload_to" on a FileField and store any files in those directories.
+
+It uses either settings.DB_FILES_URL or the constructor parameter 'base_url' (see __init__()) to build URLs to files. The base URL should be mapped to a view that provides access to the files. To store files in the same table where the FileField is defined, you would have to write your own field and pass an extra argument (e.g. the pk) to save().
+
+Raw SQL is used for all operations. In the constructor, or in DB_FILES in settings.py, you should specify a dictionary with 'db_table', 'fname_column', 'blob_column', 'size_column' and 'base_url'. For example, settings.py could simply contain::
+
+    DB_FILES = {
+        'db_table': 'FILES',
+        'fname_column':  'FILE_NAME',
+        'blob_column': 'BLOB',
+        'size_column': 'SIZE',
+        'base_url': 'http://localhost/dbfiles/'
+    }
+
+And use it with an ImageField as follows::
+
+    player_photo = models.ImageField(upload_to="player_photos", storage=DatabaseStorage())
+
+The DatabaseStorage class uses your settings.py file to make a custom connection to your database.
+
+The reason for using a custom connection is described in http://code.djangoproject.com/ticket/5135. The connection string looks like::
+
+    cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=localhost;DATABASE=testdb;UID=me;PWD=pass')
+
+It's based on the pyodbc module, so it can be used with any database supported by pyodbc. I've tested it with MS SQL Server Express 2005.
+
+Note: It returns a special path, which should be mapped to a view that returns the requested file::
+
+    def image_view(request, filename):
+        import os
+        from django.http import HttpResponse
+        from django.conf import settings
+        from django.utils._os import safe_join
+        from filestorage import DatabaseStorage
+        from django.core.exceptions import ObjectDoesNotExist
+
+        storage = DatabaseStorage()
+
+        try:
+            image_file = storage.open(filename, 'rb')
+            file_content = image_file.read()
+        except Exception:
+            # Fall back to a placeholder image if the file can't be read.
+            filename = 'no_image.gif'
+            path = safe_join(os.path.abspath(settings.MEDIA_ROOT), filename)
+            if not os.path.exists(path):
+                raise ObjectDoesNotExist
+            no_image = open(path, 'rb')
+            file_content = no_image.read()
+
+        response = HttpResponse(file_content, mimetype="image/jpeg")
+        response['Content-Disposition'] = 'inline; filename=%s' % filename
+        return response
+
+Note: If the filename already exists, the blob will be overwritten. To change this, remove get_available_name(self, name) so that Storage.get_available_name(self, name) is used to generate a new filename.
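+
+Instead of editing the backend, a subclass can defer to the base class. A minimal sketch, assuming DatabaseStorage is importable from a filestorage module as in the view above (the subclass name is hypothetical)::
+
+    from django.core.files.storage import Storage
+    from filestorage import DatabaseStorage
+
+    class UniqueNameDatabaseStorage(DatabaseStorage):
+        def get_available_name(self, name):
+            # Use Django's default behaviour, which keeps generating
+            # alternative names until one does not exist yet.
+            return Storage.get_available_name(self, name)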

docs/backends/ftp.rst

+FTP
+===
+
+.. warning:: This FTP storage is not prepared to work with large files, because it uses memory for temporary data storage. It also does not close the FTP connection automatically (but it opens it lazily and tries to reestablish it when disconnected).
+
+This implementation was written primarily to upload files in the admin to a remote FTP location and read them back on the site over HTTP. It has mostly been tested in that configuration, so reads/writes through the FTPStorageFile class may break.
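+
+A rough configuration sketch; the location setting and its format below are assumptions, so check the top of storages/backends/ftp.py for the settings the backend actually reads::
+
+    DEFAULT_FILE_STORAGE = 'storages.backends.ftp.FTPStorage'
+    # Assumed format: one URL carrying credentials, host, port and base path.
+    FTP_STORAGE_LOCATION = 'ftp://user:password@ftp.example.com:21/media/'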
+

docs/backends/image.rst

+Image
+=====
+
+A custom FileSystemStorage made for normalizing extensions. It lets PIL inspect the file to determine the format and appends an always lower-case extension based on the result.
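+
+A usage sketch, assuming the backend exposes an ``ImageStorage`` class (the class name is an assumption; check storages/backends/image.py for the actual one)::
+
+    from django.db import models
+    from storages.backends.image import ImageStorage
+
+    class Photo(models.Model):
+        # The storage lets PIL detect the format and stores the file
+        # with a lower-case extension based on it.
+        image = models.ImageField(storage=ImageStorage(), upload_to='photos')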
+

docs/backends/mogilefs.rst

+MogileFS
+========
+
+This storage allows you to use MogileFS; it comes from this blog post.
+
+The MogileFS storage backend is fairly simple: it uses URLs (or, rather, parts of URLs) as keys into the mogile database. When the user requests a file stored by mogile (say, an avatar), the URL gets passed to a view which, using a client to the mogile tracker, retrieves the "correct" path (the path that points to the actual file data). The view then either returns the path(s) to perlbal to reproxy or, if you're not using perlbal to reproxy (which you should), serves the file data directly from Django.
+
+* ``MOGILEFS_DOMAIN``: The mogile domain that files should be read from/written to, e.g. "production"
+* ``MOGILEFS_TRACKERS``: A list of trackers to connect to, e.g. ["foo.sample.com:7001", "bar.sample.com:7001"]
+* ``MOGILEFS_MEDIA_URL`` (optional): The prefix for URLs that point to mogile files. This is used in a similar way to ``MEDIA_URL``, e.g. "/mogilefs/"
+* ``SERVE_WITH_PERLBAL``: Boolean that, when True, will pass the paths back in the response in the ``X-REPROXY-URL`` header. If False, Django will serve all mogile media files itself (bad idea for production, but useful if you're testing on a setup that doesn't have perlbal running)
+* ``DEFAULT_FILE_STORAGE``: This is the class that's used for the backend. You'll want to set this to ``project.app.storages.MogileFSStorage`` (or wherever you've installed the backend)
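+
+Putting those settings together, a minimal settings.py sketch (the domain, hostnames and storage path are placeholders) might look like::
+
+    MOGILEFS_DOMAIN = 'production'
+    MOGILEFS_TRACKERS = ['foo.sample.com:7001', 'bar.sample.com:7001']
+    MOGILEFS_MEDIA_URL = '/mogilefs/'
+    SERVE_WITH_PERLBAL = True
+    DEFAULT_FILE_STORAGE = 'project.app.storages.MogileFSStorage'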
+
+Getting files into mogile
+*************************
+
+The great thing about file backends is that we just need to specify the backend in the model file and everything is taken care of for us - all the default save() methods work correctly.
+
+For Fluther, we have two main media types we use mogile for: avatars and thumbnails. Mogile defines "classes" that dictate how each type of file is replicated - so you can make sure you have 3 copies of the original avatar but only 1 of the thumbnail.
+
+In order for classes to behave nicely with the backend framework, we've had to do a little tomfoolery. (This is something that may change in future versions of the filestorage framework).
+
+Here's what the models.py file looks like for the avatars::
+
+    from django.core.filestorage import storage
+
+    # TODO: Find a better way to deal with classes. Maybe a generator?
+    class AvatarStorage(storage.__class__):
+        mogile_class = 'avatar'
+
+    class ThumbnailStorage(storage.__class__):
+        mogile_class = 'thumb'
+
+    class Avatar(models.Model):
+        user = models.ForeignKey(User, null=True, blank=True)
+        image = models.ImageField(storage=AvatarStorage())
+        thumb = models.ImageField(storage=ThumbnailStorage())
+
+Each of the custom storage classes defines a class attribute which gets passed to the mogile backend behind the scenes. If you don't want to worry about mogile classes, you don't need to define a custom storage engine or specify it in the field - the default should work just fine.
+
+Serving files from mogile
+*************************
+
+Now, all we need to do is plug in the view that serves up mogile data.
+
+Here's what we use::
+
+    urlpatterns += patterns('',
+        (r'^%s(?P<key>.*)' % settings.MOGILEFS_MEDIA_URL[1:],
+            'MogileFSStorage.serve_mogilefs_file')
+    )
+
+Any URL beginning with the value of ``MOGILEFS_MEDIA_URL`` will get passed to our view. Since ``MOGILEFS_MEDIA_URL`` requires a leading slash (like ``MEDIA_URL``), we strip that off and pass the rest of the URL over to the view.
+
+That's it! Happy mogiling!

docs/backends/mongodb.rst

+MongoDB
+=======
+
+A GridFS backend that works with django_mongodb_engine and the upcoming GSoC 2010 MongoDB backend being developed by Alex Gaynor.
+
+Usage (in settings.py)::
+
+    DATABASES = {
+        'default': {
+            'ENGINE': 'django_mongodb_engine.mongodb',
+            'NAME': 'test',
+            'USER': '',
+            'PASSWORD': '',
+            'HOST': 'localhost',
+            'PORT': 27017,
+            'SUPPORTS_TRANSACTIONS': False,
+        }
+    }
+
+    DEFAULT_FILE_STORAGE = 'storages.backends.mongodb.GridFSStorage'
+    GRIDFS_DATABASE = 'default'

docs/backends/overwrite.rst

+Overwrite
+=========
+
+This is a simple overwriting implementation of FileSystemStorage. It removes the addition of an '_' to the filename if the file already exists in the storage system. I needed a model in the admin area to act exactly like a file system (overwriting the file if it already exists).
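+
+A usage sketch, assuming the class is ``OverwriteStorage`` in storages/backends/overwrite.py (check the source for the exact module and class names)::
+
+    from django.db import models
+    from storages.backends.overwrite import OverwriteStorage
+
+    class Document(models.Model):
+        # Re-uploading a file with the same name replaces the old one
+        # instead of getting an '_' appended to its name.
+        attachment = models.FileField(storage=OverwriteStorage(), upload_to='documents')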
+

docs/backends/rackspace-cloudfiles.rst

+Rackspace CloudFiles
+====================
+
+Requirements
+************
+
+Mosso's Cloud Files Python module: http://www.mosso.com/cloudfiles.jsp
+
+Usage
+*****
+
+Add the following to your project's settings.py file::
+
+    CLOUDFILES_USERNAME = 'YourUsername'
+    CLOUDFILES_API_KEY = 'YourAPIKey'
+    CLOUDFILES_CONTAINER = 'ContainerName'
+    DEFAULT_FILE_STORAGE = 'backends.mosso.CloudFilesStorage'
+
+Optionally, you can use the following custom upload_to in your models.py file. This will upload the file to Cloud Files using the file name only (e.g. 'myfile.jpg'). If you supply a string instead (e.g. upload_to='some/path'), your file name will include the path (e.g. 'some/path/myfile.jpg')::
+
+    from backends.mosso import cloudfiles_upload_to
+
+    class SomeKlass(models.Model):
+        some_field = models.ImageField(upload_to=cloudfiles_upload_to)
+
+Alternatively, if you don't want to set the DEFAULT_FILE_STORAGE, you can do the following in your models::
+
+    from backends.mosso import CloudFilesStorage, cloudfiles_upload_to
+
+    cloudfiles_storage = CloudFilesStorage()
+
+    class SomeKlass(models.Model):
+        some_field = models.ImageField(storage=cloudfiles_storage,
+                                       upload_to=cloudfiles_upload_to)

docs/backends/sftp.rst

+SFTP
+====
+
+Take a look at the top of the backend's file for the documentation.
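+
+As a rough starting point, a settings sketch; the module, class and setting names are assumptions drawn from the backend's source, so verify them there::
+
+    DEFAULT_FILE_STORAGE = 'storages.backends.sftpstorage.SFTPStorage'
+    SFTP_STORAGE_HOST = 'sftp.example.com'
+    # Remote directory that uploaded files are written under (assumed setting).
+    SFTP_STORAGE_ROOT = '/var/www/media/'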
+

docs/backends/symlinkcopy.rst

+Symlink or copy
+===============
+
+Stores symlinks to files instead of actual files whenever possible.
+
+When a file that's being saved is already stored in the symlink_within directory, the file is symlinked; otherwise, it is copied.
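+
+A usage sketch, assuming the backend exposes a ``SymlinkOrCopyStorage`` class that takes the directory to symlink within (the names are assumptions; check the backend's source)::
+
+    from storages.backends.symlinkorcopy import SymlinkOrCopyStorage
+
+    # Files already located under /var/media/shared are symlinked;
+    # anything else is copied into place.
+    shared_storage = SymlinkOrCopyStorage(symlink_within='/var/media/shared')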

docs/index.rst

-.. django-storages documentation master file, created by
-   sphinx-quickstart on Sun Aug 28 13:44:45 2011.
-   You can adapt this file completely to your liking, but it should at least
-   contain the root `toctree` directive.
+django-storages
+===============
 
-Welcome to django-storages's documentation!
-===========================================
-
-Contents:
+django-storages is a collection of custom storage backends for Django.
 
 .. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
+   :glob:
+
+   backends/*
+
+Installation
+************
+
+Use pip to install from PyPI::
+
+    pip install django-storages
+
+Add ``storages`` to your settings.py file::
+
+    INSTALLED_APPS = (
+        ...
+        'storages',
+        ...
+    )
+
+Each storage backend has its own unique settings you will need to add to your settings.py file. Read the documentation for your storage engine(s) of choice to determine what you need to add.
+
+Contributing
+************
+
+To contribute to django-storages, `create a fork`_ on bitbucket. Clone your fork, make some changes, and submit a pull request.
+
+.. _create a fork: https://bitbucket.org/david/django-storages/fork
+
+Issues
+******
+
+Use the bitbucket `issue tracker`_ for django-storages to submit bugs, issues, and feature requests.
+
+.. _issue tracker: https://bitbucket.org/david/django-storages/issues
 
 Indices and tables
 ==================