Chris Streeter committed a7e0659

Prevent listing huge buckets on S3.

Previously, when `AWS_PRELOAD_METADATA` is True,
`S3BotoStorage.entries` would list every file in the bucket. However,
when a `location` is specified on the storage, that backend only uses
a subtree of the bucket, so listing everything is wasted work. Instead,
pass our `location` as the `prefix` to the call to `bucket.list()`.


Files changed (1)


         if self.preload_metadata and not self._entries:
             self._entries = dict((self._decode_name(entry.key), entry)
-                                for entry in self.bucket.list())
+                                for entry in self.bucket.list(prefix=self.location))
         return self._entries
     def _get_access_keys(self):
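The effect of the change can be sketched without an S3 connection: boto's `Bucket.list(prefix=...)` yields only the keys under the given prefix, so preloading metadata touches just the storage's `location` subtree instead of the whole bucket. A minimal stand-in (the `FakeBucket` class is hypothetical, for illustration only):

```python
class FakeBucket:
    """Hypothetical stand-in for boto's Bucket, for illustration only."""

    def __init__(self, keys):
        self.keys = keys

    def list(self, prefix=''):
        # Like boto's Bucket.list(prefix=...), yield only keys that
        # start with the prefix, so the rest of the bucket is skipped.
        return (k for k in self.keys if k.startswith(prefix))


bucket = FakeBucket(['media/a.jpg', 'media/b.jpg', 'static/c.css'])

# Mirrors the fixed `entries` property: only keys under the storage's
# `location` ('media/' here) are preloaded into the cache.
entries = dict((key, None) for key in bucket.list(prefix='media/'))
```

With `prefix='media/'`, `entries` holds only `media/a.jpg` and `media/b.jpg`; the `static/` keys are never listed.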