NicolasPieuchot committed 68078be

Speed up exists function on buckets with large number of files (>10⁵)

  • Parent commits c751701

Files changed (2)

File docs/backends/amazon-S3.rst

     STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
+``AWS_S3_NUMEROUS_FILES`` (optional)
+If your bucket contains a very large number of files, simple operations such as ``exists`` can become very slow, because they first scan every entry in the bucket; with hundreds of thousands of files this can take several minutes. Setting this parameter to ``True`` skips that scan and checks the key directly.
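For example, enabling the flag in a Django settings module might look like the following sketch (only the setting name itself comes from this change; the surrounding settings are illustrative):

```python
# settings.py -- sketch
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

# Skip the preloaded-entries scan in exists() and check each key directly
AWS_S3_NUMEROUS_FILES = True
```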

File storages/backends/

     custom_domain = setting('AWS_S3_CUSTOM_DOMAIN')
     calling_format = setting('AWS_S3_CALLING_FORMAT', SubdomainCallingFormat())
     secure_urls = setting('AWS_S3_SECURE_URLS', True)
+    numerous_files_bucket = setting('AWS_S3_NUMEROUS_FILES', False)
     file_name_charset = setting('AWS_S3_FILE_NAME_CHARSET', 'utf-8')
     gzip = setting('AWS_IS_GZIPPED', False)
     preload_metadata = setting('AWS_PRELOAD_METADATA', False)
     def exists(self, name):
         name = self._normalize_name(self._clean_name(name))
-        if self.entries:
-            return name in self.entries
+        if not self.numerous_files_bucket:
+            if self.entries:
+                return name in self.entries
         k = self.bucket.new_key(self._encode_name(name))
         return k.exists()
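The effect of the new flag can be sketched independently of boto. Here ``FakeBucket`` and ``FakeKey`` are invented stand-ins for boto's bucket and key objects; the ``exists`` logic mirrors the patched method above:

```python
class FakeKey:
    """Stand-in for boto's Key: exists() models a single HEAD request."""
    def __init__(self, bucket, name):
        self.bucket, self.name = bucket, name

    def exists(self):
        return self.name in self.bucket.keys


class FakeBucket:
    """Stand-in for boto's Bucket."""
    def __init__(self, keys):
        self.keys = set(keys)

    def new_key(self, name):
        return FakeKey(self, name)


class Storage:
    def __init__(self, bucket, numerous_files_bucket=False,
                 preload_metadata=False):
        self.bucket = bucket
        self.numerous_files_bucket = numerous_files_bucket
        # With preload_metadata, all bucket entries are cached up front;
        # on a bucket with >10^5 files this dict becomes the bottleneck.
        self.entries = (
            {k: None for k in bucket.keys} if preload_metadata else {}
        )

    def exists(self, name):
        # AWS_S3_NUMEROUS_FILES=True bypasses the cached-entries lookup
        # and goes straight to a per-key existence check.
        if not self.numerous_files_bucket:
            if self.entries:
                return name in self.entries
        return self.bucket.new_key(name).exists()


bucket = FakeBucket(['a.txt', 'b.txt'])
fast = Storage(bucket, numerous_files_bucket=True)
print(fast.exists('a.txt'))  # True
print(fast.exists('c.txt'))  # False
```

With the flag off and metadata preloaded, the same calls answer from the cached ``entries`` dict instead, which is what becomes slow on very large buckets.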