It would be much nicer if saving a file could skip early when identical content already exists on S3. Overwriting every file on collectstatic is particularly painful, especially when a project has thousands of static files.
There are several ways this could be accomplished. My initial thought is to store a content hash in the key's metadata and check it before the file is pushed to S3.
I am not aware of any benefit to re-uploading identical content, but if there is one, perhaps an in-place copy could be used instead.
S3BotoStorage._save_content seems like a good place for this to happen.
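The hash check itself could be fairly small. A minimal sketch of the idea, independent of the storage class (the `stored_hash` would come from the key's metadata as proposed above, or from the ETag, which equals the MD5 hex digest for non-multipart uploads):

```python
import hashlib


def content_hash(content):
    # MD5 hex digest of a file-like object, read in chunks so large
    # static files are not loaded into memory all at once.
    md5 = hashlib.md5()
    for chunk in iter(lambda: content.read(4096), b""):
        md5.update(chunk)
    content.seek(0)  # rewind so the upload still works if we proceed
    return md5.hexdigest()


def should_upload(content, stored_hash):
    # Skip the push to S3 when the remote copy already matches.
    # stored_hash is None when the key does not exist yet.
    return stored_hash is None or content_hash(content) != stored_hash
```

`_save_content` could then return early when `should_upload` is false, and set the hash on the key's metadata when it does upload.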