Issues

Issue #20 invalid

Timeout issues when using EC2 and S3 Storage?

Anonymous created an issue

Hi David,

I'm using django-storages' S3 class (not boto) to store files from my Django app. On a local runserver this seems fine with the kind of file sizes the users will be uploading, but I've recently staged the code on EC2 and have hit some timeout issues.

Basically, when I upload a small file (e.g. < 100 KB) to my EC2 Django app, everything works fine, but when I try to upload something larger (a 2.4 MB file), after a while I get an error from httplib.py: (110, 'Connection timed out').

Have you come across this at all?

I think what might be happening is this: when I upload a file to my EC2 server, django-storages opens an S3 connection straight away but doesn't write to S3 yet, because the file is still in transit from my machine to EC2. If it's a small file, it reaches the EC2 box quickly and gets stored on S3 without trouble; if it's a large one, it takes too long to arrive and the S3 connection has timed out by then.

If the code does work this way and if my guess is right, is there a way to make django-storages wait for the file before it opens its S3 connection?
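
To make that concrete, here's the sort of behaviour I'm hoping for, as a completely untested sketch. BufferedS3Storage is just a name I made up, and the S3Storage import path may differ depending on the version:

```python
# Untested sketch: spool the incoming upload to a local temp file first,
# so the S3 connection is only opened once every byte has reached EC2.
# BufferedS3Storage is hypothetical; the import path may differ.
import tempfile

from django.core.files import File
from storages.backends.s3 import S3Storage


class BufferedS3Storage(S3Storage):
    def _save(self, name, content):
        buffered = tempfile.TemporaryFile()
        for chunk in content.chunks():  # 64 KB pieces by default
            buffered.write(chunk)
        buffered.seek(0)
        # Every byte is local now, so the S3 PUT can start and finish
        # without waiting on the client's upload speed.
        return super(BufferedS3Storage, self)._save(name, File(buffered, name))
```

I'd then point DEFAULT_FILE_STORAGE at that subclass, so nothing else in the project would need to change.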

Cheers, Steve

Comments (1)

  1. David Larlet repo owner
    • changed status to open

    Ok, more details here :)

    Is there an S3 parameter/header to specify the connection's duration? What about using chunks to send the large file once it has been received by EC2?

    There is no way to make a storage wait for a file right now, but that could be an improvement.
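
    For the chunked idea, something like boto's multipart upload API could avoid one long-lived PUT entirely, although that would mean switching away from the plain S3.py backend. A rough, untested sketch (the bucket and key names are placeholders):

    ```python
    # Untested sketch using boto's multipart API: each part is sent as
    # its own short request, so no single connection has to survive the
    # whole transfer. Credentials come from the usual AWS_* env vars.
    import os

    from boto.s3.connection import S3Connection


    def multipart_put(bucket_name, key_name, path, part_size=5 * 1024 * 1024):
        conn = S3Connection()
        bucket = conn.get_bucket(bucket_name)
        upload = bucket.initiate_multipart_upload(key_name)
        total = os.path.getsize(path)
        fp = open(path, 'rb')
        try:
            part_num = 0
            while fp.tell() < total:
                part_num += 1
                # S3 requires parts (except the last) to be >= 5 MB.
                upload.upload_part_from_file(
                    fp, part_num, size=min(part_size, total - fp.tell()))
            upload.complete_upload()
        except Exception:
            upload.cancel_upload()
            raise
        finally:
            fp.close()
    ```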
