[s3boto] cannot read chunks of file

Issue #68 new
Former user created an issue

According to the docs:



Files can be read a little at a time, if necessary:

    >>> obj1.normal.open()
    >>> obj1.normal.read(3)
    'con'
    >>> obj1.normal.read()
    'tent'
    >>> '-'.join(obj1.normal.chunks(chunk_size=2))
    'co-nt-en-t'


But this does not work for me with the s3boto storage backend. Calling model_instance.file_field.read(3) returns the whole file. This is very problematic and makes it difficult to use the Django framework to transfer files between storages.

Please fix this
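(For anyone hitting this: the pattern the docs describe can be reproduced with a plain chunked copy. The sketch below is illustrative only; iter_chunks and copy_stream are made-up helper names, not Django or django-storages APIs.)

```python
import io


def iter_chunks(fileobj, chunk_size=64 * 1024):
    """Yield successive chunks from a file-like object.

    This mirrors what File.chunks() is documented to do: call
    read(chunk_size) repeatedly until EOF.
    """
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        yield chunk


def copy_stream(src, dst, chunk_size=64 * 1024):
    """Copy src to dst without holding the whole file in memory."""
    for chunk in iter_chunks(src, chunk_size):
        dst.write(chunk)


# Stand-ins for storage file objects. A broken read() that ignores
# the size argument (as reported above) would make iter_chunks yield
# the entire body as one giant chunk instead of small pieces.
src = io.BytesIO(b"content")
dst = io.BytesIO()
copy_stream(src, dst, chunk_size=2)
```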

Comments (6)

  1. Anonymous

This appears to have been an issue with the version of boto I was using. I just updated to version '2.0b3' and it works.

  2. BeProud

    Yes, this should be working as I spent a bit of time getting reading to work. You should be able to do things like:

    myfile = storage.open("/myfile")
    for chunk in myfile.chunks():
        # process chunk

    Though, as noted in another bug, the backend currently loads the entire file into memory, so chunked reading may not be as beneficial as one might like.
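    (Illustrative only: the class below is a hypothetical stand-in, not the s3boto backend. It shows the contract a storage file object has to meet for the loop above to stream correctly: read(size) must return at most size bytes, and chunks() just repeats such reads.)

```python
import io


class ChunkableFile:
    """Hypothetical file wrapper whose read() honours the size argument.

    File.chunks() relies on exactly this contract; a read() that ignores
    size and returns the whole body (the bug reported here) breaks it.
    """

    def __init__(self, data: bytes):
        self._buf = io.BytesIO(data)

    def read(self, size=-1):
        # Return at most `size` bytes; -1 means read to EOF.
        return self._buf.read(size)

    def chunks(self, chunk_size=64 * 1024):
        # Restart from the beginning and yield fixed-size pieces.
        self._buf.seek(0)
        while True:
            data = self.read(chunk_size)
            if not data:
                break
            yield data
```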

  3. Ian Lewis

    What versions of Django, django-storages and boto are you using?

    This worked perfectly fine for me in trunk:

    In [26]: x = ContentFile("content")
    In [27]: x.name = "content.txt"
    In [28]: m = MyModel()
    In [29]: m.t = x
    In [30]: m.save()
    In [31]: m.id
    Out[31]: 3
    In [32]: m = MyModel.objects.get(pk=m.id)
    In [33]: m.t.read(2)
    Out[33]: 'co'
    In [34]: m.t.read(2)
    Out[34]: 'nt'
    In [35]: m.t.read(2)
    Out[35]: 'en'
    In [36]: m.t.read(2)
    Out[36]: 't'
    In [37]: m = MyModel.objects.get(pk=m.id)
    In [38]: "-".join(m.t.chunks(chunk_size=2))
    Out[38]: 'co-nt-en-t'