August 29, 2017 05:13AM
Original Problem domain:

The Rails/Rack send_data method loads the whole file into memory on the server before sending it to the client. This failed with large files because the server ran out of memory.

Original Solution:

Use Passenger NGINX to create a proxy_pass between the client and the S3 bucket in a secure way: the client never sees the actual S3 bucket URL. I opted for this over timed (expiring) URLs because the clients might need the URL indefinitely.
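For reference, a minimal sketch of the kind of proxy_pass setup described above (the bucket name, resolver address, and /downloads/ prefix are placeholders of my own, not the actual config):

```nginx
# Sketch: reverse-proxy S3 objects so clients never see the bucket URL.
# "my-bucket" and the /downloads/ prefix are illustrative placeholders.
location /downloads/ {
    # Resolve S3's hostname at request time.
    resolver 8.8.8.8;

    # Hide upstream headers that would reveal S3 as the origin.
    proxy_hide_header x-amz-id-2;
    proxy_hide_header x-amz-request-id;

    # Map /downloads/<key> to the S3 object key and proxy the request.
    rewrite ^/downloads/(.*)$ /$1 break;
    proxy_pass https://my-bucket.s3.amazonaws.com;
}
```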

New Problem domain:

NGINX handles the download properly, but once the download reaches about 5 GB, it "fails" in Chrome and needs to be "retried" in order to continue downloading the file. This is due to a restriction S3 has on downloading files larger than 5 GB.

What I'm hoping to have answered:

Is there a way to initiate a multipart-like download using only NGINX? I know there is such a thing as a multipart upload, but I would like a multipart download of some sort. Rails makes a single response to a request (without some very clunky magic to make it do otherwise), so I'd like to use something like the Range header with NGINX. However, I don't want to specify the exact ranges myself, because that means making several requests and responses — which appears to be how the header works, and which forces me back into the clunky Rails issue.
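If ranged requests turn out to be the only route, the chunking arithmetic itself is simple. A sketch in Ruby (the helper name and the 1 GiB chunk size are my own choices, not from any library) of how a client-side script could compute the Range header for each part:

```ruby
# Sketch: split a large object into HTTP Range header values so the parts
# can be fetched one at a time and appended to a single local file.
# CHUNK_SIZE (1 GiB) and the helper name are illustrative assumptions.
CHUNK_SIZE = 1024 * 1024 * 1024

# Returns an array of "bytes=start-end" strings covering total_size bytes.
def range_headers(total_size, chunk_size = CHUNK_SIZE)
  (0...total_size).step(chunk_size).map do |start|
    finish = [start + chunk_size - 1, total_size - 1].min
    "bytes=#{start}-#{finish}"
  end
end

# Each value would go into a request's Range: header, e.g.
#   request["Range"] = "bytes=0-1073741823"
# Both NGINX and S3 honor ranged GETs, so the parts stream back as 206
# Partial Content responses and can be concatenated in order.
```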

Thanks for any and all help!

-Stu
Subject: Downloading large (+5 GB) files in a multipart fashion from S3
Author: stuartweir
Posted: August 29, 2017 05:13AM