Kamil Gorlo
July 18, 2011 09:50AM
Hi guys,

I am forwarding questions from my friend (it seems that he has some
problems with posting to this group). Hope it's OK for you.

Cheers,
Kamil

---------- Forwarded message ----------
From: Łukasz Osipiuk <lukasz@osipiuk.net>
Date: Mon, Jul 18, 2011 at 3:46 PM
Subject: Fwd: Backend handling of Expect: 100-Continue header
To: Kamil Gorlo <kgorlo@gmail.com>


---------- Forwarded message ----------
From: Łukasz Osipiuk <lukasz@osipiuk.net>
Date: Mon, Jul 18, 2011 at 15:25
Subject: Backend handling of Expect: 100-Continue header
To: nginx@nginx.org


Hi!

I searched through the archives and saw that this topic has come up on the
list before, but I did not find any concrete answers.

We use nginx to proxy large PUT (100M+) requests to backend servers.
When issuing a request, the client sets the "Expect: 100-Continue" header.

We noticed that nginx does not forward the request to the backend server
until it has consumed it completely. That is not what we want.
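For context, the exchange the client expects looks roughly like this (the
path, host, and size are illustrative):

```
PUT /upload HTTP/1.1
Host: storage.example.com
Content-Length: 104857600
Expect: 100-continue

                                <-- client pauses here, sending no body

HTTP/1.1 100 Continue           <-- server (backend, ideally) approves

                                --> client now streams the 100M body
```

The point is that the *backend* should get to reject the request (e.g.
with 413 or an auth error) before the client transmits the body.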

1. We need to disable nginx's automatic 100-Continue handling. Instead,
nginx should forward the headers to the backend server and let it decide
whether to return 100 Continue or a final error code to the client.
2. We need to disable request body buffering, as we have upload resuming
implemented on the backend tier.
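For what it's worth, nginx versions released later than this post (1.7.11+)
added a directive that addresses point 2 directly. A minimal sketch, assuming
a hypothetical upstream named "backend_upload":

```
location /upload {
    # Stream the request body to the upstream as it arrives,
    # instead of buffering it to disk first (nginx >= 1.7.11).
    proxy_request_buffering off;

    # Use HTTP/1.1 to the backend so the body can be streamed
    # with chunked transfer encoding when its length is unknown.
    proxy_http_version 1.1;

    proxy_pass http://backend_upload;
}
```

Note that even with this, nginx answers the "Expect: 100-continue" itself
(sending 100 Continue to the client) rather than forwarding the decision to
the backend, so point 1 is not covered by configuration alone.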

Are the above achievable in the current nginx version (maybe with some
3rd-party modules)? AFAIK the mogilefs module proxies uploaded files
directly to backend servers without buffering.

Regards, Łukasz Osipiuk


--
Łukasz Osipiuk
mailto:lukasz@osipiuk.net




_______________________________________________
nginx mailing list
nginx@nginx.org
http://nginx.org/mailman/listinfo/nginx

