How to handle concurrent requests with fastcgi?

Posted by Ameisen 
June 06, 2016 06:48PM
Hi there!

I'm writing an application that uses FastCGI behind nginx. I currently have it set to use keepalive sockets.

The application is designed so that any requests that come through are pushed onto an asynchronous queue and processed by multiple threads. However, nginx's behavior is not compatible with this - it appears to send one request and then send no further requests until the first is responded to or times out.

What I've also noticed is that if a request times out, nginx appears to kill the socket and create a new one, which is also not particularly desirable behavior.

Is there something I'm doing wrong here? I know that nginx does not support FastCGI multiplexing, but is there a way to get my application processing multiple requests from nginx concurrently? So far, I have been unable to get more than one request at a time from nginx.

Thank you!
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 02:28AM
Sounds like you need cosockets with Lua; search for OpenResty.

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 03:25AM
The issue I'm seeing is that nginx never tries to open multiple sockets to the FastCGI application. I _am_ using coroutines with sockets - however, I never get more than one connection from the server at a time.
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 03:50AM
It depends, technically, on how you are processing requests to the backend(s). If you are expecting multiple responses, you have to build a trigger state in Lua which waits for all results to complete; you can't handle this from the client's point of view (which may time out anyway if the wait is too long). A RESTful connection would suit better here, as it should not suffer from timeouts or hurt performance.

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 11:26AM
I'm not using Lua, so Lua APIs won't help me much. What I am presently doing is listening for connections on the socket. When a connection arrives (which will be from nginx), I accept it and push the new transaction socket onto a fiber-based job queue, which executes it and responds when it has time. Immediately after handing off the job, the listener socket goes back to listening. Both nginx and my application are configured for keepalive sockets.

The issue I am seeing is that I never see more than one transaction socket from nginx at a time. The listener socket never sees any new connections: nginx opens one and uses it to send requests sequentially. To handle multiple concurrent requests, I'd expect nginx to multiplex (which it doesn't support) or open multiple socket streams to my application, which it also is not doing.
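
For concreteness, here is roughly what my listener loop is doing (a simplified sketch in Python rather than my actual code; the host, port, and handler names are invented):

import socket
import threading

def handle_transaction(conn):
    # Stand-in for the real work: parse the FastCGI records on this
    # transaction socket and write a response when done.
    with conn:
        data = conn.recv(65536)
        if data:
            conn.sendall(data)  # echo, purely for illustration

def serve(host="127.0.0.1", port=9000):
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind((host, port))
    listener.listen(128)
    while True:
        conn, _ = listener.accept()
        # Hand the transaction socket off to a worker (threads here,
        # fibers in my application) and immediately resume listening.
        threading.Thread(target=handle_transaction, args=(conn,), daemon=True).start()

serve()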
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 12:36PM
Ameisen Wrote:
-------------------------------------------------------
> requests. To handle multiple concurrent requests, I'd expect nginx to
> multiplex (which it doesn't support) or open multiple socket streams
> to my application, which it also is not doing.

You can't expect something to work if it's not supported. Such an operation is usually done via a RESTful API (like Lua), or with a load-balanced nginx setup with 50 (socket) nodes in it, all pointing to the same backend.

Some reading material:
http://stackoverflow.com/questions/7616601/nginx-fastcgi-and-open-sockets
http://michieldemey.be/blog/proxying-websockets-with-nginx-and-socket-io/
https://www.jayway.com/2015/04/13/600k-concurrent-websocket-connections-on-aws-using-node-js/

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 12:58PM
I wasn't aware that it wasn't supported, as I've been unable to find clear documentation anywhere about it. The only thing I knew was unsupported was FastCGI multiplexing, but the argument given in the newsgroups for why (aside from PHP not supporting it) was that equivalent functionality could be had by nginx opening multiple sockets, so I presumed that was supported. The fact that I never see multiple connections opened thus came as a surprise - it has appeared to me that nginx is incapable of processing FastCGI requests concurrently.

So, you're saying that the two ways I can get concurrent access to my application are:

1. Use nginx as an HTTP proxy and basically pass the HTTP requests to my application directly.
2. Use nginx's load balancing to set up a FastCGI load balancer, with a bunch of entries for... I suppose 127.0.0.1:PORT? I am unclear on how this would work, as the documentation seems to presume I'm talking to remote machines with unique addresses. Would this even work?

http {
    upstream backend {
        server 127.0.0.1:9000;
        server 127.0.0.1:9000;
        server 127.0.0.1:9000;
    }
    server {
        location / {
            fastcgi_pass backend;
        }
    }
}

I feel like that would be invalid. Or would I need two instances of nginx running - one to handle load balancing, and one to receive the requests?



Re: How to handle concurrent requests with fastcgi?
June 07, 2016 01:24PM
That example is correct; this is how we use it with PHP, since PHP can only handle one connection at a time - with a pool, nginx divides the load between its nodes. Obviously the CGI part of PHP needs to run multiple times as well, as its limit is per instance for most applications, but if your backend is non-blocking you can connect as much as you like.
The same goes for Lua and its cosockets: embedded in nginx, you have maximum non-blocking performance.
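
If you also want nginx to reuse backend connections instead of opening a fresh one per request, something like this works (a sketch only; fastcgi_keep_conn needs nginx 1.1.4 or newer, and the pool size of 8 is just an example):

upstream backend {
    server 127.0.0.1:9000;
    keepalive 8;
}
server {
    location / {
        include fastcgi_params;
        fastcgi_keep_conn on;
        fastcgi_pass backend;
    }
}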

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 01:33PM
So, the config file (roughly as above), coupled with one instance of nginx and one instance of my application (which has concurrent socket capabilities), will suffice? Would it potentially make sense to just add an alias, something like fastcgi_concurrentsockets or some such? I absolutely don't want multiple instances of my application running - it's not designed for multiprocess safety, only multithread.

Also, I'm not sure what Lua cosockets are, but the portmanteau sounds like 'coroutine sockets', in which case, that's what I'm doing (fibers + sockets).



Re: How to handle concurrent requests with fastcgi?
June 07, 2016 02:00PM
If it has concurrent socket capabilities, then use it, and define the balance pool for such use.

Now, it all depends on what is meant by "concurrent socket capabilities" as to which way you go: same server and same port/socket, or same server and different ports/sockets with a balancer.

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 02:14PM
All right, thank you. I'll try this tonight.

What is nginx doing internally if I don't do this? IIRC, it defaults to 1 worker process, with something like 512 connections per worker? If 100 people connect and all of them are doing something that needs to talk to my FastCGI application, is it processing them all sequentially?
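
For reference, I mean these settings (showing what I believe the defaults are - this is not my actual config):

worker_processes 1;
events {
    worker_connections 512;
}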
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 03:18PM
nginx by itself processes requests simultaneously (it is event driven), and embedded Lua does the same thing (hence its performance), but as soon as you have to proxy-pass anything the backend can be blocking if it can't process multiple requests over the same connection; this is not something nginx can do anything about (other than moving backend code/processing over to Lua as much as possible).

For example, when you have a BI application doing a lot of queries, you use Lua to cache the queries/results and serve them from a Lua cache instead of passing each request to your BI app/database, which would otherwise limit the number of concurrent users on nginx's side.

Inside nginx everything is fast and can handle 100k connections easily; as soon as it needs to talk to the outside, the outside determines how many requests you can handle - hence load balancing (multiple backends, or spreading requests between daemons), and hence embedded processing/caching whenever you can (Lua).
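
If you don't want to bring Lua in, nginx's own FastCGI cache can do the simple cases of that; a rough sketch (the cache path, zone name, and times here are made-up examples):

http {
    fastcgi_cache_path /var/cache/nginx/fcgi keys_zone=appcache:10m;
    server {
        location / {
            fastcgi_pass backend;
            fastcgi_cache appcache;
            fastcgi_cache_key $scheme$request_method$host$request_uri;
            fastcgi_cache_valid 200 1m;
        }
    }
}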

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 03:41PM
I'm a bit confused again.

"but as soon as you have to proxy-pass anything the backend can be blocking if it can't process multiple requests over the same connection"

My backend (my FastCGI application) doesn't really block. It accepts connections on the listener socket, and defers them to other threads to process (which do so using a fiber pool) while the listener thread keeps listening for more connections. Why do I need to use load balancing to get concurrent FastCGI queries in this case - or is this a limitation of nginx's FastCGI module? As said, I'm fairly sure I'm doing the same thing that Lua does (coroutine sockets), but on the backend side.

I presume that's what embedded Lua is doing in this case - taking the queries, putting them into a coroutine-socket pool, kicking them off, and then working on the next one pending the completion of the ones already started? Does this only function if it's done on the nginx side? - as that is actually how my backend is architected.



Re: How to handle concurrent requests with fastcgi?
June 07, 2016 04:45PM
Ameisen Wrote:
-------------------------------------------------------
> It accepts
> connections on the listener socket, and defers them to other threads
> to process (which do so using a fiber pool) while the listener thread
> keeps listening for more connections.

If this is how it really works, then there should not be a problem with blocking.

Try using 'ab' with 250 requests and 250 concurrent users directly on your backend; if the results are the same as for 5 requests and 5 concurrent users, then you have no blocking issues.

If 250/250 takes a lot longer than 5/5, your app is blocking, and so will nginx.

[ab]: https://httpd.apache.org/docs/2.4/programs/ab.html
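
For example (the URL is a placeholder; note that ab speaks HTTP, so point it at nginx in front of your backend, or at an HTTP test endpoint on the app itself):

ab -n 5 -c 5 http://127.0.0.1/
ab -n 250 -c 250 http://127.0.0.1/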

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 04:50PM
With keepalive sockets, what should I be expecting on the backend side? I presume that every time a new concurrent connection is made, a new transaction socket will be created, and at the end I will have <= 250 sockets that are just 'around', as they've been kept alive, and I will need to keep putting them on a recv?
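
In other words, per transaction socket, something like this (a sketch, not my actual code):

def handle_keepalive(conn):
    # With keepalive the peer reuses the connection, so keep reading
    # requests until it actually closes (recv returns b"").
    with conn:
        while True:
            data = conn.recv(65536)
            if not data:
                break  # nginx closed the connection
            conn.sendall(data)  # placeholder for a real FastCGI response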
Re: How to handle concurrent requests with fastcgi?
June 07, 2016 05:12PM
Just try the ab tests first; if that fails (blocks), nginx can't do much more.

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: How to handle concurrent requests with fastcgi?
June 09, 2016 09:35AM
Unfortunately, I haven't yet had a chance to test this - I've been busy.

However, a thought occurred: if one were to use Unix file streams instead of sockets, how would you get multiple connections? Would the load-balancing approach be the only way?... as they're not as flexible as POSIX sockets.
Re: How to handle concurrent requests with fastcgi?
June 09, 2016 10:09AM
Never mind that question. Nginx uses domain sockets.
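
For anyone finding this later: nginx can pass FastCGI over a Unix domain socket directly (the path here is just an example):

fastcgi_pass unix:/tmp/backend.sock;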
Re: How to handle concurrent requests with fastcgi?
July 08, 2016 11:52AM
Hi Ameisen,

How did you resolve this issue? I am facing the same issue with my FastCGI application too.

> It accepts
> connections on the listener socket, and defers them to other threads
> to process (which do so using a fiber pool) while the listener thread
> keeps listening for more connections.

This is how many network applications accept concurrent requests - by multiplexing, which is fundamental to network programming. I too am surprised that nginx does not allow its backend applications to do that.