Single Concurrency

Posted by sacon 
Single Concurrency
June 01, 2015 10:47AM
I have a legacy application / database server that has very limited connectivity options.

It does have CGI built in, but CGI obviously has overhead. I can't use FastCGI, as there are reliability issues.

Alternatively, a TCP port can be opened on the server, which can then handle HTTP requests.

However, the TCP port is not concurrent. Only one request can access the port at a time.

I have seen a proprietary solution that starts up a pool of TCP ports, and uses a custom Apache module to balance requests across the pool of TCP connections.

From an architectural point of view, I'm guessing this is no different to a pool of database connections, the only difference being that each connection is on a unique port.

As the Apache solution is proprietary, I am looking for an alternative solution.

Looking at the NGinx documentation, I think it is possible to set up a reverse proxy across many different ports, but would this work with single concurrency?

Would this be possible with configuration alone on NGinx?

Open to any other suggestions...
Re: Single Concurrency
June 01, 2015 12:49PM
Any backend load-balancing solution depends on having multiple backends. If your legacy application can only handle 1 connection and you add another 9, you can still only handle 10 concurrent users.

nginx can handle thousands of concurrent users and pass them on to a backend, so in your case you are going to get a very large queue.

Can your legacy application handle requests on different ports, or is it just listening on more ports with the same limit?

Maybe you could move the functionality of your legacy application to Lua and just let it deal with data.

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: Single Concurrency
June 01, 2015 05:21PM
I agree in the sense that traditional load balancing expects multiple back ends.

What I am trying to ascertain is whether I can use load balancing in nginx to balance requests to a single backend. It's a hack, depending on how you look at it, but it's the same principle.

Consider PostgreSQL: whilst nginx will handle 10,000 requests in front of it, it's optimal to process only 5, 10 or 20 queries at a time on PostgreSQL [cite:https://wiki.postgresql.org/wiki/Number_Of_Database_Connections], as opposed to 500 at a time.

The same principle applies to my legacy backend. It's essentially a database, and there is no point running more threads than there are CPU resources. I'm not sure what the exact optimum is, but 2x the number of cores plus a couple more tends to be the sweet spot.

So say we had an 8-core CPU. If I fired up 10 to 12 processes on the backend, each with its own port, could I use nginx to load balance HTTP requests to those ports, as if they were multiple backends?

Bottom line, as per the original question: can nginx be configured to reverse proxy to many different ports, but be forced to use those ports in single-concurrency mode?
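For concreteness, the port-pool idea described above could be sketched as a single nginx upstream block. This is only a sketch: the ports 9001–9012 and the name legacy_pool are hypothetical, and it does not by itself enforce the single-concurrency constraint, which is the open question here.

```nginx
# Hypothetical sketch: one upstream entry per backend process/port.
# Ports 9001-9012 are placeholders; substitute the real listener ports.
upstream legacy_pool {
    server 127.0.0.1:9001;
    server 127.0.0.1:9002;
    # ...one line per backend process, up to:
    server 127.0.0.1:9012;
}

server {
    listen 80;
    location / {
        # Requests are distributed across the pool (round-robin by default).
        proxy_pass http://legacy_pool;
    }
}
```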
Re: Single Concurrency
June 01, 2015 05:47PM
sacon Wrote:
-------------------------------------------------------
> So say we had an 8 core CPU. If I fired up 10 to 12 processes on the
> backend, each with its own port, could I use Nginx to load balance
> HTTP requests to those ports, as if they were multiple backends.

Yes, that's possible. You would do this like you would create a pool of php-cgi processes: 10 of them, each responding in around 0.1 sec, would give you a capacity of about 100 r/s.

> Bottom line, as per original question, can Nginx be configured to do
> reverse proxy to many different ports, but be forced to use those
> ports in a single concurrency mode?

That is the default by design; there is no point in balancing 10x 10.10.10.1:80, so either the IPs are different, or the ports, or both.
(unless the ip:port resource can multiplex)

---
nginx for Windows http://nginx-win.ecsds.eu/
Re: Single Concurrency
June 01, 2015 06:05PM
OK, so it sounds possible.

What I'm still not sure about is how to configure the proxying so that each port is used in single-concurrency mode.

Do I need to set some kind of max connections for each host:port (to 1?), or will nginx just figure this out and queue requests for that host:port?
Re: Single Concurrency
June 02, 2015 02:42AM
sacon Wrote:
-------------------------------------------------------
> Do I need to set some kind of max connections for each host:port (to
> 1?) or will Nginx just figure this out and queue requests for that
> host:port?

Not directly, see
http://forum.nginx.org/read.php?29,252517,252522
https://groups.google.com/forum/#!topic/openresty-en/NS2dWt-xHsY

For now I'd say you need to do this with Lua or write a maxconn module.
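As a side note for later readers: in nginx versions released after this thread, the upstream max_conns parameter (commercial-only at the time, open source since nginx 1.11.5) can cap the in-flight connections per backend, which gets close to single concurrency without Lua. A hypothetical sketch, assuming the same placeholder ports as above:

```nginx
# Hypothetical sketch: max_conns=1 (open-source nginx 1.11.5+) caps each
# backend server at one active connection. Caveat: once every server is at
# its limit, excess requests are rejected rather than queued; queueing them
# needs the commercial "queue" directive or a Lua layer as suggested above.
upstream legacy_pool {
    server 127.0.0.1:9001 max_conns=1;
    server 127.0.0.1:9002 max_conns=1;
}
```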

---
nginx for Windows http://nginx-win.ecsds.eu/