Scenario 1:
With long-polling requests, each client uses only one port, since the same
connection is reused continuously. HTTP being stateless, losing that
connection would mean potential loss of data.
32K simultaneous active connections to the same service on a single
machine? I suspect the bottleneck is somewhere else...
Scenario 2:
So you would use several backends and a single frontend? Frontends,
especially when used only as a proxy/cache, are the easiest components to
replicate...
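For illustration only, a minimal sketch of that layout in nginx configuration
(all hostnames, ports and paths below are made-up assumptions): one frontend
acting as proxy/cache in front of several backends.

```nginx
# Hypothetical frontend proxying/caching for several backends.
# Names, ports and cache path are illustrative, not a recommendation.
proxy_cache_path /var/cache/nginx keys_zone=front:10m;

upstream backends {
    server backend1.example.com:8080;
    server backend2.example.com:8080;
    server backend3.example.com:8080;
}

server {
    listen 80;

    location / {
        proxy_pass http://backends;
        proxy_cache front;
    }
}
```

Since the frontend here holds no application state beyond its cache, adding a
second copy of it behind DNS or a load balancer is straightforward.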
Once again, I highly suspect that managing 32K connections on a single
server is CPU-intensive...
I am not among the developers at all... I am merely discussing the
usefulness of such a request.
I prefer developers to concentrate on usable stuff rather than on
superfluous features: the product will be more efficient and usage-based
and not an all-in-one monster.
My 2 cents.
I'll stop there.
---
*B. R.*
_______________________________________________
nginx mailing list
nginx@nginx.org
http://mailman.nginx.org/mailman/listinfo/nginx