Hi,
I have been reading the documentation and also searching this forum for a while, but could not find an answer to my question.
Currently, I have two NGINX nodes acting as a reverse proxy (in a failover setup using keepalived). The reverse proxy injects an authentication header for a public website (transport is HTTPS).
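For context, my setup looks roughly like the sketch below (hostnames, paths, and the header name are placeholders, not my real values):

```nginx
# Illustrative only -- server names, cert paths and header are placeholders.
server {
    listen 443 ssl;
    server_name www.example.com;

    ssl_certificate     /etc/nginx/ssl/example.crt;
    ssl_certificate_key /etc/nginx/ssl/example.key;

    location / {
        # Inject the authentication header before passing the request upstream.
        proxy_set_header X-Auth-Token "placeholder-token";
        proxy_pass https://backend.internal;
    }
}
```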
As the number of users grows, the load on the active node is getting uncomfortably high, and I would like to spread it over both nodes.
What would be the best way to set this up?
I already tried adding both IP addresses to the DNS record (round-robin DNS), but this, rather predictably, only sent a handful of users to the secondary node.
I am now planning to set up an NGINX node in front of these reverse proxy nodes, acting as a round-robin load balancer. Will this work? Since all traffic is over HTTPS, terminating TLS at the load balancer would presumably concentrate all the load on that one machine and therefore not solve my issue.
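For reference, what I have in mind is roughly the following (hostnames are placeholders). This variant uses the stream module so TLS connections are passed through rather than terminated at the balancer, but whether that is the right approach is exactly what I am asking:

```nginx
# Illustrative sketch of the planned front-end balancer; hostnames are placeholders.
# TCP-level round robin: TLS is NOT terminated here, just forwarded.
stream {
    upstream revproxies {
        server revproxy1.internal:443;
        server revproxy2.internal:443;
    }

    server {
        listen 443;
        proxy_pass revproxies;
    }
}
```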
Your advice and help are greatly appreciated.