uneven load distribution in container

Posted by BharathShankar
March 27, 2019 02:12AM
I am trying out the sample gRPC streaming application at https://github.com/grpc/grpc/tree/v1.19.0/examples/python/route_guide.

I have modified the server to start 4 grpc server processes and correspondingly modified the client to spawn 4 different processes to hit the server.
Client and server are running on different VMs on google cloud. I have attached the client and server programs along with the nginx.conf file.
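For context, the server-side fan-out can be sketched roughly as below. This is a hypothetical simplification (serve() stands in for the real grpc.server() setup, and BASE_PORT is assumed from the log output); the actual attached route_guide_server.py is authoritative.

```python
# Sketch of the 4-process fan-out in route_guide_server.py.
# Hypothetical simplification: serve() is a stand-in for the real
# gRPC server construction; BASE_PORT assumed from the log below.
import multiprocessing
import os

BASE_PORT = 9000

def serve(port, cpu):
    # Pin this worker to a single CPU (modulo the machine's CPU count
    # so the sketch also runs on smaller machines), mirroring the
    # affinity lines in the server log.
    os.sched_setaffinity(0, {cpu % os.cpu_count()})
    print(f"Server starting in port {port} with cpu {cpu}")
    # Real code would build a grpc.server(), add the RouteGuide
    # servicer, bind port, and block in wait_for_termination().

def main(n_workers=4):
    procs = []
    for cpu in range(n_workers):
        p = multiprocessing.Process(target=serve, args=(BASE_PORT + cpu, cpu))
        p.start()
        procs.append(p)
    for p in procs:
        p.join()

if __name__ == "__main__":
    main()
```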

Problem Statement
--------------------------
Nginx load balancing (round robin) is uneven when nginx runs inside a container, whereas it is uniform when nginx runs directly inside a VM.
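For readers without the attachment, the attached nginx.conf is along these lines. This is a hypothetical reconstruction (round robin is nginx's default upstream policy, and the backend ports are taken from the server log below), not the verbatim attached file:

```nginx
# Hypothetical reconstruction of the attached nginx.conf.
# Round robin is the default policy for an upstream block.
upstream grpc_backends {
    server 127.0.0.1:9000;
    server 127.0.0.1:9001;
    server 127.0.0.1:9002;
    server 127.0.0.1:9003;
}

server {
    listen 9999 http2;   # gRPC requires HTTP/2
    location / {
        grpc_pass grpc://grpc_backends;
    }
}
```

Note that grpc_pass requires nginx 1.13.10 or later.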

Scenario 1 : Server running in VM
----------------------------------------------
$ python route_guide_server.py
pid 14287's current affinity list: 0-5
pid 14287's new affinity list: 2
pid 14288's current affinity list: 0-5
pid 14288's new affinity list: 3
pid 14286's current affinity list: 0-5
pid 14286's new affinity list: 1
pid 14285's current affinity list: 0-5
pid 14285's new affinity list: 0
Server starting in port 9003 with cpu 3
Server starting in port 9002 with cpu 2
Server starting in port 9001 with cpu 1
Server starting in port 9000 with cpu 0

Now I run the client on a different VM.
$ python3 route_guide_client.py
...........
.......
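The client fan-out can be sketched similarly. Again a hypothetical simplification: call_server() stands in for the real gRPC channel and RouteChat streaming call in the attached route_guide_client.py, and SERVER_ADDR is a placeholder for the nginx endpoint.

```python
# Sketch of the 4-process fan-out in route_guide_client.py.
# Hypothetical simplification: call_server() stands in for the real
# gRPC channel + RouteChat streaming call against the nginx endpoint.
import multiprocessing

SERVER_ADDR = "localhost:9999"  # placeholder for the nginx listen address

def call_server(addr):
    # Real code: channel = grpc.insecure_channel(addr)
    #            stub = route_guide_pb2_grpc.RouteGuideStub(channel)
    #            for note in stub.RouteChat(...): ...
    print(f"client process hitting {addr}")

def main(n_clients=4):
    procs = [multiprocessing.Process(target=call_server, args=(SERVER_ADDR,))
             for _ in range(n_clients)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

if __name__ == "__main__":
    main()
```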

On the server we see that the requests are uniformly distributed between all 4 server processes running on different ports. For example, the server output for the above client invocation is:

Serving route chat request using 14285 << These are PIDs of processes that are bound to different server ports.
Serving route chat request using 14286
Serving route chat request using 14287
Serving route chat request using 14288
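To quantify the spread, one quick way (assuming the server output is captured to a file, here called server.log) is to tally the lines per PID:

```shell
# Tally "Serving route chat request using <pid>" lines per PID.
# server.log is an assumed name for the captured server output.
grep 'Serving route chat request using' server.log \
  | awk '{print $NF}' | sort | uniq -c
```

An even distribution shows roughly equal counts for each PID; the container case below shows all traffic on just two PIDs.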


Scenario 2 : Server running in Container
------------------------------------------------------
I now spin up a container on the server VM, install and configure nginx the same way inside the container, and use the same nginx config file except for the nginx server listen port.

$ sudo docker run -p 9999:9999 --cpus=4 grpcnginx:latest
...............
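One thing worth double-checking inside the container: --cpus=4 sets a CFS quota (a limit on CPU time), it does not pin the container to specific cores, so the affinity mask visible to processes can still span all host CPUs. A quick check, assuming Python 3 on Linux:

```python
# Quick check of what CPU resources a process inside the container sees.
# Note: docker run --cpus=4 limits CPU *time* via the CFS quota; it does
# not restrict which cores are visible, so os.cpu_count() and the
# affinity mask may still report all host CPUs.
import os

print("cpu_count:", os.cpu_count())
print("affinity :", sorted(os.sched_getaffinity(0)))
```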

root@b81bb72fcab2:/# ls
README.md docker_entry.sh media route_guide_client.py route_guide_pb2_grpc.py route_guide_server.py.bak sys
__pycache__ etc mnt route_guide_client.py.bak route_guide_pb2_grpc.pyc run tmp
bin home opt route_guide_db.json route_guide_resources.py run_codegen.py usr
boot lib proc route_guide_pb2.py route_guide_resources.pyc sbin var
dev lib64 root route_guide_pb2.pyc route_guide_server.py srv

root@b81bb72fcab2:/# python3 route_guide_server.py
pid 71's current affinity list: 0-5
pid 71's new affinity list: 0
Server starting in port 9000 with cpu 0
pid 74's current affinity list: 0-5
pid 74's new affinity list: 3
Server starting in port 9003 with cpu 3
pid 72's current affinity list: 0-5
pid 72's new affinity list: 1
pid 73's current affinity list: 0-5
pid 73's new affinity list: 2
Server starting in port 9001 with cpu 1
Server starting in port 9002 with cpu 2

On the client VM

$ python3 route_guide_client.py
............
..............

Now on the server we see that the requests are served by only 2 of the 4 ports/processes.

Serving route chat request using 71
Serving route chat request using 72
Serving route chat request using 71
Serving route chat request using 72


Requesting help to resolve this load distribution problem inside the container.



Attachments:
- nginx.conf.conf (1 KB)
- route_guide_client.py (4.3 KB)
- route_guide_server.py (4.9 KB)