They claim to obey robots.txt. They also claim to use consecutive IP addresses.
https://www.semrush.com/bot/
Some dated posts (from 2011) indicate semrush runs on AWS. I block the entire AWS IP space and can say I've never seen a semrush bot, so that might be a solution. I got the AWS IP ranges from some Amazon web page.
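For what it's worth, Amazon publishes its current IP ranges as JSON at https://ip-ranges.amazonaws.com/ip-ranges.json (which may be the page meant above). A minimal sketch of turning that file into nginx "deny" directives might look like this; aws_ranges_to_deny is a hypothetical helper name, and the inline sample just mimics the file's shape:

```python
import json

def aws_ranges_to_deny(doc):
    # doc is the parsed ip-ranges.json document; each entry in
    # "prefixes" carries an IPv4 CIDR under the "ip_prefix" key.
    prefixes = sorted({p["ip_prefix"] for p in doc.get("prefixes", [])})
    return ["deny %s;" % p for p in prefixes]

if __name__ == "__main__":
    # Tiny inline sample in the same shape as the real file.
    sample = json.loads(
        '{"prefixes": [{"ip_prefix": "54.231.0.0/17",'
        ' "region": "us-east-1", "service": "AMAZON"}]}'
    )
    print("\n".join(aws_ranges_to_deny(sample)))
```

The emitted lines could then be dropped into an include file referenced from the http or server block.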
I get a bit of pushback about blocking things that are not eyeballs, like colos and VPS providers, but it works for me. I only block after seeing hacking attempts in my logs.
Original Message
From: Grant
Sent: Wednesday, December 14, 2016 10:31 AM
To: nginx@nginx.org
Reply To: nginx@nginx.org
Subject: Re: limit_req per subnet?
> I am curious what is the request uri they was hitting. Was it a dynamic page
> or file or a static one.
It was semrush and it was all manner of dynamic pages.
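On the subject-line question itself, one common sketch for limiting per subnet rather than per address (untested here; "persubnet" and the /24 grouping are illustrative choices) is to map each client IPv4 address down to its first three octets and key the limit_req zone on that:

```nginx
# Collapse each IPv4 client address to its /24 so the whole
# subnet shares one limit_req bucket; other addresses fall
# back to per-client limiting.
map $remote_addr $limit_key {
    default                      $binary_remote_addr;
    "~^(?<net>\d+\.\d+\.\d+)\."  $net;
}

limit_req_zone $limit_key zone=persubnet:10m rate=10r/s;

server {
    location / {
        limit_req zone=persubnet burst=20 nodelay;
    }
}
```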
- Grant
_______________________________________________
nginx mailing list
nginx@nginx.org
http://mailman.nginx.org/mailman/listinfo/nginx