> It seems that search engines are probing https: even for sites that
> don't offer it
Which is fine.
> just because it's available for others, with the end
> result that pages are being attributed to the wrong site.
That sounds like an assumption. Is there any real-life experience
or evidence backing it up?
Sounds simple enough to drop the HTTPS request if the
certificate doesn't match the hostname.
Every standard wget/curl/lynx client drops the TLS session
by default in this case; I don't see why a crawler wouldn't.
> Does anyone have a better solution ( nginx of course! )
If this is a real problem (which I doubt), I guess you could just
serve a 403 Forbidden from the default server.
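A minimal sketch of what I mean, assuming a catch-all default server on
port 443 (certificate paths and the placeholder cert are made up; nginx
requires *some* certificate on an SSL listener even if it never validates):

```nginx
# Catch-all for requests whose Host header matches no configured vhost.
server {
    listen 443 ssl default_server;
    server_name _;

    # Placeholder self-signed certificate; any client that actually
    # verifies the hostname will have dropped the connection already.
    ssl_certificate     /etc/nginx/ssl/placeholder.crt;
    ssl_certificate_key /etc/nginx/ssl/placeholder.key;

    return 403;
}
```

That way a crawler probing https: on a name you don't serve gets an
explicit 403 instead of whatever vhost happens to be first in the config.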
Lukas
_______________________________________________
nginx mailing list
nginx@nginx.org
http://mailman.nginx.org/mailman/listinfo/nginx