On 11/29/2016 09:55 AM, Lukas Tribus wrote:
>> It seems that search engines are probing https: even for sites that
>> don't offer it
> Which is fine.
>
>
>
>> just because it's available for others, with the end
>> result that pages are being attributed to the wrong site.
> Sounds like an assumption. Any real life experience and
> evidence backing this?
Yes, I have.
> Sounds simple enough to drop the HTTPS request if the
> certificate doesn't match the hostname.
>
> Every standard wget/curl/lynx application drops the TLS session
> by default in this case, I don't see why a crawler wouldn't.
>
>
>
>> Does anyone have a better solution ( nginx of course! )
> If this is a real problem (which I doubt), I guess you could just
> serve a 403 Forbidden from the default hosts.
>
>
> Lukas
>
Not sure why you're doubting me here, Lukas. Yes, this is a problem,
and no, I'm not making it up.
My understanding is that SSL is sorted out before nginx hands the
request over for content handling. As such, an incorrect or missing
cert will fail validation on the client side, but a site with no
https server block will still be answered by the default one (or, if
no default_server is set, the first block loaded, which with configs
included from a directory means the alphabetically first).
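For anyone else bitten by this, a catch-all default server along the
lines Lukas suggests should stop the mis-attribution. A minimal
sketch (the cert paths are placeholders; a self-signed cert is fine
here, since it exists only to complete the handshake for clients that
don't validate it):

    # Catch-all for HTTPS requests to any hostname we don't
    # explicitly serve over TLS. Marked default_server so it wins
    # over whichever block happens to be loaded first.
    server {
        listen 443 ssl default_server;
        server_name _;

        # Placeholder self-signed cert; strict clients will refuse
        # the handshake at this point, which is fine.
        ssl_certificate     /etc/nginx/ssl/dummy.crt;
        ssl_certificate_key /etc/nginx/ssl/dummy.key;

        # Refuse the request rather than serving another site's
        # content. return 444; would instead close the connection
        # without sending a response.
        return 403;
    }

That way a crawler that ignores the certificate mismatch gets an
explicit 403 instead of the wrong site's pages.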
So, as stated above, this is a real problem. Sure, in hindsight it's
a configuration shortfall/cockup, but it didn't occur to me that
search engines would attempt to force https on sites that don't
offer it.
Steve
--
Steve Holdoway BSc(Hons) MIITP
http://www.greengecko.co.nz
Linkedin: http://www.linkedin.com/in/steveholdoway
Skype: sholdowa
_______________________________________________
nginx mailing list
nginx@nginx.org
http://mailman.nginx.org/mailman/listinfo/nginx