Answers intermixed below.
On Wed, 11 Mar 2020 21:23:15 -0400
"MAXMAXarena" <nginx-forum@forum.nginx.org> wrote:
> Hello @Ralph Seichter,
> what do you mean by "mutually exclusive"?
> As for the tools I mentioned, it was just an example.
> Are you telling me I can't solve this problem?
>
>
> Hello @garic,
> thanks for this answer, it made me understand some things. But I
> don't think I understand everything you suggest to me.
>
> You are suggesting how to make the link uncrawlable, but how do I
> block direct access?
If you are going to block access to a file, you should protect it from,
shall we say, the less sophisticated user right down to the unrelenting,
robots.txt-ignoring crawler.
Most webmasters want to be crawled, so they have written plenty about
what stops crawlers from reaching files. Things like Ajax used to be a
problem, but over the years the crawlers have become smarter. You can
search blogs for crawling problems and then do the opposite of what they
suggest; that is, make your file hard to reach.
In other words, this is a research project.
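As a starting point on the "uncrawlable" side, here is a minimal sketch,
assuming a hypothetical /downloads/ path inside your server block (the
path and the exact directive values are my assumptions, not a tested
setup). robots.txt asks crawlers not to look, and the X-Robots-Tag header
asks compliant search engines not to index what they reach anyway;
neither stops a crawler that ignores the rules.
-------------------------------------
# ask well-behaved crawlers to stay out of /downloads/
location = /robots.txt {
    default_type text/plain;
    return 200 "User-agent: *\nDisallow: /downloads/\n";
}

# ask compliant search engines not to index files they reach anyway
location /downloads/ {
    add_header X-Robots-Tag "noindex, nofollow, noarchive" always;
}
-------------------------------------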
>
> For example, if the user downloads the file, then goes to the download
> history, sees the URL, copies it and re-downloads the file. How can I
> prevent this from happening?
>
> Maybe you've already given me the solution, but not being an expert,
> I need more details, if it's not a problem. Thanks.
>
In the http block, "include" these files, which contain the maps. You
will have to create the files yourself.
---------------------
include /etc/nginx/mapbadagentlarge;
include /etc/nginx/mapbadrefer;
include /etc/nginx/mapbaduri;
-------------------------------------------------
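For orientation, a minimal sketch of how those includes sit in nginx.conf;
the file names are the ones above, while the listen port and server_name
are placeholders of mine:
-------------------------------------
http {
    # map files must be included at http level, before the server blocks
    include /etc/nginx/mapbadagentlarge;
    include /etc/nginx/mapbadrefer;
    include /etc/nginx/mapbaduri;

    server {
        listen      80;
        server_name example.com;
        # locations that test $bad_uri, $badagent and $bad_referer go here
    }
}
-------------------------------------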
Here is how I use the maps. First, in the location block at the web root:
---
location / {
    index index.html index.htm;
    if ($bad_uri)      { return 444; }
    if ($badagent)     { return 403; }
    if ($bad_referer)  { return 403; }
}
-------
403 is Forbidden, but really that means "found and forbidden". 444 is
nginx's special "close the connection with no reply". Technically every
Internet request deserves an answer, but if they are wget-ing or whatever,
I hereby grant permission to 444 them (no answer) if you want.
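Since your files are PDFs, you can attach the same checks to a location
that matches only PDFs. A minimal sketch, assuming the PDFs are served by
the same server block; the \.pdf$ pattern is my assumption, not part of
the configuration above:
-------------------------------------
# apply the map-based checks only to PDF downloads
location ~* \.pdf$ {
    if ($bad_uri)      { return 444; }
    if ($badagent)     { return 403; }
    if ($bad_referer)  { return 403; }
}
-------------------------------------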
A sample mapbaduri file follows. Basically, place in this file any word
you find inappropriate in a URL. Use caution that the words you put in
here are not contained in a legitimate URL; blocking "wallet", for
example, would also catch a legitimate page like /wallet-size-photos.html.
Incidentally, these samples are from real life.
---------------------------------
map $request_uri $bad_uri {
    default       0;
    ~*simpleboot  1;
    ~*bitcoin     1;
    ~*wallet      1;
}
---------------------------------------------
Next up, and more relevant to your question, is my mapbadagentlarge file.
This is where you trap curl and wget. There are many lists of bad agents
online. The "" entry traps requests that send no User-Agent header at all.
-------------------------------------
map $http_user_agent $badagent {
    default        0;
    ""             1;
    ~*SpiderLing   1;
    ~*apitool      1;
    ~*pagefreezer  1;
    ~*curl         1;
    ~*wget         1;   # matches Wget's default agent string, as mentioned above
    ~*360Spider    1;
}
----------------------
The mapbadrefer file is up to you. If you find a website you don't want
linking to your website, add it to a file as follows:
----------------------------------------
map $http_referer $bad_referer {
    default               0;
    "~articlevault.info"  1;
    "~picooly.pw"         1;
    "~pictylox.pw"        1;
    "~imageri.pw"         1;
    "~mimgolo.pw"         1;
    "~rightimage.co"      1;
    "~pagefreezer.com"    1;
}
-----------------------------------------------
Note that as you block these websites you will probably lose Google rank.
> I found this stackoverflow topic that is interesting:
> https://stackoverflow.com/questions/9756837/prevent-html5-video-from-being-downloaded-right-click-saved
>
> Read the @Tzshand answer, modified by @Timo Schwarzer, with 28 positive
> votes; basically it's what I would like to do, but in my case they are
> PDF files and I use nginx, not Apache.
>
Right-click trapping is pretty old school. If you do trap right clicks,
you should provide a means to save desired links using a left click. I
don't know how to do that, but I'm sure the code exists.
Don't forget the dynamic URL. That prevents the URL from being reused
outside of the session. I've been on the receiving end of those dynamic
URLs but never found the need to write one, so that will be a research
project for you.
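If you want to stay inside nginx for that, the stock
ngx_http_secure_link_module gives you expiring, per-link URLs. A minimal
sketch, not a tested configuration; "my_secret" and the /downloads/ path
are placeholders, and the md5/expires query arguments have to be generated
by whatever application hands out the link:
-------------------------------------
location /downloads/ {
    # the link must carry ?md5=...&expires=... computed by the application
    secure_link     $arg_md5,$arg_expires;
    secure_link_md5 "$secure_link_expires$uri my_secret";

    # hash missing or wrong: forbidden; hash valid but expired: gone
    if ($secure_link = "")  { return 403; }
    if ($secure_link = "0") { return 410; }
}
-------------------------------------
The application computes the md5 argument as the base64url-encoded binary
MD5 of "<expires><uri> my_secret", matching the secure_link_md5 line. The
module is standard but not always compiled in; check nginx -V for
--with-http_secure_link_module.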
I'm in the camp that says you can probably never keep a file perfectly
secret and still serve it, but you can block many users. You can stop the
script kiddie, but a nation state will get you.
_______________________________________________
nginx mailing list
nginx@nginx.org
http://mailman.nginx.org/mailman/listinfo/nginx