
Burn after reading: making one-time download links with bare Nginx

First, I must make it clear that I strongly advise against using this solution in production. In fact, it is better not to use it at all; everything you do, you do at your own risk. The reasons for this advice are given in the body of the article. If that warning has not scared you off, read on.

By “bare Nginx” I mean the nginx package for Ubuntu 16.04 from the mainline branch of the official repository, which already comes compiled with the --with-http_dav_module flag.
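You can quickly confirm that your nginx binary has the module compiled in: nginx -V prints its configure arguments, so a one-liner like this (a sanity check of my own, not part of the original setup) does the job:

    # nginx -V writes to stderr; print the flag if it is present
    nginx -V 2>&1 | grep -o -- --with-http_dav_module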

It is assumed that you already have nginx running in this configuration, so what follows only describes the few locations you need to add to the server section of your nginx config.

In my case, all temporary files will be stored in the /var/www/upload folder under paths of the form /random_folder_name/filename, where random_folder_name is a random string of the required number of bytes. So we create a location of the following form:
    location ~ ^/upload/([\w]+)/([^/]*)?$ {
        root /var/www;

        # 444 is nginx's non-standard "close the connection without a response"
        if ($request_method !~ ^(PUT|DELETE)$) {
            return 444;
        }

        client_body_buffer_size 2M;
        client_max_body_size 1G;

        # WebDAV: allow uploads and deletions, create intermediate directories on PUT
        dav_methods PUT DELETE;
        dav_access group:rw all:r;
        create_full_put_path on;
    }
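The random_folder_name placeholder used in the tests below can be produced the same way the upload helper at the end of the article does it, for example:

    hexdump -n 8 -e '/4 "%x"' </dev/urandom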

Check that uploading and deleting files and folders works, using these console commands:
    curl -X PUT -T test.txt https://example.com/upload/random_folder_name/
    curl -X DELETE https://example.com/upload/random_folder_name/

To protect the server from an uncontrolled stream of uploads, we add a token check; the token is transmitted in a Token header. In the config it looks like this:
    if ($http_token != "cb110ef4c4165e495001e297feae7092") {
        return 444;
    }
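A request without the header is now simply dropped, with the connection closed and no response sent; a quick negative test, assuming the setup above:

    # nginx drops the connection without any HTTP response
    curl -v -X PUT -T test.txt https://example.com/upload/random_folder_name/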

The token itself can be generated in the console with a command like
 hexdump -n 16 -e '/4 "%x"' </dev/urandom 

Check again from the console that uploading and deleting files and folders works, but now only when the request carries the Token header:
    curl -X PUT -H "Token: cb110ef4c4165e495001e297feae7092" -T test.txt https://example.com/upload/random_folder_name/
    curl -X DELETE -H "Token: cb110ef4c4165e495001e297feae7092" https://example.com/upload/random_folder_name/


We can now upload and delete files; to download them, we create a separate location:
    location ~ ^/download/(?<folder>[\w]+)/([^/]*)$ {
        root /var/www;

        if ($request_method != GET) {
            return 444;
        }

        rewrite ^/download/([\w]+)/([^/]*)$ /upload/$1/$2 break;
    }

Check that fetching files works from the console:
 curl https://example.com/download/random_folder_name/test.txt 

If the tests were successful, this location still needs to be brought to a state that meets our requirements.
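The directives that do this hardening appear in the final config at the end of the article; pulled out here, with a short note on what each one is for:

    # don't serve a cached descriptor of a file that was just deleted
    open_file_cache off;

    # serve every file as an opaque binary attachment, whatever its extension
    types { }
    default_type application/octet-stream;
    add_header Content-Disposition "attachment";

    # forbid browsers from guessing (sniffing) the content type
    add_header X-Content-Type-Options "nosniff";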
So far we can upload files and fetch them safely, but we also need them to be deleted immediately after the first download. For that we create a separate location:
    location @delete {
        proxy_method DELETE;
        proxy_set_header Token "cb110ef4c4165e495001e297feae7092";
        proxy_pass https://example.com/upload/$folder/;
    }

And we invoke this location from the location ~ ^/download/... block with the directive:
 post_action @delete; 

This looks quite decent, but, as I wrote above, I strongly advise against using something that is undocumented and whose inner workings are unclear, and the post_action directive is exactly that. This is why I hope that no one will use this solution in production.

Now everything almost works: we can upload files, download them, and have them deleted after download. But the resulting links cannot be shared through messengers: their bots request every link they see, hoping to fetch the content and render a preview, which destroys the file immediately, so when the recipient clicks the link they see a 404 instead of the coveted file.
To solve this problem, we will send the recipient a link to an intermediate page instead of a direct download link, and we will build it using nothing but the capabilities of "boxed" Nginx.
First we create another location, which serves an html file:
    location ~ ^/get/(?<folder>[\w]+)/(?<file>[^/]*)$ {
        root /var/www;
        ssi on;

        if ($request_method != GET) {
            return 444;
        }

        rewrite ^(.*)$ /download.html break;
    }

The most important part of this location is the ssi on; directive: it is ngx_http_ssi_module that lets us serve dynamic html, however strange that phrase may sound.
In the /var/www folder, create that download.html file with contents of the following form:
    <html>
    <body>
        After downloading this data will be destroyed
        <form action='/download/<!--# echo var="folder" -->/<!--# echo var="file" -->' method="get" id="download"></form>
        <p><button type="submit" form="download" value="Submit">Download</button></p>
    </body>
    </html>

Now, instead of handing out the direct download link example.com/download/random_folder_name/filename, we send a link to the intermediate page. It looks like example.com/get/random_folder_name/filename; opening it leaves the file unharmed, and downloading it requires clicking the button. To make sure bots do not follow the link from this page either, we add a Referer header check to location ~ ^/download/..., so that the file is served only when it is actually requested from the intermediate page:
    if ($http_referer !~ ^https://example\.com/get/([\w]+)/([^/]*)$) {
        return 444;
    }
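To test the Referer check from the console, curl's -e/--referer option sets that header; a sketch, reusing the paths from the earlier tests:

    # no Referer: nginx closes the connection, the file stays intact
    curl https://example.com/download/random_folder_name/test.txt

    # Referer matching the intermediate page: the file is served (and then deleted)
    curl -e "https://example.com/get/random_folder_name/test.txt" https://example.com/download/random_folder_name/test.txt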


The final config in my case is as follows
    location ~ ^/upload/([\w]+)/([^/]*)?$ {
        root /var/www;

        if ($request_method !~ ^(PUT|DELETE)$) {
            return 444;
        }
        if ($http_token != "cb110ef4c4165e495001e297feae7092") {
            return 444;
        }

        client_body_buffer_size 2M;
        client_max_body_size 1G;

        dav_methods PUT DELETE;
        dav_access group:rw all:r;
        create_full_put_path on;
    }

    location ~ ^/get/(?<folder>[\w]+)/(?<file>[^/]*)$ {
        root /var/www;
        ssi on;

        if ($request_method != GET) {
            return 444;
        }

        rewrite ^(.*)$ /download.html break;
    }

    location ~ ^/download/(?<folder>[\w]+)/([^/]*)$ {
        root /var/www;
        open_file_cache off;

        types { }
        default_type application/octet-stream;
        add_header Content-Disposition "attachment";
        add_header X-Content-Type-Options "nosniff";

        if ($request_method != GET) {
            return 444;
        }
        if ($http_referer !~ ^https://example\.com/get/([\w]+)/([^/]*)$) {
            return 444;
        }

        rewrite ^/download/([\w]+)/([^/]*)$ /upload/$1/$2 break;
        post_action @delete;
    }

    location @delete {
        proxy_method DELETE;
        proxy_set_header Token "cb110ef4c4165e495001e297feae7092";
        proxy_pass https://example.com/upload/$folder/;
    }
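After assembling the full config, the usual check-and-reload cycle applies (assuming the stock Ubuntu 16.04 package with systemd):

    sudo nginx -t && sudo systemctl reload nginx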


To make this convenient to use, without typing long commands into the console to upload files and folders, I sketched out a function in my .zshrc (I assume it will also work in .bashrc):
    function upload() {
        if [ $# -eq 0 ]; then
            echo "Usage: upload [file|folder] [option]
           cat file | upload [name] [option]
    Options:
           gpg  - Encrypt the file (a folder is packed into tar first)
           gzip - Pack into a gzip archive (a folder is packed into tar first)
    "
            return 1
        fi
        uri="https://example.com/upload"
        token="cb110ef4c4165e495001e297feae7092"
        # random folder name, as described above
        random=$(hexdump -n 8 -e '/4 "%x"' </dev/urandom)
        # the trailing pipeline takes the Location header that nginx returns
        # and turns the /upload/ path into a shareable /get/ link
        if tty -s; then
            # stdin is a terminal: the argument is a file or folder
            name=$(basename "$1")
            if [ "$2" = "gpg" ]; then
                passphrase=$(tr -dc "[:graph:]" </dev/urandom | head -c16)
                echo "$passphrase"
                if [ "$1" = "-" ]; then
                    name=$(basename $(pwd))
                    tar cf - `ls -1 $(pwd)` | gpg --passphrase-file <(echo -n "$passphrase") --batch -ac -o- | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$name.tar.gpg" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
                elif [ -d "$1" ]; then
                    tar cf - `ls -1 "$1"` | gpg --passphrase-file <(echo -n "$passphrase") --batch -ac -o- | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$name.tar.gpg" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
                elif [ -f "$1" ]; then
                    gpg --passphrase-file <(echo -n "$passphrase") --batch -ac -o- "$1" | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$name.gpg" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
                fi
            elif [ "$2" = "gzip" ]; then
                if [ "$1" = "-" ]; then
                    name=$(basename $(pwd))
                    tar czf - `ls -1 $(pwd)` | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$name.tar.gz" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
                elif [ -d "$1" ]; then
                    tar czf - `ls -1 "$1"` | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$name.tar.gz" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
                elif [ -f "$1" ]; then
                    gzip -c "$1" | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$name.gz" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
                fi
            else
                if [ "$1" = "-" ]; then
                    name=$(basename $(pwd))
                    tar cf - `ls -1 $(pwd)` | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$name.tar" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
                elif [ -d "$1" ]; then
                    tar cf - `ls -1 "$1"` | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$name.tar" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
                elif [ -f "$1" ]; then
                    curl -I --progress-bar -H "Token: $token" -T "$1" "$uri/$random/$name" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
                fi
            fi
        else
            # data comes from stdin; $1 is the name to store it under
            if [ "$2" = "gpg" ]; then
                passphrase=$(tr -dc "[:graph:]" </dev/urandom | head -c16)
                echo "$passphrase"
                gpg --passphrase-file <(echo -n "$passphrase") --batch -ac -o- | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$1.gpg" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
            elif [ "$2" = "gzip" ]; then
                gzip | curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$1.gz" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
            else
                curl -I --progress-bar -H "Token: $token" -T "-" "$uri/$random/$1" | grep "Location: " | cut -d " " -f2 | sed "s'/upload/'/get/'g"
            fi
        fi
    }
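With the function loaded (source ~/.zshrc or open a new shell), usage follows the help text above; a few illustrative invocations (the file names are made up):

    # upload a single file as-is and get back a one-time /get/ link
    upload report.pdf

    # tar a folder and encrypt it with gpg; the generated passphrase is printed first
    upload ./documents gpg

    # pipe from stdin, storing the stream gzip-compressed under the given name
    cat dump.sql | upload dump.sql gzip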



Cons of this solution: it rests entirely on the undocumented post_action directive, which is the main reason for the warning at the top of the article.
UPD: the article was updated on 01/18/2018. If you previously managed to set up something similar of your own, I strongly recommend making the corresponding changes, guided by the updated article.

P.S. I would like to thank el777, whose advice brought on the enlightenment that led to the configs and the article being rewritten.

Source: https://habr.com/ru/post/346758/

