
Implementing directory downloads for site users

There is a small private site where music is organized into albums, and users can listen to those albums right in the browser. On the server, each album is stored as a directory containing the audio tracks, which nginx serves to the player on request.
Everything was fine until the users wanted to download their favorite albums to their computers in one piece.

Below I'll describe how we implemented it.

The first solution that comes to mind is to create an archive of each album in advance and serve it to the user.
The main reason we rejected this option: we have a lot of music, and creating an archive for every album would roughly double the required disk space, which we cannot afford at the moment.

It seemed ideal to build the album archive on demand, on the fly. With compression disabled, the overhead should be small, and the cost itself is not particularly critical for us: right now we have plenty of CPU time and RAM, as opposed to permanent storage.
A search for an nginx module, or any other ready-made solution, turned up nothing, so we will implement it ourselves; the task is not difficult.

For archiving we will use the standard zip program without compression, and instead of writing the archive to a file we will send it straight to stdout. For this purpose tar would be the more natural choice, but our users are ordinary people, and for many of them a tar or tar.gz archive would be a mystery.

Now we need to hand the archived data over to nginx, which will deliver it to the user.
For this, a simple CGI script was written in shell:
#!/bin/bash

# Directory where the album directories are stored
homeDir="/storage/media/audio"

# Name of the directory the user wants to download, taken from the query string
downloadDir=$(echo "$QUERY_STRING" | sed -f urldecode.sed)

# Change into the storage directory
pushd "$homeDir" > /dev/null

# Make sure the directory exists and its name contains no slashes,
# so that zip cannot be used to escape the storage directory
if [ -n "$downloadDir" ] && [ -d "$downloadDir" ] && [[ "$downloadDir" != *\/* ]] && [[ "$downloadDir" != *\\* ]]
then
    echo "Content-Type: application/octet-stream"
    echo "Content-Disposition: attachment; filename=\"$downloadDir.zip\""
    echo ""
    /usr/bin/zip -r -0 - "$downloadDir"
else
    echo "Status: 404 Not Found"
    echo "Content-Type: text/html"
    echo ""
    echo "<h1>404 File not found!</h1>"
fi

# Return to the previous directory
popd > /dev/null


The script is simple; I don't think it needs further commentary.

Since the directory name arrives URL-encoded, we decode it using sed with a small helper script (urldecode.sed), which we place next to our CGI script.
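If the sed helper is not at hand, the same decoding can be done in pure bash; this is a sketch, and the function name is our own:

```shell
# Decode a URL-encoded string: '+' becomes a space,
# and each %XX sequence becomes the byte it encodes
urldecode() {
    local data="${1//+/ }"        # '+' -> ' '
    printf '%b' "${data//%/\\x}"  # '%41' -> '\x41' -> 'A'
}

urldecode "Pink%20Floyd+-+The%20Wall"   # prints "Pink Floyd - The Wall"
```

It relies on bash's printf %b interpreting \xHH escapes, so it is bash-specific rather than POSIX sh.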

As everyone knows, nginx supports FastCGI but not CGI, so to make our script work we will use fcgiwrap.

Install it:
 apt-get install fcgiwrap 


And configure:
 # support for cgi scripts (requires fcgiwrap)
 location ~ \.cgi$ {
     gzip off;
     try_files $uri =404;

     # pass scripts to fcgiwrap
     fastcgi_pass unix:/var/run/fcgiwrap.socket;

     # standard FastCGI parameters
     include /etc/nginx/fastcgi_params;
     fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
     fastcgi_ignore_client_abort off;
 }


Restart nginx, and downloads are available at: /download.cgi?directory_name
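From the client side a download is a single GET request. The host name below is made up, and the album name must be URL-encoded, which curl can do for us:

```shell
# -G puts the --data-urlencode payload into the query string,
# so "Pink Floyd" becomes /download.cgi?Pink%20Floyd;
# -o saves the response under a name of our choosing
curl -G --data-urlencode "Pink Floyd" -o "Pink Floyd.zip" \
    "https://music.example.com/download.cgi"
```

Browsers do the same encoding automatically, so on the site itself a plain link to /download.cgi?album_name is enough.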

Source: https://habr.com/ru/post/194904/

