Once again the problem of uploading relatively large files came up. Specifically, a client wanted to upload 20-40 megabyte videos to the site through the admin panel. It would seem that in our enlightened age such a size is a trifle hardly worth mentioning. But suddenly everything ran up against the settings of the shared hosting. We were horrified to discover that the maximum size of an uploaded file is 2M, and there is no way to raise this figure. And changing the hosting is impossible for a number of reasons - at least not right now.
So we face the challenge of working around the limitations of a poor shared hosting. The principle of such a workaround is obvious: the file must be cut into pieces, uploaded in parts, and assembled back into a single whole on the server side. But this should not be done manually - the user should just select the file and click the "Send" button. How do we do it?
Our first reaction is to look at the various Flash uploaders. Surely world engineering thought has implemented something as useful as uploading a file in parts? We go through Uploadify, SWFUpload, FancyUpload, jqUploader, jquery-transmit. All in vain: we do not see the desired feature. Quite possibly we just need to dig further, but time is running out, and something has to be done already...
The above is sad. However, the fact that this is an admin panel plays into our hands: we do not need to worry about cross-browser compatibility. It is enough that the mechanism works in the client's browser, which (lo and behold!) is FF.
We also recall that recent versions of FF make it possible to get the contents of a file selected in a file-upload field as a string. The desire to cut this string into pieces and push it over in parts via Ajax suggests itself.
Client part
First, let's sketch what we need in static HTML:
<form>
  <input type="file" id="myfile">
</form>
<a href="#" onClick="big_file_upload($('#myfile'))">Send</a>
That is, clicking the link should call the big_file_upload function, which receives the object from which to take the file contents. Note the $('#myfile') construct. I think there is no need to dwell on hooking up the jQuery library, which we will also use for the Ajax requests that transfer the file to the server.
Now we need to write the big_file_upload function itself:
var upload_chunk_size = 120000; // the size of the pieces into which we will cut the file

function big_file_upload(file) {
  // .....
}
Retrieving file contents
To get the contents of the file we will use the following construction:
var data = file.get(0).files.item(0).getAsDataURL();
Let me explain what it does:
- file.get(0) - gets the DOM object out of the jQuery object passed to the function
- files.item(0) - gets the first file from the list. Here it is the only one, but let me remind you that multiple file selection from a single control is already possible.
- getAsDataURL() - gets the contents of the file in data: URL format. There are also getAsText and getAsBinary methods, but we need to transfer the data to the server with the POST method, so it is desirable to have the file contents encoded in Base64.
A similar construction gets us the file name:
var filename = file.get(0).files.item(0).fileName;
Since the contents arrive in data: URL format, it would be nice to cut off the header, which carries information about the MIME type and the encoding method. In a more general version of our function this information should actually be used, but in this example it will only get in the way of decoding. So we simply cut off everything up to and including the first comma that separates the header:
var comma = data.indexOf(',');
if (comma > 0) data = data.substring(comma + 1);
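To see why cutting at the first comma is safe, recall the general shape of a data: URL (RFC 2397); the MIME type below is just an example, the actual one depends on the file:

var example = 'data:video/mp4;base64,AAAAIGZ0eXBpc29t...';
// header:  data:video/mp4;base64  (everything before the first comma)
// payload: AAAAIGZ0eXBpc29t...    (this is what we keep and send)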
Sending a file in chunks
Here everything is straightforward:
var pos = 0;
while (pos < data.length) {
  $.post('/upload.php', {
    filename: filename,
    chunk: data.substring(pos, pos + upload_chunk_size)
  });
  pos += upload_chunk_size;
}
Server part
Now let's make the receiving PHP script, upload.php, on the server. For the sake of example it is also extremely simple:
$filename = $_POST['filename'];
$f = fopen("/dir/to/save/$filename", "a");
fputs($f, base64_decode($_POST['chunk']));
fclose($f);
The file is opened with the "a" flag, i.e. the existing file is not overwritten but appended to. This way we assemble the whole file from the pieces. (It works because upload_chunk_size is a multiple of four: each Base64 chunk decodes cleanly on its own, and the decoded pieces concatenate back into the original bytes.)
I think everyone understands that this script must live in the closed admin part of the site so that random characters do not get access to it. In addition, the file name and the file itself must be validated. Otherwise it is not even a vulnerability, it is... I do not know a word for it...
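A minimal hardening sketch, assuming uploads go to one fixed directory and only simple file names are accepted (the whitelist pattern here is an illustration, not a requirement):

$filename = basename($_POST['filename']); // strip any path components
if (!preg_match('/^[\w.-]{1,64}$/', $filename)) { // allow only letters, digits, _ . -
  header('HTTP/1.0 400 Bad Request');
  exit;
}
$chunk = base64_decode($_POST['chunk'], true); // strict mode: reject non-Base64 input
if ($chunk === false) {
  header('HTTP/1.0 400 Bad Request');
  exit;
}
$f = fopen("/dir/to/save/$filename", "a");
fputs($f, $chunk);
fclose($f);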
We try to run it
Tried it? Did it work? I bet the result was not at all what you expected. The file seems to upload. It even seems to have the correct length. But the contents are some kind of mush.
Why did this happen? The answer is simple: POST requests are sent in asynchronous mode, several at a time, so no one guarantees that they arrive at the server in the order in which they were dispatched. Not only does no one guarantee it, I downright claim they will never arrive in the right order. There will always be mush.
Therefore, sad as it is, asynchrony has to be turned off. Before sending the file we do the following:
$.ajaxSetup({async: false});
Now everything is fine. The file is assembled in the correct order, but while the upload runs, the execution of other scripts is suspended. So, in order not to irritate the user, it would be nice to show a percentage or a progress bar somewhere (the script below assumes a #progress element exists in the markup). The resulting workable script looks like this:
var upload_chunk_size = 120000; // size of the pieces

function big_file_upload(file) {
  var data = file.get(0).files.item(0).getAsDataURL(); // get the file contents
  var filename = file.get(0).files.item(0).fileName; // get the file name
  var comma = data.indexOf(',');
  if (comma > 0) data = data.substring(comma + 1); // cut off the data: URL header
  var pos = 0;
  $.ajaxSetup({async: false}); // disable asynchrony
  while (pos < data.length) {
    $.post('/upload.php', { // send the POST request
      filename: filename, // file name
      chunk: data.substring(pos, pos + upload_chunk_size) // chunk of the file
    });
    pos += upload_chunk_size;
    var p = Math.round(Math.min(pos, data.length) * 100 / data.length); // percentage sent (clamped so the last piece shows 100%)
    $('#progress').text(p + '%'); // draw the number to keep the user calm
  }
}
Disadvantages
As always, there are some...
- This example assumes that the getAsDataURL method always returns Base64-encoded data. Frankly, I would not bet that this is always the case. Properly, the header should not be thrown away but passed on to the server side, which in turn should be taught to handle data encoded in different ways.
- A file sent twice will be written on the server twice, appended to itself. To avoid this, apparently, one should pass, in addition to the name, some unique identifier of the upload (see the sketch after this list). But that, generally speaking, is a question of how the file name is generated and transmitted. There is no universal recipe, and there cannot be one.
- The client script runs for a long time (depending on the file size and the thickness of the channel), and FF may even ask whether you are sure you want to wait for it to finish, or whether to kill it so it does not suffer.
- Cross-browser compatibility. Getting the file contents, alas, only works in FF. Tested on 3.0, 3.5 and 3.6; earlier versions were not checked for lack of any at hand. The FF developers themselves recommend using the FileAPI instead of this method, but it appeared only in 3.6.
- Really large files (hundreds of megabytes, gigabytes) cannot be uploaded this way. The limit depends on the amount of memory available to the browser.
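For the second point, a minimal sketch of what a client-generated upload identifier could look like; the upload_id name and the id scheme are illustrative, and the server would have to key its temporary file on it:

var upload_id = new Date().getTime() + '-' + Math.floor(Math.random() * 1000000); // one-off id for this upload
$.post('/upload.php', {
  upload_id: upload_id, // lets the server tell a fresh upload from a re-send of the same name
  filename: filename,
  chunk: data.substring(pos, pos + upload_chunk_size)
});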
What to do?
Probably it is worth trying asynchronous mode after all, passing along with each piece the information about its position within the file. The server part becomes seriously more complicated in that case.
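A sketch of what the client side of that could look like, assuming the server is taught to honor a pos parameter. Since upload_chunk_size is a multiple of 4, each Base64 chunk decodes on its own, so the server would write the decoded bytes at offset pos / 4 * 3 instead of blindly appending:

for (var pos = 0; pos < data.length; pos += upload_chunk_size) {
  $.post('/upload.php', {
    filename: filename,
    pos: pos, // position of this chunk in the Base64 stream; arrival order no longer matters
    chunk: data.substring(pos, pos + upload_chunk_size)
  });
}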