
Serving small graphics

Holy wars about where to store pictures - in the database or in the file system - are not a rare thing even on Habr. There is no single right answer here and there cannot be one, but if you look at the situation from the standpoint of optimizing content delivery, then a reasonable compromise becomes, in my opinion, a little more obvious.



It is clear that there is no point in storing large images in the database. There are many reasons for this, all of them long known to everyone; we will leave FileStream aside for now. However, the task often arises of serving a large number of small images from one page: avatars on blogs and social networks, picture previews for news in feeds, and so on, i.e. relatively light graphics, but in commercial quantities. The trouble is that each such picture is fetched over a separate connection to the server, so loading a single page generates several dozen requests, each returning some tiny file, while the total size of all of them on the page may not even reach a hundred kilobytes. For today's channels that is nothing, but a pile of requests to the server while rendering the page is bad.



A rather interesting solution is to serve all this small fry in a single stream. That way we reduce the number of server requests by an order of magnitude, which also improves the site's resistance to DDoS attacks. As a nice bonus there is a chance to improve usability: all the pictures appear on the page at once (without the "loading" effect), at the same time and without "holes". The small pause before they appear does not really count; subjectively, such loading looks "faster".


It works like this: placeholders with IDs corresponding to the images in the database are placed in the HTML. JavaScript "runs through" the placeholders and forms an XMLHTTP request string; in response, the server returns an array of base64-encoded strings, which JavaScript then assigns to the src of the corresponding placeholders.
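
The client-side code is not quoted in the post itself (it can be seen in the demo's page source), but a minimal sketch of the idea might look like this; the placeholder markup (img tags with a data-id attribute), the servlet name img.php, the JSON field names id and img, and the image MIME type are my assumptions for illustration, not the original code:

// Sketch only: collect placeholder IDs, request all images in one go,
// then hand out the base64 data via data: URIs.
var placeholders = document.querySelectorAll('img[data-id]');
var ids = [];
for (var i = 0; i < placeholders.length; i++) {
    ids.push(placeholders[i].getAttribute('data-id'));
}

var xhr = new XMLHttpRequest();
xhr.open('GET', 'img.php?q=' + encodeURIComponent(ids.join(',')), true);
xhr.onload = function () {
    // the servlet is expected to return an array of rows,
    // each with an id and a base64-encoded image string
    var rows = JSON.parse(xhr.responseText);
    for (var j = 0; j < rows.length; j++) {
        var img = document.querySelector('img[data-id="' + rows[j].id + '"]');
        if (img) {
            img.src = 'data:image/png;base64,' + rows[j].img; // MIME type assumed
        }
    }
};
xhr.send();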



As a result, instead of fetching a couple of dozen individual images weighing 2-5 kB each, we get a single 50-100 kB JSON response in one asynchronous request. That is, a) it arrives only AFTER the main text content has been loaded and rendered, and b) much faster, thanks to the way text data is streamed.



Working example (the avatars were grabbed from around the web).



The code can be viewed in the page source. The PHP servlet (it works with SQLite) looks like this:



<?
ob_start();

// open the SQLite database containing the img table
$dbh = sqlite_open('data/img', 0666, $sqliteerror);

// 'q' is a comma-separated list of ROWIDs; keep only digits and commas
// so the value cannot break out of the IN (...) clause
$q = preg_replace('/[^0-9,]/', '', trim($_GET['q']));

$sth = sqlite_query($dbh, "SELECT *, ROWID AS id FROM img WHERE ROWID IN (".$q.") LIMIT 20");
$res = sqlite_fetch_all($sth, SQLITE_ASSOC);

// return the selected rows as a single JSON array
echo json_encode($res);

sqlite_close($dbh);
ob_end_flush();
?>
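
Note that the servlet simply passes the selected rows through json_encode, so the image column in the img table presumably already stores base64 text rather than raw binary (raw bytes would not survive JSON encoding intact), and the q parameter is expected to be the comma-separated list of ROWIDs assembled by the script on the page.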




So, what do you say - is this a fairly silly idea? Or have I just reinvented the wheel?

Source: https://habr.com/ru/post/157661/


