I think everyone knows that in Google search you can preview a site by opening its thumbnail right in the results.

Today I will share one of the methods for obtaining this image.
Not long ago I was writing a small tool for myself (I plan to share it soon), and I needed to get a preview of an arbitrary site.
There were several options:
- write my own thumbnail generator; as I understand it, the script would have to open the page, take a screenshot of it and save the image, but the whole page has to be rendered with its JavaScript executed, which, as far as I could tell, means installing additional modules on the server and so on, so this option did not suit me.
- then I looked at ready-made solutions, various API services that generate site previews, but putting myself at the mercy of an unknown third-party resource did not appeal to me, so I decided to postpone, and later abandon, this approach.
- then WordPress came to the rescue, or rather its mShots service (you can read about it, for example, here ). However, the service always returns an image, which makes it impossible to check whether a real screenshot exists (I can share my way of checking for its absence; a sketch follows below). It also takes some time to generate a preview, and the service does not signal when it is done: until then it simply returns a GIF preloader.
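Here is a minimal sketch of the check I mean, assuming the usual mShots endpoint format (s.wordpress.com/mshots/v1/...) and assuming the preloader is served as a GIF while a finished screenshot comes back as a JPEG; both of those details are my assumptions, not documented guarantees.

<?php
// Hypothetical check against the mShots service (endpoint format and the
// GIF-vs-JPEG heuristic are assumptions on my part).
$target   = 'http://example.com';
$thumbUrl = 'https://s.wordpress.com/mshots/v1/' . urlencode($target) . '?w=250';

// Fetch only the headers of the thumbnail response.
$headers = get_headers($thumbUrl, 1);

// While the screenshot is still being generated, the service answers with
// its GIF preloader; a finished screenshot should be served as a JPEG.
$type = isset($headers['Content-Type']) ? $headers['Content-Type'] : '';
if (is_array($type)) {
    $type = end($type); // if there were redirects, take the final Content-Type
}

if (stripos($type, 'image/jpeg') !== false) {
    echo 'Screenshot is ready: ' . $thumbUrl;
} else {
    echo 'Still the preloader (or no screenshot yet), try again later.';
}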
I settled on this last method for the time being and kept looking for something more reliable. My search happened in Google Search itself, and it was hard not to notice that Google shows a screenshot of each site (as it turned out, taken from its cache), and then the thought came: why not use that?
So began the search for a way to get these images from Google.
The first requests I inspected looked like this:

As you can see, nothing pleasant: a pile of obscure parameters.
I will not describe every parameter, since I do not know them all myself. I can only say that “j” and “b” can safely be removed, and “a” is our unique three-digit hash which, as it turned out, is embedded in the HTML markup of the results page, from where it can be extracted by parsing.
I decided to try this approach; the logic was as follows:
- perform a search; since our task is to find a specific site, we search for site:http://domain.zone
- parse the response and find our hash and URL
- build the required request and receive the thumbnail
Everything seemed logical, so I got to work; a rough sketch of the idea follows below.
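For completeness, this is roughly how I pictured that plan. I never finished this version, so the regular expression and the webpagethumbnail parameter set below are guesses rather than verified values.

<?php
// Rough sketch of the "search and parse" plan. The regex and the thumbnail
// request parameters are hypothetical and would need checking against the
// real markup.
$domain = 'http://domain.zone';

// 1. Run the search for our site.
$html = file_get_contents('https://www.google.com/search?q=' . urlencode('site:' . $domain));

// 2. Parse the markup and pull out the three-digit hash (the "a" parameter).
if ($html !== false && preg_match('/[?&]a=([^&"\']+)/', $html, $m)) {
    $hash = $m[1];

    // 3. Build the thumbnail request (parameter set is an assumption).
    $thumb = 'https://clients1.google.com/webpagethumbnail?a=' . urlencode($hash)
           . '&url=' . urlencode($domain);
    echo $thumb;
} else {
    echo 'Hash not found: the markup differs from what this sketch expects.';
}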
As fate and a lack of time would have it, I put this work aside and forgot about it until tonight,
when I was reminded of it by goo.gl , or rather by its new design and new features, including a preview of the page we want a short URL for. Yes, for those who do not know, goo.gl is a service for getting short URLs.
I decided to look for information on the request “clients1.google.com/webpagethumbnail” (the address Google Search queries for site screenshots) and came across
this article, which already describes the method I had originally wanted to implement.
A little disappointed, yet glad that the groundwork I needed was already there, I returned with interest to goo.gl, looked at the requests it sends, and here is what I saw:

There are no scary parameters here: just the API URL of the shortener and two parameters:
- security_token: as I understood it, an optional parameter (correct me if I'm wrong)
- url: the address for which we want information
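To illustrate, this is roughly how the observed request can be assembled; whether it has to be sent as GET or POST I cannot say for certain, so treat the exact form as an assumption.

<?php
// Illustration only: assembling the observed request. security_token seems
// to be optional, so the simplest form uses just the "url" parameter.
$target  = 'http://example.com';
$request = 'http://goo.gl/api/shorten?url=' . urlencode($target);
// -> http://goo.gl/api/shorten?url=http%3A%2F%2Fexample.com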
Everything is clear and understandable, nothing stands in the way, so let's try to get what we want. I wrote a small test script to experiment with, and ended up with this:
header('Content-Type: text/html; charset=utf-8');
I commented all the code so it is clear what I am doing. The file can be downloaded
here .
Briefly, the script works like this:
- first, we make a request to the API endpoint " goo.gl/api/shorten "; in response we get a JSON string with the fields "short_url", "long_url", "creation_time" and "preview_url"
- then I check that the image actually exists (because, as far as I understand, a path to an image is always returned) by checking the response headers for status 200
- if everything is fine, we output the picture; if not, we report that it is missing (a minimal sketch of this flow follows below)
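Since the code itself is not reproduced here, below is a minimal sketch of that flow. The endpoint, the "url" parameter and the JSON field names are taken from what is described above; the rest (a plain GET request, preview_url pointing straight at the image, the error handling) is my assumption.

<?php
header('Content-Type: text/html; charset=utf-8');

// Minimal sketch of the flow described above; details beyond the observed
// endpoint and JSON fields are assumptions.
$target = isset($_GET['url']) ? $_GET['url'] : 'http://example.com';

// 1. Ask the shortener API; it answers with a JSON string.
$json = file_get_contents('http://goo.gl/api/shorten?url=' . urlencode($target));
$data = json_decode($json, true);

if (!is_array($data) || empty($data['preview_url'])) {
    die('No preview_url in the API response.');
}

// 2. A preview_url is always returned, so verify the image really exists
//    by checking the response headers for status 200.
$headers = get_headers($data['preview_url']);
$exists  = $headers && strpos($headers[0], '200') !== false;

// 3. If everything is fine, show the picture; otherwise say it is missing.
if ($exists) {
    echo '<img src="' . htmlspecialchars($data['preview_url']) . '" alt="preview">';
    echo '<p>Short URL: ' . htmlspecialchars($data['short_url']) . '</p>';
} else {
    echo 'Google has no preview for this page yet.';
}

Called as something like preview.php?url=http://example.com (the file and parameter names are just for illustration), it should either print the image tag or the "no preview" message.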
Why I like this approach more than the others:
- I depend only on Google, which is unlikely to simply shut down or stop working the way a random third-party resource might; on the other hand, Google could close off this way of getting previews (but in that case I still have mShots)
- this approach is faster and needs less code than querying the search engine and parsing pages
- and on top of that I also get a short URL, which is a nice bonus for my task.
Unfortunately, there is no demo page: I am afraid my hosting would not survive it. If someone can suggest where to host one, I will gladly take the advice.
That's all. I hope this information will be useful to many; I suspect you will quickly find plenty of uses for it.
P.S. I welcome any corrections and additions, and I am especially interested in how to improve the existing code. I apologize in advance for possible inaccuracies, and especially for spelling: the late hour is taking its toll.