I'll skip the long, rambling introduction and briefly present today's guest: a coder who goes by undefined.
Widely known in narrow circles, undefined has written a great deal of underground software, both malicious and perfectly harmless, easing the hard work of black-hat SEOs, invite sellers, and people who handle the promotion of quite real celebrities (meaning automated content delivery to their pages on social networks).
So, let's ask undefined a few questions.
- Tell us about the current state of site protections against automated processing tools. How useful are they in practice?
- Well, first of all I'd like to say a few words about the familiar captcha.
For anyone who understands the subject, it has long been no secret that a captcha protects against practically nothing.
It can only delay a program, and by no more than six seconds - the maximum time it takes the staff of specialized solving services to type in the text from the image. At the industrial scale on which software runs today, which is no rarity, that delay is almost imperceptible.
A captcha is the simplest and most useless way to protect a site from so-called bots.
Most often, programs solve captchas in one of three ways: sending them to a solving service, manual input, or an external recognition program.
Don't count on its variations either. Mathematical captchas (of the 2 + 2 = ? sort) are pure kindergarten, and reCAPTCHA's only merit is that along the way it helps digitize texts from certain books. reCAPTCHA is parsed and unraveled exactly like an ordinary captcha; it only causes trouble for script kiddies, or for a coder meeting it for the first time. After that, a standard solving routine is used, the same one reused in every other project.
Speaking of useless defenses, I recall such a rare method as setting a secret cookie via an innocuous loaded image. It is detected the first time you work through the request flow, since the first thing a coder does is compare the cookies his script has received with the cookies a real browser sends to the site. A simple browser sniffer is enough for that.
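The comparison itself is trivial. A toy sketch (the cookie names are invented for illustration): the cookies a script has accumulated are diffed against the cookies a real browser sent, as recorded in a sniffer log.

```python
# Toy data: invented cookie names; the "browser" set is taken from a sniffer log.
script_cookies = {"session": "abc123"}
browser_cookies = {"session": "abc123", "img_token": "x9f"}

# Any cookie the browser had but the script never received points at a
# "secret" cookie set by a sub-resource such as an innocent-looking image.
missing = set(browser_cookies) - set(script_cookies)
print("cookies the script never received:", missing)  # {'img_token'}
```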
- Good. Have you run into overly convoluted defenses? Perhaps ones so insane that, besides stopping spam, they also reliably protected the site from its visitors?
- The main protection that reliably guards a site from anyone who visits it is its own clunkiness and sluggishness.
Recall the drop in Odnoklassniki's traffic, if I'm not mistaken, between October 2008 and May 2009. During that same period, coders writing software for that social network had no easy time debugging and fixing their creations.
Dynamically changing subservers in the URL, an idiotic AJAX interface that was hard to navigate even after a month of use - all of this reliably protected the site from mass automated processing.
Dating.ru's 20-second delay between sent messages also reliably protected it from spam.
But it's doubtful that many sites will follow that example: longer delays slow down interaction between users, and that cuts off the oxygen to the very heart of any social network.
Twitter deserves particular attention.
The old interface hangs at times, but the new one is beyond comparison. Maybe I live in some anomalous zone where the Internet congeals below 20 degrees, but it is damned hard to write software when the site itself regularly keels over! A dozen minutes can go into developing a login function that amounts to a single request. That, too, is "good" protection - make a note of it.
- Well, which protections actually work?
- A 30-minute ban for two quick logins on that same Odnoklassniki, and an IP ban for a couple of hours for checking several banned accounts in a row. Until you gain experience, these genuinely cause problems during development. They probably do a decent job of weeding out a certain layer of clueless script kiddies who read two chapters of a PHP textbook and immediately rushed off to cash in.
- Tell me, undefined, what else gave you difficulty when developing bots?
- With non-standard solutions.
One example is the clean URLs used in Drupal, where links to pages look like /news-about-something.html. That is certainly harder to handle than, say, node?id=28.
Uploading clips to YouTube is possible only through the Flash interface, and a browser sniffer doesn't see its requests, so you have to dig into Wireshark. And Wireshark doesn't see Atheros laptop cards. Hence the lost time. Though it's interesting in its own way.
Dynamic user IDs on Odnoklassniki also deserve special mention.
They rule out harvesting ID databases and make it hard to find a particular person - everything has to be done through search within a single session, since once the session ends, all the collected IDs become invalid. On the other hand, since you have to write more code to get around these restrictions, the software itself costs more. And Odnoklassniki is a tasty morsel in general, given the adequacy of its users, who stand out against spam cesspools like VK.
As for Google: if your software doesn't handle automatic redirects properly, you'll hit grief with it and its services as early as login. The login consists of six redirects, some of which are performed via JS and have to be parsed separately. A couple of stray cookies accidentally picked up from the wrong subdomain, and Google won't let you go any further. So you have to be very careful and study the sniffer logs closely.
- What can you say about the mistakes developers make on their own sites? Are there oversights that make life easier for the ordinary crimeware coder?
- Of course there are; there always are.
I'll describe the essence of these oversights. Perhaps someone will later discover one in his own project, sniff out the activity connected with it, and close the hole.
But I won't point out specific existing ones.
First, there is the banal weakness of server-side data validation.
The interface lets a user add three photos at a time; the script sends hundreds of photo[] fields, and they all upload successfully. The interface stops the user from changing any data before the account is confirmed; the script manages it without any activation at all.
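The flip side of this oversight is the fix. A minimal defensive sketch (the field name reuses the photo[] field above; the limit and function name are my assumptions): the server re-checks the count that the interface merely suggests.

```python
MAX_PHOTOS = 3  # assumed limit, mirroring the three-photo interface above

def validate_photo_upload(form_fields: dict) -> bool:
    """Reject requests that exceed the limit, regardless of what the
    client-side form allowed: never trust the interface alone."""
    photos = form_fields.get("photo[]", [])
    return len(photos) <= MAX_PHOTOS

print(validate_photo_upload({"photo[]": ["a.jpg", "b.jpg"]}))  # True
print(validate_photo_upload({"photo[]": ["x.jpg"] * 100}))     # False
```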
Secondly, some sites support so-called "authorization links".
Follow the link, and you receive all the cookies of a logged-in user and can work with the site.
Accordingly, all the filters hung on the login function are simply bypassed.
Thirdly, even with captchas you sometimes encounter inexcusable blunders.
A page - say, a login page - contains a captcha image and a hidden field holding a certain hash.
You solve the captcha by hand once, and the script then sends that same hash plus that same captcha answer every time. Voilà - registration without ever deciphering the text from the image: that captcha is bound tightly to that hash.
Each case is unique, of course, but this comes up periodically.
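The fix for that blunder, for comparison, is to make each captcha token single-use and bound to its answer on the server. A minimal defensive sketch (the function names and in-memory store are my assumptions, not anything from the interview):

```python
import secrets

_pending = {}  # token -> expected answer (in-memory store for the sketch)

def issue_captcha(answer: str) -> str:
    """Hand out a fresh random token tied to this captcha's answer."""
    token = secrets.token_hex(16)
    _pending[token] = answer
    return token

def check_captcha(token: str, answer: str) -> bool:
    """pop() consumes the token whether or not the answer is right,
    so the same token/answer pair cannot be replayed."""
    return _pending.pop(token, None) == answer

t = issue_captcha("xf7q2")
print(check_captcha(t, "xf7q2"))  # True
print(check_captcha(t, "xf7q2"))  # False: token already consumed
```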
- Wonderful. Have you come across any interesting defenses in your practice? Defenses that force you to think hard before doing anything?
- Sure. Any original, elaborate defense that takes at least half a day is interesting to me. Bypassing such defenses is perhaps one of the most engaging facets of my work.
- Give examples.
- The first example offhand is kvip.ru.
Take a look at it and you'll see three hidden fields with random names.
What tricks the developers didn't pull with them to confuse me!
They even loaded a decoy script that shuffled those fields around, and then wiped the spot where the script had been included - failing to account, however, for the fact that it still remains in the page's source code.
It was genuinely pleasant to tinker with, everything else set aside. The kind of puzzle that makes you forget about food and bodily needs.
The random fields at mail.ru also deserve respect.
Oh, how the coders there must have complicated their own lives with all that verification and validation.
And I simply selected them all by a mask and filled them with the required data in order of priority.
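Selecting fields by mask really is a one-liner. A sketch with invented field names, assuming the randomly named fields share a recognizable pattern:

```python
import re

# Invented form fields: three randomly named ones plus a normal one.
fields = ["fld_a9x", "email", "fld_q2m", "fld_z71"]
values = ["first", "second", "third"]  # data to submit, in priority order

# Pick every field matching the mask and pair it with the next value in order.
masked = [name for name in fields if re.fullmatch(r"fld_\w+", name)]
form_data = dict(zip(masked, values))
print(form_data)  # {'fld_a9x': 'first', 'fld_q2m': 'second', 'fld_z71': 'third'}
```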
Mail.ru's triple captcha does not take three times as long to solve, either.
Twelve lines of code glue the three pictures into one, then boost the contrast for easier recognition, and send the result to the solving service.
It was my first time gluing images together, so I was pleased at how simple and elegant it came out.
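Pasting images side by side really is only a few lines in any imaging library. A sketch of the idea with Pillow (the grayscale conversion and contrast factor are my assumptions, not the interview's exact code):

```python
from PIL import Image, ImageEnhance

def glue_horizontally(images, contrast=2.0):
    """Paste images left to right on one canvas and boost the contrast."""
    width = sum(im.width for im in images)
    height = max(im.height for im in images)
    canvas = Image.new("L", (width, height), color=255)  # white grayscale strip
    x = 0
    for im in images:
        canvas.paste(im.convert("L"), (x, 0))
        x += im.width
    return ImageEnhance.Contrast(canvas).enhance(contrast)

# Three 60x20 tiles become one 180x20 strip.
tiles = [Image.new("RGB", (60, 20), "gray") for _ in range(3)]
strip = glue_horizontally(tiles)
print(strip.size)  # (180, 20)
```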
Coding shadow software is, to me, first and foremost an art.
It's not stamping out gray business-card sites on some dreary framework, not wearing out your trousers over yet another validation of the two-thousand-third pair of Name and Surname fields from some dull visitor...
- Please continue.
- Yes. About interesting defenses.
On one site I ran into RSA hashing of the password in JavaScript before it was sent to the server.
Unfortunately, it wasn't my own order - they only came to me for help - and I didn't have enough time to deal with it, and the site itself has since been lost somewhere. But I remember it couldn't be cracked in one sitting. If memory serves, tricky operations were used there, down to bitwise shifts, and for someone unfamiliar with such algorithms it is no simple thing to reproduce them in another language.
A registration bot for yahoo.com also never yielded to me.
Maybe I was still inexperienced then, or maybe the level of protection really is that high.
On focus in each form field, JavaScript computes a certain checksum, which is then sent along with the data at registration. Unfortunately, I failed to work out how it is calculated.
What else is there to say. Perhaps someone will be interested to know that Mamba has an automatic filter that checks uploaded photos for uniqueness. If a photo isn't unique, the account is banned on the spot.
A universal function in my toolkit slightly changes a photo's size (plus or minus a few pixels), quality, contrast, file name, and similar trifles. Accordingly, the protection comes to nothing.
- Thanks for the information, undefined.
And to close our humble interview, let me ask one last question: have you had occasion to observe any reaction from the administrators and coders of the resources your software was written for? After all, you keep it working for quite a long time and stay on top of changes to the sites.
- The coders resist, but rather sluggishly.
Because nobody pays them for it. I've worked in an office myself; they simply have no such line item in their running expenses. Well, who am I telling this to.
Of the worthy resources, I want to mention promomodija.ru. I like their site, the interface, their approach to the task.
After I wrote my first registration bot for that site, I registered about 400 accounts in a fairly short time. The very next day a limit was imposed on the number of registrations from one IP. The restriction is still in place.
Sometimes sites defended themselves quite laughably.
A form field used to be called message; now it's called message12. That broke the software - well done.
But this happens very rarely.
Mostly, software breaks because the site is being modernized.
The site improves, and the software improves along with it. We seem to push each other forward.
Each side has its own limitations, its own advantages.
I have no access to their code and protections; I can act only by proven methods.
They, in turn, cannot make sudden moves, since any restriction hits the site's own users first of all.
Take Mamba's Real status and Loveplet's Live status.
Their effect on me as a coder is practically no longer felt.
Customers stock up on "real" accounts as easily as they used to buy unconfirmed ones.
But from the user's side, it's a completely different picture.
I tried being a "non-real" user. Only every tenth profile is accessible, and there is almost nothing you can fill in on your own profile. The site has effectively become unusable unless you have a spare mobile phone, never used for activation before, on which to receive the coveted SMS code.
And of the good improvements, I'm happiest for our Odnoklassniki.
In recent months they removed the subserver from the URL, simplified the interface, and dropped the requirement to enter the first two letters of a name when searching, which has greatly eased collecting users from the site.
All their improvements told on software development too - writing became much nicer and easier.
Keep it up, guys!
- dicks and undefined / 12.2010