
Protection against DDoS attacks at the web application level

DDoS attacks on a website vary in intensity: the number of hosts participating in the attack, the number of network packets, and the volume of transmitted data all matter. In the most severe cases, an attack can only be repelled with specialized equipment and services.


If the attack volume is below the bandwidth of the network equipment and the computing capacity of the server (or server pool) serving the site, you can try to neutralize the attack without resorting to third-party services, namely by enabling a software filter on traffic coming to the site. This filter weeds out traffic from the bots participating in the attack while letting through legitimate traffic from "live" visitors.


How the software filter against DDoS attacks works


The filter relies on the fact that bots participating in DDoS attacks cannot execute JavaScript code; accordingly, bots will not get past the filter's stop page, which significantly relieves the site's frontend, backend, and database. Processing each GET/POST request during a DDoS attack requires running no more than 20 lines of code in the site's backend and serving a stub page of less than 2 KB.
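For illustration, a stop page of the kind described might look like this. This is a hedged sketch: the cookie name, markup, and wording are assumptions, not the project's actual page.

```javascript
// Hypothetical generator for the filter's stop page (illustrative, not the
// project's actual code). The page sets a cookie via JavaScript and then
// redirects back to the requested URL; a bot that cannot run JavaScript
// stops here and never reaches the application.
function renderStubPage(requestedUrl, cookieName) {
  return `<!doctype html>
<html><head><meta charset="utf-8"><title>Checking your browser</title></head>
<body>
  <p>To protect the site from a DDoS attack, we need to verify that your
     browser executes JavaScript. You will be redirected automatically.</p>
  <script>
    // A real browser runs this; a simple attack bot does not.
    document.cookie = '${cookieName}=1; path=/; max-age=86400';
    location.href = ${JSON.stringify(requestedUrl)};
  </script>
</body></html>`;
}

// The whole page fits well under the 2 KB mentioned above:
const page = renderStubPage('/blacklists', 'ddos_filter_passed');
console.log(page.length < 2048); // prints true
```

Serving this tiny static response instead of rendering the real page is what cuts both the backend work and the traffic returned to the bots.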



  1. We call the filter on the first line of the web application, before the rest of the application code runs. This offloads the server hardware as much as possible and minimizes the amount of traffic sent back to the bots.
  2. If the visitor matches the filter conditions, we serve a special stub page. On this page we:
    • Explain why the special page was served instead of the requested one;
    • Set a special cookie in the visitor's browser using JavaScript;
    • Execute a JavaScript redirect to the originally requested page.
  3. If the visitor has the special cookie, the filter transparently passes them through to the requested page of the site.
  4. If the visitor's IP address belongs to an autonomous system on the exception list, the traffic is also passed through transparently. This condition is necessary to avoid filtering search engine bots.
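The steps above can be sketched as a front-line decision function. This is an illustrative sketch, not the project's actual code: the cookie name, the ASN whitelist, and the `lookupAsn` helper are all assumptions.

```javascript
// Hypothetical front-line filter following the four steps above (illustrative).
const COOKIE_NAME = 'ddos_filter_passed';    // assumed cookie name
const ALLOWED_ASN = new Set([15169, 8075]);  // e.g. search-engine ASNs (example data)

// Step 4 needs an IP -> autonomous-system lookup (e.g. a local ASN database);
// stubbed out here for the sketch.
function lookupAsn(ip) { return null; }

function hasFilterCookie(cookieHeader) {
  return (cookieHeader || '').split(/;\s*/).includes(COOKIE_NAME + '=1');
}

// Returns 'pass' when the request should reach the real application,
// 'stub' when the visitor should be shown the JavaScript stop page.
function filterDecision(cookieHeader, clientIp) {
  if (hasFilterCookie(cookieHeader)) return 'pass';        // step 3
  if (ALLOWED_ASN.has(lookupAsn(clientIp))) return 'pass'; // step 4
  return 'stub';                                           // step 2
}

console.log(filterDecision('', '203.0.113.5'));                 // stub
console.log(filterDecision(COOKIE_NAME + '=1', '203.0.113.5')); // pass
```

In a real deployment this function would run before any other application code (step 1), with the 'stub' branch returning the stop page and ending the request.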

The filter project is on github.com.


Synthetic Filter Tests


We ran tests with the ab utility from the Apache Foundation against the main page of the production site, after removing the load from one of the nodes.


Results with the filter disabled:


ab -c 100 -n 1000 https://cleantalk.org/

Total transferred:       27615000 bytes
HTML transferred:        27148000 bytes
Requests per second:     40.75 [#/sec] (mean)
Time per request:        2454.211 [ms] (mean)
Time per request:        24.542 [ms] (mean, across all concurrent requests)
Transfer rate:           1098.84 [Kbytes/sec] received

And the same test with the filter enabled:


Total transferred:       2921000 bytes
HTML transferred:        2783000 bytes
Requests per second:     294.70 [#/sec] (mean)
Time per request:        339.332 [ms] (mean)
Time per request:        3.393 [ms] (mean, across all concurrent requests)
Transfer rate:           840.63 [Kbytes/sec] received

As the results show, enabling the filter lets the web server handle almost an order of magnitude more requests than without it. Naturally, this applies only to requests from visitors without JavaScript support.


The filter in practice: the story of saving the site from one small DDoS attack


From time to time we encounter DDoS attacks on our own corporate website https://cleantalk.org . During the most recent of these attacks, we applied the web-application-level DDoS filter described above.


Attack start


The attack began at 18:10 UTC+5 on January 18, 2018, with GET requests to the URL https://cleantalk.org/blacklists . An additional 1000-1200 Kbps of incoming traffic appeared on the network interfaces of the front-end servers, i.e. each server received an extra 150 GET requests per second, five times the regular load. As a result, the load average of the front-end and database servers rose sharply, and the site began returning 502 errors due to a lack of free php-fpm processes.


Attack analysis


After spending some time studying the logs, it became clear that this was a DDoS attack.





Accordingly, we decided to enable the visitor filter according to the algorithm described above, additionally checking incoming traffic against our blacklist database, thereby reducing the likelihood of showing the stop page to legitimate site visitors.
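This stricter variant can be sketched as follows. The blacklist lookup and all names here are illustrative assumptions; the real setup queries the CleanTalk blacklist database.

```javascript
// Hypothetical combination of the filter with a blacklist check (illustrative,
// not the project's actual code). Only IPs found in the blacklist are forced
// through the JavaScript stop page; everyone else passes, reducing the chance
// of showing the stub to legitimate visitors.
const BLACKLISTED_IPS = new Set(['198.51.100.23', '203.0.113.77']); // example data

function isBlacklisted(ip) {
  // In the real setup this would query the blacklist database service.
  return BLACKLISTED_IPS.has(ip);
}

function strictFilterDecision(hasCookie, clientIp) {
  if (hasCookie) return 'pass';  // visitor already proved JavaScript support
  return isBlacklisted(clientIp) ? 'stub' : 'pass';
}

console.log(strictFilterDecision(false, '198.51.100.23')); // stub
console.log(strictFilterDecision(false, '192.0.2.1'));     // pass
```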


Enabling the filter


After some time spent preparing the filter, it was turned on at 19:15-19:20.



After a few minutes the first positive results appeared: first the load average returned to normal, then the load on the network interfaces fell. A few hours later the attack was repeated twice, but its consequences were almost imperceptible; the front-ends worked without 502 errors.


Conclusion


As a result, using the simplest JavaScript code, we solved the problem of filtering bot traffic, thereby extinguishing the DDoS attack and returning the site's availability metrics to their regular state.


Honestly, this bot-filtering algorithm was not invented on the day of the attack described above. A few years ago we added the SpamFireWall function to our anti-spam service; SpamFireWall is used by more than 10 thousand websites, and there is a separate article about it.


SpamFireWall was designed primarily to combat spam bots, but since the lists of spam bots overlap with the lists of other bots used for dubious purposes, using SFW is quite effective, including for stopping small DDoS attacks on a site.


About CleanTalk Service


CleanTalk is a cloud service that protects websites from spam bots. CleanTalk uses protection methods that are invisible to website visitors, which makes it possible to abandon protection methods that require the user to prove they are human (CAPTCHA, question-and-answer, etc.).



Source: https://habr.com/ru/post/354374/

