Security researchers gained remote control of the Autopilot system on a Tesla Model S and steered the car with a game controller, drawing attention to potential security problems in modern Advanced Driver Assistance Systems (ADAS), whose very purpose is to make driving safer. Researchers from Tencent Keen Security Lab activated the car's Tesla Autopilot system and took control of it, as reported in a new publication that describes the study in detail.
The group, which had previously presented its research at the Black Hat USA 2018 security conference, published a video demonstrating the hack. The new report describes three ways to gain control over the car's Autopilot by exploiting several vulnerabilities in the electronic control unit (ECU).
The researchers highlight three main achievements in their compromise of the Autopilot system, ECU software version 18.6.1. First, by exploiting a flaw in the image recognition system of the automatic wipers, they activated the windshield wipers. Second, by placing interfering stickers on the road that fooled the lane recognition system, they forced the Tesla to steer into oncoming traffic. Third, they were able to take remote control of the steering even when Autopilot had not been engaged by the driver.
“With this we proved that by slightly altering the physical environment, we can control the car to a certain extent without any remote connection to it,” the researchers conclude in the report. “We hope that the potential flaws revealed by this series of tests will draw the attention of manufacturers and improve the stability and reliability of their self-driving cars.”
Risks of progress
The researchers say they notified Tesla after successfully compromising the Autopilot system and, according to Tencent, Tesla “immediately fixed” a number of the bugs.
Researchers at Tencent Keen Security Lab were able to compromise Autopilot, the advanced driver assistance system of the Tesla Model S. (Source: Tesla)

Regardless, the study demonstrates the persistent danger that hackers may exploit the connectivity and intelligence of modern cars as an attack surface, a possibility first vividly demonstrated in the 2015 hack of a Jeep Cherokee reported in Wired.
"The average modern car contains hundreds of sensors and a lot of on-board computers, each of which is potentially vulnerable to physical, software and / or logical attack," said Jerry Gamblin, lead engineer for security intelligence, Kenna Security, in an interview with Security Ledger . "This fact creates an amazing ground for attacks, which car manufacturers must prevent, and also creates a vast target field for potential intruders."
Since the Jeep hack, cars have only become more complex, and ADAS technologies like Tesla Autopilot are developing rapidly across the automotive industry.
These systems are meant to augment the driver's abilities and give the car intelligent safety features, such as collision avoidance, in order to increase safety. At the same time, their growing complexity makes such systems potentially destructive when compromised, which casts doubt on the safety of relying on ADAS technologies.
Privileges equal control
Researchers at Keen Security Lab stated that they used root privileges (obtained remotely by exploiting several vulnerabilities; translator's note) to carry out the most alarming part of the hack: seizing Tesla's steering control in a “contactless” way, as they write. The researchers used these privileges to send control commands to the Autopilot while the car was in motion.
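The report does not publish the injection code, and Tesla's internal control channel is proprietary, so the following is only a minimal sketch of the idea: reading a gamepad axis and translating it into steering commands. The pygame joystick calls are real; send_steering_command and MAX_ANGLE are hypothetical stand-ins for the undisclosed command interface.

```python
# Minimal sketch: map a gamepad axis to steering commands.
# NOTE: send_steering_command is a hypothetical placeholder; the real attack
# injected commands through Tesla's internal channel, whose format is not public.
import time
import pygame

def send_steering_command(angle_deg: float) -> None:
    # Placeholder for the undisclosed control interface.
    print(f"steer -> {angle_deg:+.1f} deg")

pygame.init()
pygame.joystick.init()
stick = pygame.joystick.Joystick(0)   # first connected gamepad
stick.init()

MAX_ANGLE = 30.0  # assumed steering range, illustration only
while True:
    pygame.event.pump()                                    # refresh joystick state
    send_steering_command(stick.get_axis(0) * MAX_ANGLE)   # X axis of the left stick
    time.sleep(0.05)                                       # ~20 commands per second
```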
The ability to influence the windshield wipers and the lane keeping system was achieved with an improved optimization algorithm used to generate so-called “adversarial examples,” which were fed as input to the corresponding vehicle systems.
As the researchers discovered, both the automatic wipers and the lane recognition system base their decisions on camera data. That made it relatively easy to deceive them into “seeing” conditions that did not actually exist.
They achieved this by feeding crafted images to the wipers' neural network and, in the case of the lane recognition system, by modifying the road markings. In both experiments, the system responded to what it “saw” rather than to the actual road conditions.
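Keen Lab's own optimization algorithm has not been released, but the general adversarial-example idea can be sketched with the classic fast gradient sign method (FGSM). Everything below is illustrative: the toy model, tensor shapes, and the "rain classifier" framing are assumptions, not the systems Tesla actually uses.

```python
# Illustrative FGSM sketch (PyTorch), not Keen Lab's actual algorithm.
import torch

def fgsm_perturb(model, image, label, epsilon=0.03):
    """Nudge `image` along the gradient that increases the loss, so the model
    misclassifies an input that looks essentially unchanged to a human."""
    image = image.clone().detach().requires_grad_(True)
    loss = torch.nn.functional.cross_entropy(model(image), label)
    loss.backward()
    adversarial = image + epsilon * image.grad.sign()  # small adversarial step
    return adversarial.clamp(0, 1).detach()

# Toy usage: a tiny stand-in for a camera-based classifier such as a rain detector.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 2))
frame = torch.rand(1, 3, 32, 32)         # pretend camera frame
label = torch.tensor([0])                # true class: "no rain"
adv = fgsm_perturb(model, frame, label)  # near-identical frame, different verdict
```

In the physical attacks, perturbations of this general kind were placed in the scene itself (an image shown to the camera, stickers on the asphalt) rather than injected digitally.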
Competing systems are under close scrutiny as well. As more and more systems rely on machine learning, more researchers are looking for ways to influence their behavior by feeding falsified data to their inputs.
Tesla's response
In its blog, Tencent Keen published Tesla's response to the hack, which, surprisingly, was markedly defensive in tone. The company dismissed the wiper and lane recognition compromises on the grounds that such scenarios “do not happen in real life” and should therefore not be a cause for concern for drivers.
In its response, Tesla also stressed that drivers can turn off the automatic wiper system if they wish. In addition, they have the option of “switching to manual control using the steering wheel or brake pedal and must always be prepared to do so,” especially if they suspect the system is not working correctly.
As for the use of root privileges to take over the steering, Tesla reminded the researchers that the company had fixed the primary vulnerability described in the report with a security update in 2017 and a subsequent comprehensive system update last year (the vulnerability was fixed in software version 2018.24; translator's note). Moreover, according to Tesla's response, both of these updates were available before Tencent Keen Security Lab informed the company of its research.
“In the many years our cars have been on the road, we have never seen a single customer fall victim to any of the vulnerabilities presented in the report,” the company added.
The company's objections aside, security experts are still not convinced that ADAS systems like Tesla Autopilot would not cause chaos and damage if they fell under the control of attackers. “Manufacturers should take this into account when developing new systems,” said Jerry Gamblin.
“The bulk of the attention should go to securing systems that could cause serious harm to consumers and other passengers if compromised,” the expert advised. “Manufacturers must allocate their budgets wisely and respond properly to any issues arising from attacks on secondary systems that may affect the end user, who must never be put in danger.”