# Video | Hackers hacked Tesla's Autopilot with non-standard methods

In 2016, a group of security researchers from Keen Security Lab successfully hacked the systems of the Tesla Model S, after which the company significantly improved their reliability. In 2019, the researchers set out to find weaknesses in the Autopilot again, this time not through software exploits but with visual traps. They did indeed succeed, but Tesla did not appreciate their efforts and published several statements assuring that its cars remain secure.

According to the researchers, even the Advanced Driver Assistance System, which enables autonomous driving, can be deceived with visual traps. In the first experiment, the hackers demonstrated that the car's windshield wipers can be triggered externally by displaying a specially crafted image on a large TV screen.

The discovery did not make much of an impression on Tesla's representatives. They noted that the TV had to be placed in close proximity to the car's windshield, and hardly anyone would carry a huge screen around just to switch on the wipers. Moreover, users can disable automatic wiper activation and turn them on strictly manually. For these reasons, the vulnerability was not considered dangerous.


The second attempt to deceive the Autopilot was carried out with stickers placed on the road surface to confuse the system. By adding an extra dot to the lane markings, the hackers actually changed the car's direction of travel: it moved into the oncoming lane.

Tesla had an answer to this discovery as well. According to its representatives, Tesla owners must be ready to take the wheel at any time, even while Autopilot is engaged. So even if hackers manage to deceive the car with fake markings, the driver can steer it back on course or slow down.

Do you agree with Tesla's position, or are the discovered vulnerabilities dangerous? Share your opinion in the comments, and you can discuss the reliability of automated driving systems in our Telegram chat.