Tesla cars stand out not only because they are equipped with environmentally friendly electric motors, but also because they support automated driving. By paying $6,000 when buying a car, owners get access to Autopilot. This means that while driving, a person can take their hands off the steering wheel and hand control over to the built-in computer, which assesses the surroundings using the car's sensors and makes all decisions on its own. At the moment, Tesla's Autopilot copes well with lane changes and other simple tasks, but it cannot yet be fully trusted with the driving process. Nevertheless, some drivers completely ignore this fact and, instead of keeping an eye on the road, occupy themselves with all sorts of nonsense. Recently, the irresponsibility of one Tesla owner led to a collision with a police car. When the circumstances were clarified, it turned out that the driver had not been watching the road at all.
Tesla autopilot error
According to Ars Technica, another accident involving a Tesla car took place in the US state of North Carolina. A police car was parked on the side of a road while the officers were busy investigating another traffic accident. A Tesla was traveling along the same road with Autopilot engaged, but the driver was not watching the road at all. As a result, the Tesla rammed the police car and nearly killed people; fortunately, no one was seriously injured. While the circumstances were being clarified, the Tesla owner admitted that he had entrusted the driving entirely to Autopilot and had been watching a movie on his smartphone. The man will, of course, be punished for his irresponsibility.
This accident is just another reminder that there are currently no fully autonomous vehicles. Drivers have to keep an eye on the road at all times, no matter how many thousands of dollars they paid for the autopilot system and other assistive technologies. Crashes caused by autopilot errors are not uncommon: a similar accident reportedly occurred in the state of Arizona in July 2020. We have already written about incidents from past years. For example, here is the material by Hi-News.ru author Ilya Khel about the consequences of one such accident for Tesla.
Tesla's Autopilot makes all kinds of mistakes, but the collision with the police car happened because the automated driving system failed to recognize a stationary vehicle at the side of the road. This problem is not unique to Tesla's Autopilot: an earlier experiment showed that BMW, Kia and Subaru cars also do not always stop in front of stationary vehicles. So the developers still have a lot of work to do before the system becomes fully autonomous, even though talk of full self-driving has been going on since 2018.
Tesla needs to work on more than just vehicle detection: driver monitoring needs improvement too. At the moment, Tesla cars check whether the driver is paying attention to the road only by sensing the position of their hands on the steering wheel. In other words, if a person keeps their hands on the wheel while watching a movie on a smartphone, the car will still assume the situation on the road is under full control. At a minimum, the developers should install in-cabin sensors that track the driver's eyes. Such a system has already been implemented in some Cadillac vehicles and may have saved several lives. After all, this technology not only prevents distraction by a smartphone, but also keeps the driver from falling asleep at the wheel.
Don't forget about the other danger of Tesla's Autopilot: it can be hacked by intruders. By breaking into the driving system, criminals could frame the driver and make him appear guilty of a car accident, or simply use the car to "eliminate" the person. You can read about how Tesla cars are hacked in this article.