Tesla Autopilot does not improve driving safety

When this Tesla crashed into a store, it was in manual mode. Most accidents happen on city streets, not on the highways where Autopilot is typically used.

Researchers at the annual Automated Vehicles Symposium have called on Tesla to install a camera-based driver monitoring system in its vehicles.

Tesla’s Autopilot is paired with a system meant to ensure that the driver is watching the road. If you keep your hands on the steering wheel and periodically apply a little force to it, a torque sensor in the steering wheel registers it. If the system decides that the driver’s hands have left the wheel, it begins issuing warnings: first on the screen, then with sound signals, and if the driver still does not react, the car slows down and stops. The system is not perfectly accurate: some drivers keep their hands on the wheel and it fails to notice. It can also be defeated outright, simply by attaching a counterweight to the steering wheel.
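As a rough sketch (not Tesla’s actual implementation, and with invented timing thresholds), the escalation logic described above can be modeled as a simple function of time since the torque sensor last registered the driver’s hands:

```python
from enum import Enum, auto

class Alert(Enum):
    NONE = auto()       # driver's hands recently detected
    VISUAL = auto()     # warning on the screen
    AUDIBLE = auto()    # sound signals
    SLOW_STOP = auto()  # car slows down and stops

# Hypothetical thresholds, in seconds without detected steering torque.
VISUAL_AFTER, AUDIBLE_AFTER, STOP_AFTER = 15.0, 30.0, 45.0

def escalation(seconds_without_torque: float) -> Alert:
    """Map time since the last detected steering input to a warning stage."""
    if seconds_without_torque >= STOP_AFTER:
        return Alert.SLOW_STOP
    if seconds_without_torque >= AUDIBLE_AFTER:
        return Alert.AUDIBLE
    if seconds_without_torque >= VISUAL_AFTER:
        return Alert.VISUAL
    return Alert.NONE
```

The counterweight trick works precisely because torque is the only input a scheme like this ever sees.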

Other systems (such as GM’s Super Cruise) instead use a camera to watch the driver and judge how closely they are following the road. The most advanced of them track gaze direction and log how long the driver spends looking around, at the instruments, or at a phone, and even how long it takes to adjust the mirrors. Such systems can require that you keep your eyes on the road even when your hands are off the wheel.
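A camera-based monitor keys off a different signal entirely. Here is a minimal sketch, assuming a gaze classifier that labels each video frame as "road", "phone", "mirror", and so on (the classifier and the 2-second threshold are assumptions for illustration):

```python
def longest_off_road_glance(frame_labels, fps=30):
    """Return the longest continuous off-road glance, in seconds,
    given per-frame gaze labels such as 'road', 'phone', 'mirror'."""
    longest = current = 0
    for label in frame_labels:
        current = current + 1 if label != "road" else 0
        longest = max(longest, current)
    return longest / fps

# Warn if any single glance away from the road exceeds ~2 seconds,
# regardless of whether the driver's hands are on the wheel.
frames = ["road"] * 60 + ["phone"] * 75 + ["road"] * 30
if longest_off_road_glance(frames) > 2.0:
    print("Eyes on the road, please")
```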

Several high-profile accidents (including fatal ones) were caused by drivers who were not paying attention to the road. In most cases we do not know what distracted them, but had they been watching the road, the accidents would not have happened. Several reports on Tesla crashes state that the driver’s hands were off the wheel even though the driver monitoring system never flagged it; more precisely, the sensor did not detect any force applied to the steering wheel. And any Tesla owner will tell you that the car sometimes demands they put their hands back on the wheel even while they are holding it and watching the road.

Reports on the fatal Tesla accident in Mountain View indicate that the driver was playing a game on his phone in the minutes before the crash, quite possibly right up to the moment of impact.

The Tesla Model 3 has a camera facing into the cabin that could monitor the driver; the problem is that Tesla has declined to use it for this. At first glance the refusal makes no sense, so what reasons might be behind it? Let’s consider a few:

  1. Camera-based driver monitoring feels invasive from a privacy standpoint. Not every customer is comfortable with it, and Tesla does not want to scare buyers away.
  2. Driver monitoring systems can be irritating: they are imperfect and nag the driver even when the driver is not actually distracted. Tesla cars, for example, have a lane departure warning that sounds a loud alert when you cross lane markings. It is very useful if you drift out of your lane by accident, but it ignores the fact that on an empty road many drivers deliberately cross the markings to move into a more comfortable lane.
  3. Early models (such as the luxury Model S) did not have an interior camera. Tesla may not want the Model 3 to have a feature that the more expensive Model S lacks.
  4. Tesla evidently believes its current system works as it should. The company regularly touts Autopilot’s safety statistics, even in the face of accidents caused by driver inattention.

According to many experts (including Bryan Reimer, an MIT researcher who has presented solid data on how Autopilot is actually used), these excuses are not good enough.

It is especially interesting to look at Tesla’s claims about the safety of its Autopilot system. Tesla publishes key crash statistics every quarter in its Vehicle Safety Report. The Q1 2020 report reflects the sharp pandemic-driven drop in both mileage and traffic accidents, so consider the Q4 2019 report, which states the following:

“In the 4th quarter, we registered one accident for every 3.07 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.1 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.64 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 479,000 miles.”

These numbers look incredible at first, but read them carefully and a problem emerges. NHTSA’s figure counts crashes reported to the police, while Tesla counts “accidents” without ever defining what qualifies as one. Tesla has declined to explain what it counts. One suspicion is that it tracks airbag deployments, since the company is automatically notified of those; whether Tesla even learns about more minor collisions is unknown. Without knowing exactly what Tesla calls an accident, the numbers cannot be compared, and Tesla should not present them as if they can be. Tesla representatives declined to comment on any of these questions.

Be that as it may, the most interesting figures are the 3.07 million and 2.1 million miles between accidents with Autopilot on and off, respectively.

Tesla doesn’t get safer with Autopilot on

Reimer also shared data from an upcoming study of Autopilot use. It shows that roughly 94% of Autopilot miles are driven on highways. For manual driving and cruise control, only about 40% of miles are on highways, with the remaining 60% on other roads. Exact figures are hard to come by, but highway driving is safer than driving on ordinary roads: fatal accidents occur about three times less often per mile (though measured per hour the gap is much smaller). The danger of high speed is offset by the simplicity of the driving task. The ratio for all accidents, not just fatal ones, is less clear, but let’s apply the same estimate to it (for reference, fatal accidents are about 2.5 times more frequent per mile on rural roads than on urban ones, and Autopilot is used on both).
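The per-mile versus per-hour distinction is simple arithmetic. With illustrative speeds of 65 mph on the highway and 30 mph elsewhere:

```python
# Per-mile fatality rates: ordinary roads are taken as ~3x the highway rate.
per_mile_ratio = 3.0                      # other-road rate / highway rate
highway_speed, other_speed = 65.0, 30.0   # mph, illustrative values

# Per hour of driving, the highway covers more miles, shrinking the gap.
per_hour_ratio = per_mile_ratio * other_speed / highway_speed
print(f"Highway: {per_mile_ratio:.1f}x safer per mile, "
      f"only {per_hour_ratio:.1f}x safer per hour")
```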

Of the 2.1 million miles between accidents in manual mode, about 880,000 would be highway miles and 1.2 million would be on other roads. For Autopilot, of the 3.07 million miles, about 2.9 million are on highways and only 192,000 on other roads. Applying the 3:1 risk ratio, manual driving works out to roughly one accident per 4.5 million highway miles and one per 1.5 million miles on other roads, while Autopilot works out to roughly one accident per 3.5 million highway miles and one per 1.1 million miles on other roads.
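The arithmetic behind those figures can be reproduced directly. The sketch below uses the article’s estimates (the mileage shares and the 3:1 risk ratio are inputs to the model, not Tesla data), and it lands near the ~30% highway gap discussed next:

```python
def miles_per_crash_by_road(total_miles_per_crash, highway_share, risk_ratio=3.0):
    """Split an aggregate miles-per-crash figure into highway and other-road
    figures, given the highway mileage share and the assumed other:highway
    per-mile crash-rate ratio. Solves:
        hwy_miles * r + other_miles * (risk_ratio * r) = 1 crash
    """
    hwy_miles = total_miles_per_crash * highway_share
    other_miles = total_miles_per_crash * (1 - highway_share)
    r_highway = 1.0 / (hwy_miles + risk_ratio * other_miles)
    return 1.0 / r_highway, 1.0 / (risk_ratio * r_highway)

manual = miles_per_crash_by_road(2.1e6, 0.40)      # 40% of manual miles on highway
autopilot = miles_per_crash_by_road(3.07e6, 0.94)  # 94% of Autopilot miles on highway

print(f"Manual:    {manual[0]/1e6:.1f}M mi/crash highway, {manual[1]/1e6:.1f}M other")
print(f"Autopilot: {autopilot[0]/1e6:.1f}M mi/crash highway, {autopilot[1]/1e6:.1f}M other")
print(f"Manual highway advantage: {manual[0]/autopilot[0] - 1:.0%}")
```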

In other words, on the highway, manual driving (with the active safety features on) plus cruise control goes about 30% farther between accidents than driving on Autopilot. So driving on Autopilot in a Tesla is somewhat less safe than the company claims.

That said, the safety penalty is not large. Even if 3:1 overstates the ratio of accident rates between ordinary roads and highways, these numbers are not far from the truth. But the 1.5x improvement that Tesla’s figures imply almost certainly does not exist.

Tesla’s problem is that people who want to abuse Autopilot get into accidents, because Autopilot is an imperfect driver assistance system (and driver assistance is exactly how the company sells it). The favorable aggregate statistics come from the people who use it responsibly, as the toy model below illustrates. Statistically, then, Autopilot either slightly reduces driving safety or does not affect it at all. Use it correctly and your safety improves marginally; ignore its operating rules and you are at serious risk. On balance, the system does not improve overall safety in Tesla vehicles, and there is no sign that it will any time soon.
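A toy mixture model shows how a small group of abusers can erase the benefit for everyone else (all shares and rates below are hypothetical, chosen only to illustrate the shape of the argument):

```python
# Hypothetical population: most drivers supervise Autopilot and do a bit
# better than the manual baseline; a small minority abuses it and crashes
# far more often.
attentive_share = 0.95
attentive_rate = 1 / 3.5e6   # crashes per mile for responsible users
abuser_rate    = 1 / 0.5e6   # crashes per mile for inattentive users
manual_rate    = 1 / 3.0e6   # manual-driving baseline for the same roads

fleet_rate = attentive_share * attentive_rate + (1 - attentive_share) * abuser_rate
print(f"Fleet-wide: one crash per {1 / fleet_rate / 1e6:.1f}M miles "
      f"(manual baseline: {1 / manual_rate / 1e6:.1f}M)")
# -> roughly one crash per 2.7M miles: worse than manual, even though
#    95% of drivers individually did better than the baseline.
```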

We do not know what proportion of accidents involving Tesla vehicles are caused by inattentive drivers, but the available data suggests the proportion could be significant. If so, Reimer is right: Tesla should seriously consider a driver monitoring system. Its new AI chip should be able to run one that helps the driver rather than annoys them, a feature that would leave customers both happier and safer. The question of storing video recordings after accidents has also been raised. In my view, nothing should be recorded in the vehicle without the driver’s consent; those who want the feature should be able to turn it on, since the footage could either establish their fault in an accident or exonerate them. Driver monitoring itself could likewise be optional, with the caveat that a driver who turns it off might bear the liability for an accident that happens on Autopilot.

I have asked Tesla several times for data separating accidents involving its vehicles by road type, and the company has declined. Some of the numbers given here are therefore estimates extrapolated from fatal-accident data. I would welcome more precise figures from Tesla.

It should be noted that Tesla vehicles are, on the whole, very safe. They earn some of the highest crash-test scores of any vehicle, and their excellent collision avoidance systems play a large part in the good safety statistics off Autopilot. This article is concerned only with comparing safety with and without Autopilot, with the other driver assistance features in use either way. The author owns a Tesla, and its strong safety record was part of the reason for the purchase.

