AEB (Automatic Emergency Braking) is a system that brakes a vehicle autonomously in an emergency.
ADAS stands for advanced driver-assistance systems.
Last year, the AAA (American Automobile Association) ran tests (a saved Google copy of the report) on vehicles equipped with ADAS oriented specifically to pedestrian detection (AEB-P). AEB-P was tested on four 2019-model-year cars: a Chevrolet Malibu with Front Pedestrian Braking, a Honda Accord with the Honda Sensing collision braking system, a Tesla Model 3 with Automatic Emergency Braking, and a Toyota Camry with Toyota Safety Sense.
Here are the key findings:
- In daylight, a test car traveling at 20 mph toward an adult crossing the road avoided a collision with the pedestrian in only 40 percent of runs.
- When a test vehicle moving at 20 mph encountered a child darting into traffic from between two parked cars, the child was struck in 89 percent of runs.
- At 30 mph, none of the test vehicles avoided a collision.
The results prompted the AAA to issue recommendations that include: “Never rely on pedestrian detection systems to avoid a collision. These systems serve as backup rather than primary collision avoidance.”
Collision warning vs collision avoidance
It is important to note the difference between a collision warning system and a collision avoidance system. A warning system alerts the driver to an imminent collision but takes no evasive action (such as braking) itself. An avoidance system also warns the driver, and if no action is taken, it begins braking on its own to avoid the collision or reduce its severity.
It was this avoidance capability that the AAA rated as “pedestrian detection” in its tests.
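The distinction can be sketched as a simple time-to-collision (TTC) check. The thresholds and function names below are illustrative only, not taken from any production system: a warning system stops at the "warn" stage, while an avoidance system adds the "brake" stage.

```python
def ttc_seconds(distance_m: float, closing_speed_mps: float) -> float:
    """Time to collision: how long until the gap closes at the current rate."""
    if closing_speed_mps <= 0:          # not closing -> no collision risk
        return float("inf")
    return distance_m / closing_speed_mps

def aeb_decision(distance_m: float, closing_speed_mps: float,
                 warn_ttc: float = 2.5, brake_ttc: float = 1.2) -> str:
    """A collision *warning* system only ever reaches 'warn'; an
    *avoidance* system also returns 'brake' if the driver has not reacted."""
    ttc = ttc_seconds(distance_m, closing_speed_mps)
    if ttc <= brake_ttc:
        return "brake"
    if ttc <= warn_ttc:
        return "warn"
    return "ok"

# 20 mph is about 8.9 m/s; a pedestrian 30 m ahead gives TTC of ~3.4 s
print(aeb_decision(30.0, 8.9))   # ok
print(aeb_decision(20.0, 8.9))   # warn  (TTC ~2.2 s)
print(aeb_decision(10.0, 8.9))   # brake (TTC ~1.1 s)
```

Real systems compute TTC from radar range and range rate and blend in driver-reaction models; this sketch only illustrates the warning/avoidance split.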
For a layperson, seeing an ADAS-equipped car fail to stop in front of a pedestrian is a shock. Although the AAA test results received wide press coverage, the footage raises many unanswered questions.
All four AAA-tested vehicles use a camera plus radar. Given that combination, what makes the AEB-P functions perform so poorly?
- Is the problem insufficient resolution in the image sensors and/or radars?
- Or is it the algorithms that fuse data from the different sensors?
- There is a hypothesis that thermal imaging sensors help vehicles see pedestrians at night. We have no doubt about that. But in that case, can the problem be solved simply by adding another sensor (of a different modality) on top of those already installed in these ADAS vehicles?
What makes AEB-P so difficult to implement?
Phil Magney, founder and director of VSI Labs, told EE Times: “AEB is fundamental to ADAS, and you couldn't even think of automated driving without it. It is the most important of all ADAS features and the one application with the potential to save the most lives.”
However, Magney draws a crucial distinction between AEB and AEB-P. AEB adapted to pedestrians, he emphasized, is “an order of magnitude more complicated than AEB.”
So what are the difficulties?
Experts often point to the false alarms that radar is prone to, and to the limited field of view of image sensors. Even when radar and camera are combined, the fused data can still give only a limited picture of the vehicle's surroundings.
Perhaps most important is the issue of cost. Automakers typically use less expensive sensors in ADAS vehicles. Since ADAS features are expected in mass-market vehicles, carmakers are unlikely to pay extra for specialized sensors, be it lidar or a thermal imager, to reduce the likelihood of AEB-P failures.
Phil Magney noted that AEB is hard because “false positives in the context of AEB alone can lead to mortal dangers.”
Magney explained that radar is a critical component of AEB systems because of its ability to measure time to collision. But radar is also prone to false positives, for example, mistaking parked cars for hazards.
“So, in the end, you have to filter out a lot of data in the interest of limiting false positives. You also have a lot of noise in the radar, and that too can lead to false positives. That's why you get weird collision alerts from time to time if your car has a collision warning function.”
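One common way to filter such noise is to require a radar return to persist across several consecutive frames before treating it as a real object. This is a generic sketch with invented names, not how any particular OEM's stack works:

```python
from collections import defaultdict

class PersistenceFilter:
    """Suppress one-off radar blips: an object id must be seen in
    `min_hits` consecutive frames before it is confirmed."""
    def __init__(self, min_hits: int = 3):
        self.min_hits = min_hits
        self.hits = defaultdict(int)

    def update(self, detected_ids: set) -> set:
        # drop counters for ids that disappeared this frame
        for obj in list(self.hits):
            if obj not in detected_ids:
                del self.hits[obj]
        for obj in detected_ids:
            self.hits[obj] += 1
        return {obj for obj, n in self.hits.items() if n >= self.min_hits}

f = PersistenceFilter(min_hits=3)
f.update({"car_1", "ghost"})        # frame 1: nothing confirmed yet
f.update({"car_1"})                 # frame 2: "ghost" vanished, counter reset
confirmed = f.update({"car_1"})     # frame 3: car_1 seen 3 frames in a row
print(confirmed)                    # {'car_1'}
```

The trade-off is latency: each extra required frame delays a genuine alert, which is exactly the tension Magney describes between suppressing false positives and braking in time.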
Moving beyond generic AEB, he explained: “AEB-P significantly raises the performance requirements, because now you need to identify and track people in your path.” Radar is getting better, he admitted, “but it still lacks confidence when dealing with people, so you usually couple it with a camera.”
But here is the thing: “Although coupling a camera with the radar improves AEB-P, it may not be enough.”
“There are so many environmental conditions that limit camera performance,” he said, “and this leads to the poor performance of today's AEB-P systems.”
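The coupling Magney describes can be sketched as a gating step: brake only on radar objects that the camera confirms as pedestrians. The names and the exact-bearing match below are my own simplification; real systems associate radar tracks and camera detections by gating in continuous coordinates:

```python
def fuse(radar_objects, camera_detections, min_confidence=0.6):
    """Keep only radar objects whose bearing matches a camera detection
    classified as a pedestrian with sufficient confidence.

    radar_objects:     list of (object_id, bearing_deg)
    camera_detections: dict bearing_deg -> (label, confidence)
    """
    confirmed = []
    for obj_id, bearing in radar_objects:
        label, conf = camera_detections.get(bearing, ("none", 0.0))
        if label == "pedestrian" and conf >= min_confidence:
            confirmed.append(obj_id)
    return confirmed

radar = [("obj1", 2), ("obj2", -5)]
camera = {2: ("pedestrian", 0.9), -5: ("parked_car", 0.8)}
print(fuse(radar, camera))   # ['obj1']
```

This also shows why camera-limiting conditions (night, glare, fog) drag the whole system down: if the camera's confidence collapses, confirmed radar hits are discarded and the brake never fires.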
Narrow field of view
An analyst at Yole Développement told EE Times that AEB systems, whether based on a camera, a radar, a camera plus radar, or a camera plus a laser rangefinder, have shown good safety results: “more or less, 50 percent fewer accidents and deaths, and 10-15 percent fewer accidents/deaths in general,” he said.
In March 2016, most U.S. automakers pledged to install AEB on all vehicles by 2022. In April 2019, the EU Parliament also voted to make the equipment mandatory by 2022. (Source: Yole Développement)
But when the same AEB technology is applied to pedestrian detection, those statistics, 10-15 percent fewer crashes/deaths, are not so encouraging.
Asked why AEB-P is hard to do, the analyst said the problem lies in the “relatively narrow field of view” in front of the car in first-generation AEB systems.
These first-generation systems use vision processors such as the Intel Mobileye EyeQ3 (in GM, Ford, and VW vehicles) or the Toshiba Visconti 2 (in Toyota). Referring to their relatively narrow field of view, he noted: “This is the main reason the AEB system cannot understand much more than what is happening right in front of the car.”
According to the analyst, first-generation AEB has already been deployed in about 6% of cars on the road and 30% of new cars. First-generation AEB is only 10 to 15 percent effective, so vehicles equipped with it in North America and Europe by 2022 will still be far from the often-cited Vision Zero goal.
But over time, as expected, things will get better.
“The new generation of AEB systems is based on the Intel Mobileye EyeQ4 or Visconti 4, and they improve the field of view, typically by increasing the number of cameras and widening their FOV,” the analyst noted.
“Today we don't yet know how much a triple camera improves on a mono camera, but it should be better.”
Next come third-generation AEB systems, which the analyst said will use full-surround cameras. “This is what Tesla will do with its Full Self-Driving (FSD) computer. Zenuity also offers this approach to OEMs,” he added. “With awareness of the complete environment, AEB should improve over time. But the question is, how fast?”
What needs to happen for AEB to protect pedestrians from being hit by an ADAS car? Experts suggest it will take pressure on automakers from regulators, or protest from the general public.
What do we need for an effective AEB-P?
FLIR offers its thermal imaging technology for AEB-P. The company claims its thermal imager provides “additional data for RGB cameras and radars.” Because thermal cameras “see” heat, said Chris Posch, FLIR's director of engineering for automotive: “We can detect pedestrians in difficult conditions, including at night, through sun glare, headlights, and fog.” FLIR claims its camera can see four times farther in the dark than typical headlights reach.
Meanwhile, at CES, Paris-based Prophesee showed a video made by an unnamed German automaker comparing an AEB system using a conventional camera with one using an event-based camera. In the video, the Prophesee camera consistently scored higher at detecting pedestrians.
There are three ways to push AEB-P performance forward:
- The same data (only more of it), the same computation (only more of it)
- Better data, the same computation
- Better data, better computation
The third approach combines new sensors with new computational methods. “I think neuromorphic computing is promising here. Some companies are already innovating in both sensors and computing… I mean Outsight, which is bringing to market a hyperspectral lidar plus a perception algorithm.”
Among currently available solutions, thermal cameras look promising. Compared with conventional RGB cameras, an expert from VSI Labs said, “thermal detects and classifies pedestrians much better, because the classification is based on the thermal signature of the object rather than on visible light.”
But the most frequently asked question about thermal cameras concerns cost. If automakers add a thermal camera to an ADAS vehicle to provide effective AEB-P, how much will it add? Chris Posch told EE Times: “They will cost hundreds of dollars, not thousands, as is the case with lidars.”
Although FLIR thermal cameras already ship on some BMW, Audi, and other models, they are not designed or configured for AEB-P; instead, they spot animals on the road at night. For AEB-P applications, FLIR has developed a new VGA thermal camera with four times the resolution of today's automotive thermal cameras.
Last fall, Veoneer, a Swedish automotive technology supplier, chose FLIR for its Level 4 autonomous vehicle contract with a leading global automaker (for 2021).
How to test it
VSI Labs, under contract with FLIR, is working on a proof of concept to demonstrate the benefits of thermal cameras for automatic emergency braking. VSI Labs ran initial tests in December 2019 at the American Center for Mobility near Detroit.
According to Magney, the VSI Labs setup for this AEB-P test used a single Delphi ESR radar fused with a FLIR camera. “We disabled the RGB channel in this test. We had to fuse data from other sensors coming over the CAN bus, such as inertial data, wheel speed, steering angle, and pedal position. This was necessary to program the AEB functionality.”
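As a rough illustration of why those CAN signals matter, the braking decision can be driven by the ego speed (derived from wheel speed) and the radar range. The constant-deceleration model and thresholds below are my own simplification, not VSI Labs' algorithm:

```python
def required_decel_mps2(ego_speed_mps: float, distance_m: float,
                        margin_m: float = 2.0) -> float:
    """Deceleration needed to stop `margin_m` short of the obstacle.
    From v^2 = 2*a*d (constant deceleration): a = v^2 / (2*d)."""
    usable = max(distance_m - margin_m, 0.1)   # guard against division by zero
    return ego_speed_mps ** 2 / (2 * usable)

def should_brake(ego_speed_mps: float, distance_m: float,
                 max_comfort_decel: float = 3.0) -> bool:
    """Fire AEB once stopping would take more than 'comfortable' braking."""
    return required_decel_mps2(ego_speed_mps, distance_m) > max_comfort_decel

# ~20 mph (8.9 m/s), pedestrian 15 m ahead: ~3.05 m/s^2 needed, so brake now
print(should_brake(8.9, 15.0))   # True
print(should_brake(8.9, 40.0))   # False: plenty of distance left
```

Steering angle and pedal position would feed in as overrides (for example, suppressing AEB if the driver is already braking hard), which is why the full CAN picture is needed to program the function.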
Besides stating that, as a passive sensor, nothing detects pedestrians better than a thermal camera, Phil Magney noted the role of artificial intelligence in thermal imaging performance.
He said: “At VSI, we proved that using artificial intelligence on thermal images can outperform a traditional RGB camera.” VSI Labs trained its neural network on the FLIR ADK (Automotive Development Kit) dataset, which, he noted, contains approximately 40,000 annotated thermal images. “VSI also developed AEB algorithms and then ran numerous tests at the ACM,” he explained.
Magney concluded that, overall, the thermal camera recognized and classified pedestrians better in low-light and cluttered conditions. “The thermal imager also picked out pedestrians who were partially occluded,” he added.
He also said: “What we like about FLIR is their Automotive Development Kit, because it gives the developer the ability to create their own detection algorithms.”
We are perhaps the strongest competence center for automotive electronics development in Russia. We are actively growing and have opened about 30 vacancies, including in the regions: software engineer, design engineer, lead development engineer (DSP programmer), and others.
We have many interesting challenges from the automakers and suppliers driving the industry. If you want to grow as a specialist and learn from the best, we would be glad to see you on our team. We are also ready to share our expertise on the most important developments in automotive. Ask us any questions; we will answer and discuss them.
Read more useful articles:
- [Forecast] Transport of the future (short-term, medium-term, and long-term horizons)
- The best materials on car hacking from DEF CON 2019-2020
- [Forecast] Motornet: a data exchange network for robotic vehicles
- Companies spend $16 billion on drones to capture an $8 trillion market
- Cameras or lasers
- Autonomous cars on open source
- McKinsey: Rethinking Software and Electronics Architecture in Automotive
- Another OS war is already under the hood of cars
- Program code in the car
- In a modern car, there are more lines of code than …