China Riding the mmWave with 4D Radar: Trends and Challenges

Shuai Chen
Mar 13, 2024

At the beginning of 2023, Tesla’s intention to deploy 4D mmWave radar drew a lot of attention to the emerging sensor technology. While it didn’t end up on the new Model 3, Chinese contenders have pressed ahead with exploring its application in mass-production EV models. Following Part 1, this article discusses the trends and challenges of 4D mmWave radar in the Chinese market.

Trends

4D mmWave radar and LiDAR: both competitive and complementary

Automotive sensors use electromagnetic waves to transmit and receive signals, with the exception of ultrasonic radar, which relies on mechanical waves. Given the performance requirements of automotive sensors, only the portion of the electromagnetic spectrum in the blue box in Exhibit 1 is applicable, which essentially means cameras, radars and LiDARs.

Exhibit 1: Electromagnetic Spectrum and Wave Types (source: Muniu)

Among the three, visible light and laser beams are close in wavelength and have relatively poor penetration, which makes them vulnerable to dust, rain, snow and the like. In contrast, mmWave, with a wavelength of about 4 mm, has very strong penetrating capability and is barely affected by bad weather. It can detect objects beyond line of sight but suffers from the multipath effect. Radar is therefore a mandatory sensor that can work with LiDAR to provide heterogeneous redundancy.

A common scenario reveals the flaws of camera- and radar-based perception. A static object whose height the radar cannot measure, such as an overhead sign or bridge, may overlap with a moving vehicle and trigger a false alarm, causing phantom braking or even a rear-end collision. To avoid this, the system usually weights the radar results lower and vision higher. But when there is an actual obstacle the cameras fail to recognize, the system may ignore it and cause an accident.

To address such problems, Tesla has applied its Occupancy Network, while some OEMs choose to install one or two additional LiDARs at the front, which drives up system cost dramatically. In theory, these scenarios would not be problematic if the radar could measure height.
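To make the height argument concrete, here is a minimal sketch in Python of how elevation from a 4D radar could be used to discard static returns that are actually overhead structures. The detection fields, clearance threshold and speed threshold are illustrative assumptions, not any vendor’s actual interface.

```python
from dataclasses import dataclass

# Hypothetical detection record; field names and thresholds are illustrative.
@dataclass
class RadarDetection:
    x_m: float    # longitudinal distance, metres
    y_m: float    # lateral offset, metres
    z_m: float    # height above road, metres (only available on 4D radar)
    v_mps: float  # ego-motion-compensated velocity over ground, m/s

VEHICLE_CLEARANCE_M = 4.5  # assumed clearance under bridges and overhead signs
STATIC_SPEED_MPS = 0.5     # assumed threshold for treating a return as static

def is_braking_relevant(det: RadarDetection) -> bool:
    """Keep a detection as a braking candidate only if it is not a static
    return sitting above the drivable height band."""
    is_static = abs(det.v_mps) < STATIC_SPEED_MPS
    is_overhead = det.z_m > VEHICLE_CLEARANCE_M
    # A 3D radar has no usable z_m, so it cannot make this distinction:
    # it must either brake for the return or suppress it blindly.
    return not (is_static and is_overhead)

detections = [
    RadarDetection(x_m=80.0, y_m=0.2, z_m=5.2, v_mps=0.0),  # overhead gantry
    RadarDetection(x_m=60.0, y_m=0.1, z_m=0.6, v_mps=0.0),  # stopped car in lane
]
print([is_braking_relevant(d) for d in detections])  # [False, True]
```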

Therefore, it makes sense to upgrade 3D radars to 4D once the unit prices are close, starting with the front radar and then the corner radars. At the moment, the intelligent driving experience on mass-market models could be improved by replacing only the front radar, at a slightly increased hardware cost.

According to Gaogong’s survey, a 3D front radar costs around 350 RMB on mass-production vehicles. A two-chip cascaded 4D radar is about 500–1,000 RMB, while a four-chip cascaded radar is above 1,000 RMB.

Furthermore, 4D radar and LiDAR could pair well on high-end EVs. To enable Navigation on Autopilot (NOA) on extended city roads, combining one 4D radar and one LiDAR for front sensing is more cost-efficient than using two LiDARs (Exhibit 2).

Exhibit 2: Automotive Sensor Prices (source: Guotai Junan)

Radar satellite architecture: a step towards centralized E/E architecture

Future vehicles will adopt a cross-domain centralized E/E architecture to keep system complexity manageable. It will eventually reshape the automotive and semiconductor value chains, creating new opportunities while posing challenges to established business models.

Radar sensors are ready to embrace this future by evolving from an edge architecture (Exhibit 3) to a satellite architecture (Exhibit 4), which means taking a significant portion of signal and data processing out of the sensors and centralizing it in the powerful ADAS domain controller. The benefits stand out as vehicles approach higher levels of automation with more radars as well as other sensors.

Exhibit 3: Example of radar edge architecture (source: TI)
Exhibit 4: Example of radar satellite architecture (source: Ambarella)

A classic analogy is the human brain making decisions based on input from both eyes rather than each eye working independently. Central processing facilitates effective sensor fusion across multiple sensor types and locations, resulting in more accurate decision-making. It provides the most robust data set possible for artificial intelligence and machine learning, such as the widely discussed BEV (Bird’s-Eye View) and Transformer frameworks.

Meanwhile, a satellite architecture can reduce the cost, size and weight of radar sensors by relocating or replacing the expensive FPGA used for signal processing and by eliminating redundant components such as power supplies, housings and brackets. It becomes simpler and less expensive for OEMs to create diversified products by replicating the same hardware platform and adjusting only the sensors and software, enabling system scalability and modularity.

Although Aptiv introduced the concept of satellite architecture in 2020, Chinese companies are rushing to commercialize it. Among the local players listed in Part 1 of this article, Nova has launched a unique 5R solution, meaning five radar sensors per car: one front 4D mmWave radar and four corner radars. The onboard processor in the front 4D radar handles the point clouds from the four corner radars and then sends the processed data to the ADAS DCU.
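As a rough sketch of what such aggregation might look like, each corner radar’s detections can be transformed into a common vehicle frame before being forwarded downstream. The mounting poses, 2D simplification and data layout below are illustrative assumptions, not Nova’s actual implementation.

```python
import numpy as np

# Illustrative 2D mounting poses (x, y in metres; yaw in radians) for four
# corner radars; real extrinsics come from calibration, not these guesses.
CORNER_POSES = {
    "front_left":  (3.6,  0.8, np.deg2rad(45)),
    "front_right": (3.6, -0.8, np.deg2rad(-45)),
    "rear_left":   (-0.9,  0.8, np.deg2rad(135)),
    "rear_right":  (-0.9, -0.8, np.deg2rad(-135)),
}

def to_vehicle_frame(points_sensor: np.ndarray, pose) -> np.ndarray:
    """Transform an (N, 2) array of sensor-frame points into the vehicle frame."""
    tx, ty, yaw = pose
    rot = np.array([[np.cos(yaw), -np.sin(yaw)],
                    [np.sin(yaw),  np.cos(yaw)]])
    return points_sensor @ rot.T + np.array([tx, ty])

def aggregate(frames: dict) -> np.ndarray:
    """Merge per-sensor point clouds into one vehicle-frame cloud that the
    front radar (or the domain controller) can process as a whole."""
    merged = [to_vehicle_frame(pts, CORNER_POSES[name]) for name, pts in frames.items()]
    return np.vstack(merged)

# Example: two fake detections per corner radar.
frames = {name: np.random.rand(2, 2) * 20 for name in CORNER_POSES}
print(aggregate(frames).shape)  # (8, 2)
```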

Altos and XretinAI have taken a step further by offering front-end radar modules along with signal-processing algorithms tailored to the ADAS domain controller. At the moment, Altos builds on the TI TDA4, while Xretin is based on the Black Sesame A1000.

Challenges

These schemes for incorporating 4D mmWave radar will certainly open up new capabilities for future vehicles. Nonetheless, the new sensor technology faces challenges both on the application front and in the hardware itself.

Software-defined radar hinges on software capabilities

Traditional radar vendors specialize in manufacturing and in the software for signal processing and low-level data processing that outputs a point cloud or an object list. OEMs are used to taking this post-processed data and fusing it with camera data in their ADAS development. However, 4D mmWave radar produces a much larger volume of raw data, revealing a gap in how to leverage the new granularity while dealing with clutter and noise.

The gap first translates into a challenge for radar vendors’ signal-processing capabilities, where the aim is to increase the probability of detection. It is crucial to enhance the quality of the 4D point cloud, which is severely degraded by, for example, the multipath effect. There is also a need to reduce the information lost during signal processing.
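One standard building block for managing the probability of detection against false alarms is CFAR (constant false alarm rate) thresholding. The sketch below is a generic 1D cell-averaging CFAR in Python, shown only to illustrate the kind of signal-processing step being tuned; the window sizes and scaling factor are made-up values, not any vendor’s pipeline.

```python
import numpy as np

def ca_cfar(power: np.ndarray, n_train: int = 8, n_guard: int = 2,
            scale: float = 4.0) -> np.ndarray:
    """Return indices of cells whose power exceeds `scale` times the average
    power of the surrounding training cells (guard cells excluded)."""
    detections = []
    n = len(power)
    for i in range(n_train + n_guard, n - n_train - n_guard):
        lead = power[i - n_guard - n_train : i - n_guard]
        lag = power[i + n_guard + 1 : i + n_guard + 1 + n_train]
        noise_level = np.mean(np.concatenate([lead, lag]))
        if power[i] > scale * noise_level:
            detections.append(i)
    return np.array(detections, dtype=int)

# Toy range profile: exponential noise floor plus two strong targets.
rng = np.random.default_rng(0)
profile = rng.exponential(1.0, 256)
profile[[60, 180]] += 25.0
print(ca_cfar(profile))  # expected to include cells near 60 and 180
```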

The gap then extends to data processing: how to effectively use the enhanced 4D point cloud for clustering, object tracking, classification and identification. At the moment, 4D mmWave radar algorithms are adapted from corresponding LiDAR algorithms without taking into account 4D radar’s ability to measure velocity or its robustness in extreme environments. Radar vendors usually provide this part of the software, but it would be no surprise if OEMs decided to do it in-house.
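To illustrate the point about velocity, here is a naive clustering sketch using scikit-learn’s DBSCAN, with ego-motion-compensated radial velocity treated as an extra feature so that a static object and a moving vehicle at almost the same location do not merge into one cluster. The feature weighting and eps value are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Toy point cloud: columns are x, y, z (metres) and compensated radial
# velocity (m/s). Two groups overlap in space but differ in velocity.
points = np.array([
    [20.1, 1.0, 0.5,  0.0],
    [20.3, 1.1, 0.6,  0.1],   # static object (e.g. a parked car)
    [20.2, 1.2, 0.7, 12.0],
    [20.4, 1.0, 0.6, 12.2],   # moving vehicle at nearly the same spot
])

# Scale velocity so a 1 m/s difference counts roughly like 0.5 m in space;
# this weighting is an assumption and would be tuned in practice.
features = points * np.array([1.0, 1.0, 1.0, 0.5])

labels = DBSCAN(eps=1.0, min_samples=2).fit_predict(features)
print(labels)  # spatially overlapping but velocity-separated points fall
               # into two clusters: [0 0 1 1]
```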

The last gap lies in perception software, or sensor fusion algorithms. This is a challenge mainly for OEMs, and one that could make a real difference to the end-user experience. Radar has so far played only a supporting role to cameras in mainstream ADAS systems. It is still unclear whether 4D radar should simply replace 3D radar and continue to provide redundancy, or whether it should become a main sensor. New weights need to be assigned not only to 4D radars but also to cameras and LiDARs.

Ideally, an ADAS system would achieve optimal perception by fusing cameras, radars and LiDARs. On mass-market vehicles, 4DRV (4D Radar and Vision) fusion is perhaps the cost-effective option. Both the fusion methods and the corresponding AI models need further research and development.
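As a toy illustration of what a late 4DRV fusion step might look like, the sketch below lets camera detections supply class labels while the 4D radar supplies range and velocity, and keeps unmatched radar objects instead of discarding them. The association gate, confidence weights and field names are all assumptions, not a production design.

```python
from dataclasses import dataclass

@dataclass
class CameraDet:
    x_m: float
    y_m: float
    label: str
    score: float

@dataclass
class RadarDet:
    x_m: float
    y_m: float
    v_mps: float
    snr: float

def fuse(cams, radars, gate_m=2.0):
    """Greedy nearest-neighbour association; radar-only objects are kept
    (rather than discarded) because 4D radar is trusted on static obstacles."""
    fused, used = [], set()
    for c in cams:
        best, best_d = None, gate_m
        for i, r in enumerate(radars):
            d = ((c.x_m - r.x_m) ** 2 + (c.y_m - r.y_m) ** 2) ** 0.5
            if i not in used and d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            r = radars[best]
            fused.append({"label": c.label, "x": r.x_m, "v": r.v_mps,
                          "conf": 0.6 * c.score + 0.4 * min(r.snr / 20.0, 1.0)})
        else:
            fused.append({"label": c.label, "x": c.x_m, "v": None,
                          "conf": 0.6 * c.score})
    for i, r in enumerate(radars):
        if i not in used:
            fused.append({"label": "unknown", "x": r.x_m, "v": r.v_mps,
                          "conf": 0.4 * min(r.snr / 20.0, 1.0)})
    return fused

# One camera detection, one matching radar return and one radar-only obstacle.
print(fuse([CameraDet(50.0, 0.0, "car", 0.9)],
           [RadarDet(50.5, 0.2, -3.0, 18.0),
            RadarDet(30.0, 0.1, 0.0, 15.0)]))
```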

Disrupting an established value chain with multiple stakeholders

Auto parts are largely demand-driven: product design is primarily shaped by vehicle requirements. OEMs are expected to define the exact functions 4D mmWave radar will serve, the ideal specifications of front and corner radars, the output data format, and so on. While these remain uncertain, radar vendors and all stakeholders along the value chain have to be venturesome in order to ride the new mmWave.

Additionally, there isn’t a standard lab test yet to validate 4D mmWave radar. Some vendors use the test equipment for conventional mmWave radar, and some use LiDAR as ground truth. Measurement systems and solutions are still limited.

None of the above would remain an obstacle once large orders are placed for 4D mmWave radars; the supply chain could react promptly to bridge any gaps. On the other hand, OEMs are hoping to see the price reach a certain point before devoting further effort to applications. We believe Nio is ready to break this vicious cycle with its recent statement that it will adopt 4D mmWave radar across all models based on its NT3.0 platform.


Shuai Chen

Bridging the West and China Innovations in ADAS & Autonomous Driving | B2B Business Development | Go-To-Market Strategies & Execution (schen583@gmail.com)