Wet Road AEB Tests: LiDAR vs Vision Safety Analysis


Wet Road AEB Tests: When LiDAR Meets Low-μ — What Current Sensors Miss

Comparative AEB test under heavy rain: BYD (LiDAR), Tesla Model 3 (Vision-only), Xiaomi SU7 (solid-state LiDAR).

A series of comparative tests pitting BYD, Tesla, and Xiaomi electric vehicles against each other under heavy rain has highlighted a structural limitation of current Autonomous Emergency Braking (AEB) systems: detecting an obstacle is not enough if the system does not know how slippery the road surface is. The results illustrate why sensor diversity and road friction estimation are becoming the next critical frontier in ADAS safety.


The Performance Gap: LiDAR vs Vision

The three vehicles tested represent three distinct sensor philosophies. The BYD Seal/Han lineup — equipped with RoboSense LiDAR, five millimeter-wave radars, twelve ultrasonic sensors, and twelve HD cameras within the “God’s Eye” platform — demonstrated more consistent obstacle detection in wet conditions. The Xiaomi SU7 Pro/Max, featuring a roof-mounted solid-state Hesai AT128 LiDAR (128 independent laser channels, 200-meter range) paired with a 4D millimeter-wave radar and eleven HD cameras, performed comparably. The Tesla Model 3 Hardware 4, operating on a pure vision-only system — eight 5-megapixel cameras and no radar or ultrasonic sensors — showed measurable degradation in detection latency as rain intensity increased.

Rain degrades optical sensors through two mechanisms: attenuation of the reflected signal (in the case of LiDAR) and contrast reduction with surface glare (in the case of cameras). According to independent research, camera performance degrades proportionally with rainfall rate and vehicle speed, while LiDAR point clouds thin out significantly in heavy precipitation. Radar remains the most weather-resilient modality, maintaining functional detection beyond 260 meters even in dense fog — but Tesla’s HW4, in its base configuration, removes radar from the equation.

AAA testing confirms the severity of the issue: in simulated moderate-to-heavy rain at 35 mph (approximately 56 km/h), 33% of vehicles with active AEB collided with a stationary car. Lane-keeping systems failed in 69% of cases under the same conditions.


Why Visibility is Only Half the Battle

Even when a sensor correctly identifies an obstacle, the AEB system’s ability to stop the vehicle depends on a factor none of these sensor suites currently measure in real time: the road friction coefficient (μ).

The friction coefficient quantifies the grip available between tire and road surface. On dry asphalt, μ typically ranges from 0.7 to 0.8. On wet surfaces, a thin water layer reduces effective contact area, dropping μ to 0.4–0.6. In partial aquaplaning, values can fall below 0.3. The physical consequence is direct: halving μ from 0.8 to 0.4 doubles the stopping distance. At 0.2, it quadruples.
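The inverse relationship between μ and stopping distance follows from the idealized braking equation d = v²/(2μg). A minimal sketch of that arithmetic, using the 56 km/h speed from the AAA test cited below:

```python
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance(speed_kmh: float, mu: float) -> float:
    """Idealized braking distance d = v^2 / (2 * mu * g), in meters.

    Ignores reaction time, brake actuation delay, and load transfer;
    it isolates the effect of the friction coefficient alone.
    """
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v ** 2 / (2 * mu * G)

# At 56 km/h, halving mu from 0.8 to 0.4 exactly doubles the distance,
# and mu = 0.2 quadruples it:
dry = stopping_distance(56, 0.8)   # ~15.4 m
wet = stopping_distance(56, 0.4)   # ~30.8 m
aqua = stopping_distance(56, 0.2)  # ~61.7 m
```

Because v and g are fixed, distance scales as 1/μ, which is why the doubling and quadrupling in the text hold at any speed.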

Most AEB algorithms calculate Time-to-Collision (TTC) and initiate maximum braking based on a fixed assumption of dry-road deceleration capacity. When friction is halved by rain, the system triggers braking correctly — but the vehicle slides past the stopping point because the physics no longer match the algorithm’s model. Detecting the obstacle is necessary but not sufficient; knowing the road’s grip state is equally critical.
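The mismatch described above can be quantified: if the AEB fires at the gap its dry-road model predicts is sufficient, the overshoot is the difference between the actual and assumed stopping distances. A sketch under the same idealized braking model (the 0.8 dry-road assumption is illustrative, not a published AEB parameter):

```python
G = 9.81  # gravitational acceleration, m/s^2

def overshoot_m(v_ms: float, actual_mu: float, assumed_mu: float = 0.8) -> float:
    """Meters by which a vehicle slides past the stopping point that an
    AEB model assuming dry-road grip (assumed_mu) predicted, when the
    real surface offers only actual_mu."""
    predicted = v_ms ** 2 / (2 * assumed_mu * G)  # where the model says it stops
    actual = v_ms ** 2 / (2 * actual_mu * G)      # where physics says it stops
    return actual - predicted

# At ~56 km/h (15.56 m/s) on wet asphalt (mu = 0.4), the vehicle
# overshoots the predicted stopping point by roughly 15 meters --
# several car lengths, even though the brake was applied in time.
slide = overshoot_m(15.56, 0.4)
```

The overshoot exists even with perfect obstacle detection, which is the article's central point: the trigger was correct, the deceleration model was not.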


Technical Data & Safety Standards

  • AAA (2021): 33% AEB collision rate in moderate-to-heavy rain at 35 mph; 69% lane-keeping failure rate under same conditions. Source: AAA Newsroom.
  • Friction values: Dry asphalt μ = 0.7–0.8; wet asphalt μ = 0.4–0.6; aquaplaning onset μ < 0.3.
  • Stopping distance physics: Halving μ from 0.8 to 0.4 doubles required braking distance; reduction to 0.2 quadruples it.
  • Euro NCAP 2026 protocols: New standards will award additional points for ADAS systems demonstrating robustness in low-visibility conditions, including integration of real-time weather and hazard data.
  • LiDAR degradation: Point cloud density and return intensity decrease measurably in heavy rain; solid-state units like the Hesai AT128 absorb some of this impact through higher channel density.
  • Radar resilience: Minimum detection range in heavy fog can remain above 260 meters — versus near-zero visibility for cameras — per U.S. DOT research data.

Solving the Low-μ Challenge with Virtual Sensors

Easyrain addresses the friction gap with DAI, a software-only Virtual Sensor Platform that operates without additional hardware, without tire dependency, and without cloud connectivity. DAI analyzes vehicle dynamics in real time to estimate road surface conditions — including low-μ states, aquaplaning onset, snow and ice — and delivers this data directly to ADAS systems.

DAI’s Virtual Sensor LOW/HIGH-μ module provides real-time detection of whether available grip is above or below 0.5, enabling the vehicle’s AEB and stability systems to recalibrate braking parameters to actual road conditions rather than fixed assumptions. Detection occurs in milliseconds, with self-calibration, and without dependence on road type, tire brand, or external data sources. This transforms μ from an unknown variable into a known input — closing the gap that camera, radar, and LiDAR arrays leave open.
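DAI's internal estimation algorithm is not public; what follows is only a hypothetical sketch of how an AEB planner might consume a binary LOW/HIGH-μ flag such as the one described above. The function name, the 0.8/0.4 fallback values, and the interface itself are all assumptions for illustration:

```python
G = 9.81            # gravitational acceleration, m/s^2
MU_THRESHOLD = 0.5  # LOW/HIGH-mu classification boundary from the text

def aeb_trigger_distance(v_ms: float, low_mu_flag: bool,
                         dry_mu: float = 0.8, wet_mu: float = 0.4) -> float:
    """Hypothetical consumer of a binary low-grip flag: the distance at
    which maximum braking should begin. When the virtual sensor reports
    grip below MU_THRESHOLD, the planner substitutes a conservative
    wet-road mu, so braking starts proportionally earlier."""
    mu = wet_mu if low_mu_flag else dry_mu
    return v_ms ** 2 / (2 * mu * G)
```

With the flag set, the trigger distance doubles, recovering the margin that a fixed dry-road assumption would have given away.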

Additional DAI modules extend coverage to aquaplaning (partial and full, with independent side sensing), snow and ice (detection before tire slip occurs), irregular terrain, tire pressure loss (ITPMS-equivalent), tire wear (±0.5 mm tread depth accuracy), and wheel misalignment — all from the same software stack integrated into existing vehicle architecture.


The Future of Active Safety

Sensor diversity — the combination of LiDAR, radar, and cameras — improves object detection in adverse weather. Yet detection alone cannot compensate for physics. The next generation of safety systems will need to close the loop between perception and grip.

Easyrain’s roadmap illustrates this trajectory. AIS (Active Safety System) is the first active system to restore grip before control is lost: by spraying pressurized fluid ahead of the tires, it physically eliminates the water layer causing aquaplaning, delivering a documented 20% reduction in braking distance on heavily wet surfaces and a 225% increase in lateral traction under aquaplaning conditions. The system weighs as little as 2.7 kg in its lightest configuration and integrates into existing vehicle architectures without redesign.

ERC (Cloud Infrastructure) extends this intelligence to the fleet and city level, aggregating real-time surface data — aquaplaning zones, low-grip corridors, road degradation — into a live map accessible to connected vehicles, fleet operators, and infrastructure managers. With AI integration, ERC evolves into a predictive layer that anticipates hazards before a vehicle encounters them.

The comparative EV test under rain is a stress test for today’s ADAS. The results show that the sensor arms race — more cameras, higher-resolution LiDAR, wider radar bands — is necessary but incomplete. Without real-time friction data, autonomous systems remain partially blind to the most critical variable in emergency braking: the road itself.


Expert Knowledge Panel: Frequently Asked Questions

Q: Why does LiDAR perform better than cameras in wet road AEB tests?

A: LiDAR emits its own laser pulses and measures the time of return, making it less dependent on ambient light and road surface reflections than cameras. In rain, cameras suffer from contrast reduction, glare, and water on the lens or windshield, all of which impair object detection. LiDAR is also affected by heavy rain — precipitation scatters the laser pulses and reduces point cloud density — but at moderate rainfall intensities it maintains more reliable obstacle detection than vision-only systems. Radar offers the strongest weather resilience but provides lower spatial resolution for close-range AEB scenarios.

Q: What is the road friction coefficient (μ) and how does it impact AEB stopping distance?

A: The road friction coefficient (μ) is a dimensionless number that quantifies the grip between a tire and the road surface. A higher μ means more available traction. On dry asphalt, μ typically ranges from 0.7 to 0.8. On wet roads, water acts as a lubricant between tire and pavement, reducing μ to 0.4–0.6. Because stopping distance is inversely proportional to μ, halving the friction coefficient from 0.8 to 0.4 doubles the braking distance required to stop a vehicle. Most AEB systems currently calculate braking interventions assuming dry-road grip levels, which means the actual stopping distance in wet or low-μ conditions will exceed the system’s prediction — a gap that can result in a collision even when the AEB activates correctly.

Q: How does Easyrain DAI improve AEB performance on wet roads without additional hardware?

A: Easyrain DAI is a software-based Virtual Sensor Platform that analyzes wheel speed signals, vehicle dynamics, and other data already available in the vehicle’s ECU to estimate road surface conditions in real time — including the current friction coefficient (μ), aquaplaning onset, snow, and ice. Because DAI operates entirely in software, it requires no additional sensors, no internet connection, and no cloud dependency. Its LOW/HIGH-μ virtual sensor detects in milliseconds whether grip has fallen below the 0.5 threshold, allowing the vehicle’s AEB and stability control systems to adjust their braking calculations to actual road conditions rather than fixed dry-road assumptions. This closes the gap between obstacle detection — handled by LiDAR, radar, and cameras — and the physics of stopping on a slippery surface.
