FREA: Feasibility-Guided Generation of Safety-Critical Scenarios with REasonable Adversariality


CoRL 2024 (Oral)

1School of Vehicle and Mobility, Tsinghua University
2The University of Hong Kong
(Corresponding authors)

FREA incorporates feasibility as guidance to generate adversarial yet AV-feasible, safety-critical scenarios for autonomous driving.

Abstract

Generating safety-critical scenarios, which are essential yet difficult to collect at scale, offers an effective method to evaluate the robustness of autonomous vehicles (AVs). Existing methods focus on optimizing adversariality while preserving the naturalness of scenarios, aiming to achieve a balance through data-driven approaches. However, without an appropriate upper bound for adversariality, the scenarios might exhibit excessive adversariality, potentially leading to unavoidable collisions. In this paper, we introduce FREA, a novel safety-critical scenarios generation method that incorporates the Largest Feasible Region (LFR) of AV as guidance to ensure the REasonableness of the Adversarial scenarios. Concretely, FREA initially pre-calculates the LFR of AV from offline datasets. Subsequently, it learns a reasonable adversarial policy that controls critical background vehicles (CBVs) in the scene to generate adversarial yet AV-feasible scenarios by maximizing a novel feasibility-dependent objective function. Extensive experiments illustrate that FREA can effectively generate safety-critical scenarios, yielding considerable near-miss events while ensuring AV's feasibility. Generalization analysis also confirms the robustness of FREA in AV testing across various surrogate AV methods and traffic environments.

Feasibility-dependent Objective

The training objective of the proposed FREA method consists of two cases, depending on whether the current state is AV-feasible:

  1. AV-feasible: Follow the original goal-based adversarial objective.
  2. AV-infeasible: Minimize the severity of infeasibility.
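The branching above can be sketched as a reward function. This is a minimal illustration, not the paper's exact implementation: the function name and the sign convention (a feasibility value of at most zero meaning the AV lies inside the Largest Feasible Region) are assumptions.

```python
def feasibility_dependent_reward(feasibility_value: float,
                                 adversarial_reward: float) -> float:
    """Sketch of a feasibility-dependent objective (illustrative names).

    `feasibility_value` <= 0 is taken to mean the state is AV-feasible
    (inside the LFR); > 0 means AV-infeasible, with larger values
    indicating more severe infeasibility.
    """
    if feasibility_value <= 0.0:
        # AV-feasible: keep the original goal-based adversarial objective.
        return adversarial_reward
    # AV-infeasible: penalize the severity of infeasibility so the
    # adversarial policy steers back toward feasible, near-miss behavior.
    return -feasibility_value
```

A usage example: in a feasible state the adversarial reward passes through unchanged, while in an infeasible state the reward becomes negative in proportion to the violation.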

Experiment Results

Largest Feasible Region Visualization

These cases show the well-trained LFR's reliability under various scenarios, providing a solid basis for CBV training.
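To give a concrete feel for what "pre-calculating the LFR offline" can mean, here is a toy 1-D sketch under assumed dynamics (s' = s + a·dt) and an assumed reachability-style backup; the grid, constraint function, and update rule are illustrative, not the paper's algorithm. The constraint h(s) > 0 marks violation, the backup V(s) = max(h(s), min_a V(s')) propagates the worst unavoidable violation, and the states with V(s) ≤ 0 form the LFR.

```python
import numpy as np

def compute_lfr(states, h, actions, dt=0.1, iters=100):
    """Toy fixed-point backup for a feasibility value function on a 1-D grid.

    V(s) = max(h(s), min_a V(s + a*dt)); states with V(s) <= 0 form the
    Largest Feasible Region. Off-grid successors are handled by linear
    interpolation (np.interp clamps at the grid edges).
    """
    V = h(states).copy()
    for _ in range(iters):
        nxt = np.stack([np.interp(states + a * dt, states, V) for a in actions])
        V = np.maximum(h(states), nxt.min(axis=0))
    return V

states = np.linspace(-2.0, 2.0, 81)
h = lambda s: np.abs(s) - 1.0            # constraint violated outside |s| <= 1
V = compute_lfr(states, h, actions=[-1.0, 0.0, 1.0])
lfr = states[V <= 0]                     # states from which violation is avoidable
```

In this toy system any state with |s| ≤ 1 can stay inside the constraint by steering inward, so the recovered LFR matches the constraint set; in FREA the analogous value function is learned from offline datasets rather than tabulated.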


Near-miss Metrics

These figures show the distributions of TTC, PET, and minimum vehicle distance across the CBV methods. PPO produces the most adversarial scenarios on all maps. FPPO-RS and FREA present similar near-miss metrics, with a higher frequency of near-miss events than standard traffic flow. Given our focus on AV-feasible near-miss events, further analysis of the AV's feasibility is required.
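As a rough illustration of one of these metrics, a minimal 1-D time-to-collision sketch follows; the function name, the constant-speed assumption, and the 1.5 s near-miss threshold are illustrative choices, not values from the paper.

```python
def time_to_collision(gap: float, rel_speed: float) -> float:
    """Constant-speed 1-D time-to-collision sketch (illustrative).

    `gap` is the bumper-to-bumper distance in meters; `rel_speed` is the
    closing speed in m/s, positive when the vehicles approach each other.
    """
    if rel_speed <= 0.0:
        return float("inf")  # not on a collision course
    return gap / rel_speed

# A near-miss event can then be flagged with a threshold, e.g. TTC < 1.5 s:
is_near_miss = time_to_collision(gap=6.0, rel_speed=5.0) < 1.5  # 1.2 s -> True
```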


Near-miss Events



FREA generates consecutive near-miss events, enhancing scenario adversariality without causing collisions.


Feasibility Metrics

Without the feasibility constraint, the PPO method leads to many severe collisions and a high infeasible rate, indicating excessively adversarial behavior. Under the feasibility constraint, both FPPO-RS and FREA mitigate the severity of these events. However, FPPO-RS struggles with the hard feasibility constraint, causing many infeasible events. In contrast, FREA strikes a better balance between adversariality and AV-feasibility, resulting in the least severe collisions.


AV Testing Performance


FREA generalizes well in AV testing due to its AV-independent training objective.


AV Training Performance


AVs pretrained with FREA demonstrate improved performance and robustness, benefiting from the AV-feasible near-miss data that FREA generates.

Demos

Overtaking at intersections

Vehicle turn-around

Overtaking on straight road

Overtaking at intersections

Vehicle crossing at intersection

Overtaking on straight road

Vehicle turn-around

Vehicle crossing at intersection

Vehicle reversing

Vehicle turn-around

CoRL Poster

BibTeX

@inproceedings{
  chen2024frea,
  title={{FREA}: Feasibility-Guided Generation of Safety-Critical Scenarios with Reasonable Adversariality},
  author={Keyu Chen and Yuheng Lei and Hao Cheng and Haoran Wu and Wenchao Sun and Sifa Zheng},
  booktitle={8th Annual Conference on Robot Learning},
  year={2024},
  url={https://openreview.net/forum?id=3bcujpPikC}
}