AutoNet2030 project presentation at IEEE VNC 2014, Paderborn

Multi-Sensor Data Fusion for Checking Plausibility of V2V Communications by Vision-based Multiple-Object Tracking

Marcus Obst, BASELABS GmbH
Laurens Hobert, HITACHI Europe
Pierre Reisdorf, Technische Universität Chemnitz
Project General Information

Project full title: Networked Automated Driving by 2030
Coordinator: Andras Kovacs / BroadBit
Major partners: CRF, Volvo Technology, Hitachi, BASELABS, EPFL, ICCS, TU Dresden, Armines, BroadBit
Starting date: November 1, 2013
Ending date: October 31, 2016
Total budget / funding: 4.6 MEUR / 3.3 MEUR
Type of project: European small/medium collaborative project
(Image source: BMW AG)
Motivation and Objectives

• Development of automated driving technology is a major current challenge.
• How can the emerging 5.9 GHz IEEE 802.11p technology best be put at the service of automated driving?
• How can sensing, control, and V2X communications be integrated into a cost-effective on-board system for automated driving?
Straightforward Integration of V2V Communications

[Diagram: a V2V entity sends CAMs/DENMs to the ego vehicle's ITS-G5 wireless unit, which passes them directly to the application/function (e.g. intersection movement assist, blind spot assist).]

Do we trust this entity?
Plausibility Checking of V2V Communications

[Diagram: multiple V2V entities send CAMs/DENMs to the ego vehicle's ITS-G5 wireless unit; a plausibility-checking stage cross-correlates them with the on-board perception before they reach the application/function (e.g. intersection movement assist, blind spot assist).]
Approach of this work

1. Take standard consumer-grade perception and communication sensors.
2. Apply Bayesian multi-sensor data fusion and generate a common perception.
3. Derive a measure to decide whether a sensor is sending valid information → perform plausibility checking.
Take standard consumer-grade perception and communication sensors

ITS-G5 unit (Atheros-based): CAMs carrying position, velocity, heading, dimensions, and time; variable 1-10 Hz rate
MobilEye camera: range, angle, width, and velocity; fixed 15 Hz rate
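Such heterogeneous inputs must first be mapped onto a common representation before fusion. A minimal sketch of one possible container; the field names and ego-frame convention are illustrative assumptions, not taken from the AutoNet2030 code base:

from dataclasses import dataclass
from typing import Optional

@dataclass
class ObjectMeasurement:
    """Common container handed to the fusion layer (illustrative)."""
    timestamp: float                 # seconds on a common time base
    source: str                      # "ITS-G5" or "MobilEye"
    x: float                         # longitudinal position [m], ego frame
    y: float                         # lateral position [m], ego frame
    velocity: float                  # speed over ground [m/s]
    heading: Optional[float] = None  # CAMs carry heading [rad]
    width: Optional[float] = None    # the camera reports object width [m]

# A CAM arrives with an absolute position that must be transformed into
# the ego frame; the camera reports range/angle, converted here to x/y.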
Approach of this work (Step 2): Apply Bayesian multi-sensor data fusion and generate a common perception
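To make step 2 concrete, a minimal sketch of the kind of Bayesian update involved: a single constant-velocity Kalman filter track consuming position measurements from both sensors in timestamp order. All noise values are illustrative assumptions, not calibrated figures for these sensors:

import numpy as np

# Per-sensor measurement noise (position variance, m^2); assumed values.
R_SENSOR = {"ITS-G5": 4.0, "MobilEye": 0.25}

def predict(x, P, dt, q=1.0):
    """Propagate a [position, velocity] state (column vector) and its
    covariance forward by dt seconds (constant-velocity model)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3.0, dt**2 / 2.0], [dt**2 / 2.0, dt]])
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, source):
    """Fuse one scalar position measurement z from the given sensor."""
    H = np.array([[1.0, 0.0]])
    S = (H @ P @ H.T).item() + R_SENSOR[source]  # innovation variance
    K = P @ H.T / S                              # Kalman gain, shape (2, 1)
    x = x + K * (z - (H @ x).item())
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Measurements from both sensors are applied in timestamp order: predict
# to each measurement's time, then update with that sensor's noise model.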
Exemplary challenges in the data fusion development process

1. Data/measurement synchronization (see the sketch after this list)
2. Sensor field of view (FOV) and handover
3. Occluded objects
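For challenge 1, a minimal sketch of merging the asynchronous sensor streams (variable 1-10 Hz CAMs, fixed 15 Hz camera) by timestamp, so the filter always processes the oldest pending measurement next; a real system additionally needs latency compensation and out-of-sequence handling:

import heapq

def merge_by_timestamp(*streams):
    """Merge per-sensor measurement streams (each already time-ordered)
    into one stream sorted by timestamp."""
    return heapq.merge(*streams, key=lambda m: m.timestamp)

# Example: cam_stream (1-10 Hz, variable) and camera_stream (15 Hz, fixed)
# are iterables of measurements exposing a .timestamp attribute:
#   for m in merge_by_timestamp(cam_stream, camera_stream):
#       fuse(m)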
Sensor fields of view and handover

• Development effort increases with the number of sensors.
• Identified objects have to be tracked and handed over to other sensors.
Occlusion

Relevant objects may not be visible to the sensor(s).
V2V communication is introduced to increase the visibility… and to increase the range of the system!
Case study: Handling occluded vehicles in the AutoNet2030 project for 360° perception

• CAN bus
• Low-cost GPS (u-blox LEA-6T)
• MobilEye front camera
• ITS-G5 equipment for C2C communication (Atheros AR5414A-B2B)
• Front radar ARS 308
• GNSS reference sensors for highly reliable ground truth
Hide & Seek
V2V allows tracking of occluded objects

[Comparison: MobilEye only vs. MobilEye + C2C communication]
Approach of this work (Step 3): Derive a measure to decide whether a sensor is sending valid information → perform plausibility checking
Can we do plausibility checking?
Plausibility Checking Results

• Neutral: the object is not visible to the on-board perception
• Valid: the V2V information complies with the on-board perception
• Invalid: the V2V information is not consistent with the on-board observations
Plausibility Checking by Track Score

• Probabilistic confidence measure (existence probability)
• Computed over time
• Considers sensor characteristics (FOV, detection probability P_D, and false alarm probability P_F)
• Naturally extends to the multiple-sensor scenario

Method: Sequential Probability Ratio Testing (SPRT)
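A minimal sketch of an SPRT-style track score of this kind. The thresholds and the P_D/P_F bookkeeping follow Wald's classical test; the specific error rates (alpha, beta) and the per-scan update are illustrative assumptions, not the authors' exact implementation:

import math

def sprt_thresholds(alpha, beta):
    """Wald's SPRT decision thresholds for a false-accept rate alpha
    and a false-reject rate beta (assumed error rates)."""
    upper = math.log((1.0 - beta) / alpha)   # accept "object exists"
    lower = math.log(beta / (1.0 - alpha))   # accept "object is false"
    return lower, upper

class TrackScore:
    """Sequential log-likelihood-ratio track score.

    In each scan where the reported object lies inside a sensor's FOV,
    a detection raises the score by log(P_D / P_F) and a missed
    detection lowers it by log((1 - P_D) / (1 - P_F)); outside the FOV
    the score is left unchanged, which yields the "neutral" case."""

    def __init__(self, p_d, p_f, alpha=0.01, beta=0.01):
        self.p_d, self.p_f = p_d, p_f
        self.lower, self.upper = sprt_thresholds(alpha, beta)
        self.llr = 0.0

    def update(self, detected, in_fov=True):
        if not in_fov:
            return "neutral"
        if detected:
            self.llr += math.log(self.p_d / self.p_f)
        else:
            self.llr += math.log((1.0 - self.p_d) / (1.0 - self.p_f))
        if self.llr >= self.upper:
            return "valid"    # V2V data confirmed by on-board perception
        if self.llr <= self.lower:
            return "invalid"  # V2V data contradicted by on-board perception
        return "neutral"      # not enough evidence yet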
Including Packet Reception Rate (PRR)

PRR-over-distance results reported in the literature:
• F. Martelli, M. E. Renda, G. Resta, and P. Santi, "A measurement-based study of beaconing performance in IEEE 802.11p vehicular networks," in Proc. IEEE INFOCOM, 2012, pp. 1503-1511.
• F. A. Teixeira, V. F. e Silva, J. L. Leoni, D. F. Macedo, and J. M. Nogueira, "Vehicular networks using the IEEE 802.11p standard: An experimental analysis," Vehicular Communications, vol. 1, no. 2, pp. 91-96, 2014.

[Plot: empirical PRR from the measurement data of the presented work]
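One way to account for the PRR is to use it as the distance-dependent detection probability P_D of the V2V "sensor" in the track score above. A sketch with a hypothetical logistic PRR curve; the parameters are illustrative, not the measured data:

import math

def empirical_prr(distance_m):
    """Hypothetical logistic fall-off of the packet reception rate with
    distance; the 300 m midpoint and 60 m slope are illustrative
    assumptions, not the measured curve of the presented work."""
    return max(0.05, min(0.95, 1.0 / (1.0 + math.exp((distance_m - 300.0) / 60.0))))

def cam_score_increment(cam_received, distance_m, p_f=0.01):
    """Per-beacon-interval contribution of the V2V 'sensor' to the track
    score: its detection probability P_D is the expected PRR at the
    current range, so a CAM missing from a distant vehicle is penalized
    less than one missing at close range."""
    p_d = empirical_prr(distance_m)
    if cam_received:
        return math.log(p_d / p_f)
    return math.log((1.0 - p_d) / (1.0 - p_f))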
Results
Plausibility Checking: Valid Scenario

The ego vehicle follows the V2V vehicle, which finally performs a left-turn maneuver.
Results of Plausibility Checking: Valid Scenario

[Plot: CAMs only (baseline solution): the object remains "neutral".]
[Plot: CAMs + MobilEye: the classification changes from "neutral" to "valid".]
Plausibility Checking: Attacker Scenario

A ghost vehicle overtakes from the left and enters the FOV of the on-board perception.
Results of Plausibility Checking: Attacker Scenario

[Plot: CAMs + MobilEye: the ghost vehicle's classification changes from "neutral" to "invalid".]
What about the efficiency?

Efficient design of the sensor data fusion for ADAS and automated driving with BASELABS Connect and Create.
The selected tools allow the developer to spend their time on the differentiating parts of the system

                                    Development time   Lines of code
System integration                  10%                2450
Application and data fusion         20%                520
Model development (= performance)   70%                200-900
Conclusion and Outlook

• Bayesian multi-sensor data fusion approaches can be successfully applied to plausibility checking.
• Standard components can be easily integrated with available tools.
• The approach naturally extends to other on-board perception sensors such as radars.
• Development time should be spent on designing and tuning models.

Open questions and next steps:
• Perform a full centralized raw-sensor data fusion including raw GNSS signals.
• What about implementing such a system directly inside a wireless unit as a kind of application (e.g. based on Linux/ARM)?
Thank you!

See the full video at http://bit.ly/1tJCGTo

Marcus Obst
[email protected]
BASELABS GmbH
Data fusion component developed with BASELABS Create

[Diagram: sensor data input (from a real sensor or recorded data) and sensor calibration info feed the data fusion component; results are shown in a bird's-eye visualization.]