AutoNet2030 booklet
Co-operative Systems in Support of Networked Automated Driving by 2030

Final Event
27 October 2016
AstaZero, Sandhult, Sweden

Co-funded by the EU
Angelos Amditis
Project Coordinator
The AutoNet2030 project designed and developed cooperative systems in support of networked automated driving, aiming for a 2020-2030 deployment horizon.

With a total budget of 4.6 million Euros, 3.35 million of which was co-funded by the European Commission, 9 partners from 8 European countries joined their expertise to contribute to the vision of cooperative and networked automated driving technology.

It is a great pleasure to welcome you to the AutoNet2030 final event, which will offer the opportunity to experience the AutoNet2030 system, showcasing demanding vehicle-automation scenarios with associated customer and societal value.
Contents

The AutoNet2030 project overview ... 4
  Motivation and Objectives ... 4
  Technical Approach ... 4
Main Results ... 5
  AutoNet2030 control: distributed decision making in automation ... 5
  AutoNet2030 perception: integrating 360° multi-sensor data ... 5
  AutoNet2030 communications: extending V2X messaging ... 5
  AutoNet2030 HMI: a dual-display (distributed) approach ... 6
Demonstration settings and scenarios ... 7
  Highway scenario ... 7
  Urban scenario ... 7
AutoNet2030 prototype vehicles ... 8
  SCANIA demonstrator ... 8
  HITACHI demonstrator ... 9
  CRF demonstrator ... 10
  INRIA demonstrator ... 11
The AutoNet2030 project overview
Motivation and Objectives
Triggered by the so-far limited convergence between sensor-based automation and cooperative V2X communications, the project carried out research and validated procedures and algorithms for 802.11p-based interactive control among cooperative vehicles, focusing on:
• A cooperative, decentralised control system for fully-automated vehicles and advised manoeuvring of manually-driven vehicles.
• V2X-message-based communications to enable automated manoeuvre planning and traffic-flow optimisation, which have been fed to ETSI ITS standardisation groups.
• An on-board sensor-based architecture to enable reliable positioning and automated lane-keeping.
Technical Approach
• Research and specification of cooperative manoeuvring control algorithms and information sharing.
• Specification and standardisation of required enhancements to existing cooperative communication protocol standards.
• Development of perception processing modules and multi-source data fusion.
• HMI specification and implementation for advised manoeuvring of manually-driven vehicles.
• High-fidelity simulation-based evaluation and realistic test-track validation.
Main Results
AutoNet2030 control: distributed decision making in automation
To mitigate accidents and enhance road efficiency, the AutoNet2030 Manoeuvre and Control module comprises two cooperative decision-making and manoeuvre functionalities that allow autonomous vehicles to coordinate in a convoy or at an intersection without traffic lights. For the highway (convoy) scenarios, EPFL developed and implemented a distributed, graph-based control algorithm that combines information from perception and V2X communication. The algorithm has been adapted to combinations of heterogeneous platforms differing in length, dynamics, and manual or automated control. ARMINES proposed a hierarchical control architecture that uses a centralised intersection controller to assign crossing priorities to vehicles and multiple distributed vehicle controllers to cross the intersection safely and efficiently. Furthermore, all AutoNet2030 cooperative functionalities are supported by a motion planner based on Model Predictive Control that generates versatile, obstacle-free trajectories for different driving scenarios.
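The flavour of such distributed, graph-based convoy control can be illustrated with a minimal sketch. This is not the project's actual EPFL algorithm; the gains, the target gap and the neighbour representation are assumptions for illustration. Each vehicle computes its own acceleration from the states of its graph neighbours alone, received via V2X or perception:

```python
# Illustrative sketch of distributed, graph-based convoy control.
# DESIRED_GAP, K_POS and K_VEL are assumed values, not project parameters.

DESIRED_GAP = 20.0       # target inter-vehicle spacing [m] (assumed)
K_POS, K_VEL = 0.5, 0.8  # position / velocity feedback gains (assumed)

def convoy_accel(own_pos, own_vel, neighbours):
    """Acceleration command from neighbour states only (no central controller).

    neighbours: list of (position, velocity, slots_ahead) tuples, where
    slots_ahead is how many convoy slots the neighbour is ahead (+) or
    behind (-) of this vehicle in the communication graph.
    """
    accel = 0.0
    for pos, vel, slots_ahead in neighbours:
        # Where the neighbour should be relative to us, given the convoy geometry.
        desired_offset = slots_ahead * DESIRED_GAP
        accel += K_POS * ((pos - own_pos) - desired_offset)  # spacing error
        accel += K_VEL * (vel - own_vel)                     # velocity error
    return accel / max(len(neighbours), 1)

# Ego at 0 m doing 16 m/s; leader one slot ahead at 25 m doing 17 m/s:
# the gap is too large and the leader is faster, so the command is positive.
cmd = convoy_accel(0.0, 16.0, [(25.0, 17.0, 1)])
```

Because every vehicle runs the same rule on locally available neighbour states, the convoy converges to the desired spacing without any vehicle acting as a central coordinator.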
AutoNet2030 perception: integrating 360° multi-sensor data
To derive good decisions for automated driving applications, reliable knowledge of the host vehicle's surroundings is required. One major outcome of the AutoNet2030 perception subsystem is the augmentation of the horizon of local on-board perception sensors (i.e., radar, lidar, camera, GNSS) with data from cooperative V2X messaging. In this way, robust 360° coverage is realised, where every spot around the vehicle is monitored by at least two independent sensors. Moreover, as the AutoNet2030 demonstration comprises different vehicle types (i.e., trucks and passenger cars), a unified environmental model has been developed which can be easily deployed to different vehicles and configured for particular sensor setups.
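The core idea of combining on-board perception with V2X-reported objects can be sketched as follows. The data model and the matching rule below are assumptions for illustration, not the project's actual LDM implementation: locally perceived and remotely reported objects are merged into one list, and each object records which sources observed it.

```python
# Illustrative sketch (assumed data model): merging locally perceived objects
# with objects reported over V2X into one environmental model.

from math import hypot

MATCH_RADIUS = 2.0  # detections closer than this count as the same object [m] (assumed)

def fuse(local_objects, v2x_objects):
    """Each object is a dict with 'x', 'y' coordinates in a common frame."""
    fused = [{**o, "sources": {"onboard"}} for o in local_objects]
    for remote in v2x_objects:
        for obj in fused:
            if hypot(obj["x"] - remote["x"], obj["y"] - remote["y"]) < MATCH_RADIUS:
                obj["sources"].add("v2x")  # same object confirmed by a second source
                break
        else:
            fused.append({**remote, "sources": {"v2x"}})  # beyond the on-board horizon
    return fused

# One local detection plus two V2X reports: the nearby pair merges into a
# doubly-confirmed object, the far report extends the perception horizon.
model = fuse([{"x": 10.0, "y": 0.0}], [{"x": 10.5, "y": 0.3}, {"x": 40.0, "y": 2.0}])
```

The "sources" set is what makes the redundancy claim checkable: spots covered by at least two independent sources can be treated with higher confidence than V2X-only or sensor-only objects.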
AutoNet2030 communications: extending V2X messaging
Use cases for cooperative autonomous driving, such as convoy driving and cooperative lane changing, lead to new requirements for vehicle-to-vehicle/infrastructure (V2X) communication. These requirements are beyond the scope of current V2X communication systems and standards developed in ETSI and SAE/IEEE. AutoNet2030 has extended the design and specifications of the V2X protocols, specifically the facilities-layer and networking protocols, to support the AutoNet2030 use cases. These specifications have been implemented and verified in different environments, including a communication simulator, an integrated sub-microscopic robotics/communication simulator and a real prototype.
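To give a sense of what a facilities-layer extension for convoy coordination might carry beyond today's periodic awareness messages, here is a purely hypothetical message structure. Every field name below is an assumption for illustration; the actual AutoNet2030 specifications were fed to the ETSI ITS standardisation groups and are not reproduced here.

```python
# Hypothetical facilities-layer message for convoy coordination.
# All field names and types are illustrative assumptions, not the
# AutoNet2030 or ETSI specification.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ConvoyCoordinationMessage:
    station_id: int          # sender's ITS station identifier
    timestamp_ms: int        # message generation time
    convoy_id: int           # convoy the sender belongs to
    slot_index: int          # sender's position within the convoy
    speed_mps: float         # current speed
    lane_id: int             # current lane
    intended_manoeuvre: str  # e.g. "keep", "join", "leave", "lane_change"
    member_ids: List[int] = field(default_factory=list)  # known convoy members

# A vehicle in slot 2 of convoy 7 announcing that it intends to hold position:
msg = ConvoyCoordinationMessage(
    station_id=42, timestamp_ms=1_477_551_600_000, convoy_id=7,
    slot_index=2, speed_mps=18.0, lane_id=1, intended_manoeuvre="keep",
)
```

The point is that convoy state (membership, slot, intended manoeuvre) is exactly the kind of information that plain awareness messages do not carry, which is why the use cases pushed beyond the existing standards.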
AutoNet2030 HMI: a dual-display (distributed) approach
To cope with the increased awareness capabilities provided by the ego-vehicle perception system as well as cooperative V2X messaging, the AutoNet2030 HMI relies on a carefully designed system of wire-frames and visual objects to present a rich set of data with maximum clarity. The adopted approach involves an innovative dual-display system that relies on customised Android applications to efficiently inform (and occasionally interact with) the passengers of both automated and manually-driven vehicles.
Demonstration settings and scenarios
Highway scenario
Setting: Live demonstration in the multi-lane area of the AstaZero active-safety test track.
Vehicles involved: One heavy-duty automated truck (SCANIA), one automated passenger vehicle (HITACHI) and one completely manually-driven passenger car (CRF).
Speed: 60-70 km/h
Highlights: Through convoy-related manoeuvring (e.g., convoy formation and maintenance, convoy interaction with a manually-driven vehicle), the capability of the AutoNet2030 system for cooperative sensing, decision-making and fail-safe manoeuvring will be demonstrated.
Urban scenario
Setting: Video projection of the scenario as recorded at the INRIA premises, France.
Vehicles involved: Fully automated, electric prototypes
Speed: Up to 30 km/h
Highlights: Through urban-environment manoeuvring (e.g., car following, intersection crossing), the capability of the AutoNet2030 system to efficiently use cooperative sensing and coordinate fail-safe cooperative manoeuvring on urban roads and intersections will be demonstrated.
AutoNet2030 prototype vehicles
SCANIA demonstrator

SCANIA prototype: Conceptus, Scania R730 V8 tractor unit
Perception Sensor(s):
• Front long-range 77 GHz radar
• Forward-looking mono camera
• Side-looking 24 GHz radars
Communications:
• Cellular 4G/LTE modem
• ITS-G5 COHDA unit with Hitachi SW
GNSS:
• Commercial-grade GNSS
• Novel RTK-enhanced GNSS (~2 cm global accuracy)
Functions:
• Automated cooperative driving
• Automated convoy driving
• Automated lane change
HMI display(s):
• AutoNet2030 HMI interaction on an Android tablet
• Full-screen prototype cluster
• In-dash screen for debugging purposes
Processing unit:
• Scania ECU(s) for low-level controllers, safety systems and vehicle-gateway functionality
• Scania ECU for RTK processing and output
• Specialised real-time processing unit based on an Intel Core i7 for perception, LDM and higher-level control
HITACHI demonstrator

HITACHI prototype: Volkswagen Passat CC
Perception Sensor(s):
• Front long-range 77 GHz radar
• Hitachi stereo camera
• Clarion all-surround-view camera system
Communications:
• ITS-G5 Cohda Wireless MK2 dual-channel unit with embedded GNSS receiver and magnetic antenna (Hitachi C2X middleware)
GNSS:
• GNSS with RTK precise-positioning solution for AutoNet2030 (Broadbit SW)
• GNSS positioning (alternative solution) using VBOX 3i (dual channel + RTK) and IMU (~2 cm accuracy)
Functions:
• Automated cooperative driving
• Automated convoy driving
• Automated lane change
HMI display(s):
• Integrated Android-based head unit with AutoNet2030 HMI app
Processing unit:
• In-vehicle PC with Intel Gen 5 i7-5650U (2.2 GHz – 3.1 GHz) running a vehicle data gateway (HIT SW), higher-level vehicle control, perception, LDM and HMI support on Linux with an RT-patched kernel
• MicroAutoBox for low-level vehicle control
CRF demonstrator

CRF prototype: FIAT 500L Trekking
Perception Sensor(s):
Frontal lidar Valeo ScaLa 1403:
• Longitudinal distance range ≈ 80 m
• Horizontal FoV 145°
• Vertical FoV 3.2°
• Tracked-object list of detected objects, update freq. 25 Hz
Communications:
• ITS-G5 Cohda Wireless MK5 dual-channel 802.11p unit, with embedded GNSS receiver and magnetic antenna (Hitachi SW)
GNSS:
• GNSS with RTK precise-positioning solution for AutoNet2030 (Broadbit SW)
• GNSS positioning (alternative solution) using UBLOX EVK-6H (CRF SW)
Functions:
• Manual & cooperative driving functionalities through the AutoNet2030 HMI (CRF & ICCS SW)
HMI display(s):
• HUD (8" Android tablet) and HMI interaction tablet (8.4" Android tablet), with dedicated AutoNet2030 applications (CRF & ICCS SW)
Processing unit:
CRF ECU for vehicle dynamic data gateway and lidar gateway functionalities (CRF SW), with Intel Gen 3 Core i7-3517UE 1.7 GHz
Hosted services:
• Perception and LDM (Baselabs & Hitachi SW)
• Cooperative convoy controller (EPFL SW)
• HMI management (CRF SW)
INRIA demonstrator

INRIA prototype: Yamaha electric buses
Perception Sensor(s):
• 2 Ibeo Alasca XT laser sensors
• 1 Sick LMS511
• 1 Ibeo Lux
• Axis 215 PTZ camera
Communications:
• WiFi device
GNSS:
• 1 Ashtech Z-Xtreme RTK-GPS
• 1 IMU440CA inertial measurement unit
Functions:
• Low-speed autonomous driving, obstacle avoidance
HMI display(s):
• 10" LCD display
Processing unit:
• 1 Ibeo ECU
• 2 computers in master–slave configuration
CONSORTIUM
PROJECT FACTS

Total Budget: 4.6 MEUR
EU Funding: 3.35 MEUR
Duration: 36 months
Start date: 1st November 2013
End date: 31st October 2016
Contract n°: 610542
Project Coordinator: Institute of Communication & Computer Systems (ICCS)
Call Identifier: FP7-ICT-2013-10
www.autonet2030.eu
CONTACT US
Coordinator: Dr. Angelos Amditis
Institute of Communication and Computer Systems (ICCS)
E-mail: [email protected]
This project has received funding from the European Union’s Seventh Framework
Programme for research, technological development and demonstration under grant
agreement no 610542.