
1       Functions of Environment Sensing Subsystem

2       Reference Architecture of Environment Sensing Subsystem

3       I/O Data of Environment Sensing Subsystem

4       Functions of Environment Sensing Subsystem’s AI Modules

5       I/O Data of Environment Sensing Subsystem’s AI Modules

6       Data Types

1        Functions of Environment Sensing Subsystem

The Environment Sensing Subsystem (ESS) of a Connected Autonomous Vehicle (CAV):

  1. Acquires electromagnetic and acoustic data from the Environment using its sensors.
  2. Receives Environment Data (e.g., temperature, pressure, humidity, etc.) from the Motion Actuation Subsystem.
  3. Receives an initial estimate of the Ego CAV’s Spatial Attitude generated by the Motion Actuation Subsystem.
  4. Produces a sequence of Basic Environment Representations (BER) for the journey.
  5. Passes the BER to the Human-CAV Interaction Subsystem (HCI) and Autonomous Motion Subsystem (AMS).
  6. Requests elements of the Full Environment Representations (FER) produced by AMS.
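The six functions above can be sketched as a minimal Python interface. All class, field, and method names below are illustrative assumptions for this sketch; the specification does not define an API.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SpatialAttitude:
    # Geographical Position and Orientation; time derivatives omitted here.
    position: tuple      # e.g. (latitude, longitude, altitude) -- assumed layout
    orientation: tuple   # e.g. (roll, pitch, yaw) -- assumed layout

@dataclass
class BasicEnvironmentRepresentation:
    timestamp: float
    scene_descriptors: dict = field(default_factory=dict)  # per-EST descriptors
    environment_data: dict = field(default_factory=dict)   # temperature, etc.
    spatial_attitude: Optional[SpatialAttitude] = None

class EnvironmentSensingSubsystem:
    """Hypothetical object mirroring ESS functions 1-6 above."""

    def __init__(self):
        # Function 4: the sequence of BERs produced during the journey.
        self.ber_sequence = []

    def produce_ber(self, timestamp, scene_descriptors, environment_data,
                    spatial_attitude):
        # Combine sensor-derived descriptors (function 1), MAS Environment
        # Data (function 2), and the Spatial Attitude (function 3) into a
        # new BER appended to the journey sequence (function 4).
        ber = BasicEnvironmentRepresentation(
            timestamp=timestamp,
            scene_descriptors=scene_descriptors,
            environment_data=environment_data,
            spatial_attitude=spatial_attitude,
        )
        self.ber_sequence.append(ber)
        return ber  # Function 5: this BER is passed to HCI and AMS.
```

Functions 5 and 6 (delivering the BER and requesting FER elements from AMS) would be calls on the HCI and AMS interfaces and are not modelled here.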

2        Reference Architecture of Environment Sensing Subsystem

Figure 5 gives the Reference Model of Environment Sensing Subsystem.

Figure 5 – Environment Sensing Subsystem Reference Model

The typical sequence of operations of the Environment Sensing Subsystem is:

  1. Compute the CAV’s Spatial Attitude using the initial Spatial Attitude provided by the Motion Actuation Subsystem and GNSS Data.
  2. Receive Environment Sensing Technology (EST)-specific Data, e.g., RADAR Data provided by the RADAR EST.
  3. Produce and send EST-specific Alert, if necessary, to Autonomous Motion Subsystem.
  4. Access the Basic Environment Representations produced at previous times, if necessary.
  5. Produce EST-specific Scene Descriptors, e.g., the RADAR Scene Descriptors.
  6. Integrate the Scene Descriptors from different ESTs into the Basic Environment Representation.
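The six-step cycle above can be sketched as follows, assuming a simple per-EST interface (`capture`, `check_alert`, `describe`); these names and the trivial fusion/integration placeholders are illustrative assumptions, not part of the specification.

```python
def fuse_attitude(initial_attitude, gnss_data):
    # Step 1 (placeholder): a real AIM would fuse the MAS estimate with
    # the Pose computed from GNSS Data.
    return {**initial_attitude, **gnss_data}

def integrate(descriptors, spatial_attitude):
    # Step 6 (placeholder): integrate all EST Scene Descriptors into a BER.
    return {"scene_descriptors": descriptors,
            "spatial_attitude": spatial_attitude}

def ess_cycle(ests, gnss_data, initial_attitude, ber_history, send_alert):
    spatial_attitude = fuse_attitude(initial_attitude, gnss_data)  # Step 1
    descriptors = {}
    for est in ests:                       # e.g. RADAR, LiDAR, Visual ESTs
        data = est.capture()               # Step 2: EST-specific Data
        alert = est.check_alert(data)
        if alert is not None:              # Step 3: Alert goes straight to AMS
            send_alert(est.name, alert)
        context = ber_history[-1] if ber_history else None   # Step 4
        descriptors[est.name] = est.describe(data, context)  # Step 5
    ber = integrate(descriptors, spatial_attitude)           # Step 6
    ber_history.append(ber)
    return ber
```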

Note that Figure 5 assumes that:

  1. Traffic Signalisation Recognition produces the Road Topology by analysing Visual Data. The model of Figure 5 can easily be extended to the case where Data from other ESTs is processed to compute or help compute the Road Topology.
  2. Environment Sensing Technologies are individually processed. However, an implementation may combine two or more Scene Descriptors AIMs handling two or more ESTs, provided the relevant interfaces are preserved.

3        I/O Data of Environment Sensing Subsystem

The currently considered Environment Sensing Technologies (EST) are:

  1. GNSS – Global Navigation Satellite System (~1 & 1.5 GHz Radio).
  2. Spatial Attitude of the Ego CAV – Geographical Position and Orientation and their time derivatives up to the 2nd order.
  3. Visual Data in the visible range, possibly supplemented by depth information (400 to 700 THz).
  4. LiDAR Data (~200 THz infrared).
  5. RADAR Data (~25 & 75 GHz).
  6. Ultrasound Data (> 20 kHz).
  7. Audio Data in the audible range (16 Hz to 20 kHz).
  8. Spatial Attitude (from the Motion Actuation Subsystem).
  9. Other environmental data (temperature, humidity, …).

The Offline Map data can be accessed either from stored information or online.

Table 9 gives the input/output data of the Environment Sensing Subsystem.

Table 9 – I/O data of Environment Sensing Subsystem

| Input data | From | Comment |
| --- | --- | --- |
| RADAR Data | ~25 & 75 GHz Radio | Environment Capture with RADAR |
| LiDAR Data | ~200 THz infrared | Environment Capture with LiDAR |
| Visual Data | Video (400-800 THz) | Environment Capture with visual cameras |
| Ultrasound Data | Audio (>20 kHz) | Environment Capture with Ultrasound |
| Offline Map Data | Local storage or online | cm-level data at time of capture |
| Audio Data | Audio (16 Hz-20 kHz) | Environment or cabin Capture with Microphone Array |
| Microphone Array Geometry | Microphone Array | Disposition of microphones in the array |
| Global Navigation Satellite System (GNSS) Data | ~1 & 1.5 GHz Radio | Get Pose from GNSS |
| Spatial Attitude | Motion Actuation Subsystem | To be fused with Pose from GNSS Data |
| Other Environment Data | Motion Actuation Subsystem | Temperature, Humidity, etc. |

| Output data | To | Comment |
| --- | --- | --- |
| Alert | Autonomous Motion Subsystem | Critical information from an EST |
| Basic Environment Representation | Autonomous Motion Subsystem | ESS-derived representation of Environment |

4        Functions of Environment Sensing Subsystem’s AI Modules

Table 10 gives the functions of all AIMs of the Environment Sensing Subsystem.

Table 10 – Functions of Environment Sensing Subsystem’s AI Modules

| AIM | Function |
| --- | --- |
| Audio Scene Description | Produces Audio Scene Descriptors and Alert. |
| LiDAR Scene Description | Produces LiDAR Scene Descriptors and Alert. |
| Online Map Scene Description | Produces Online Map Data Scene Descriptors. |
| RADAR Scene Description | Produces RADAR Scene Descriptors and Alert. |
| Ultrasound Scene Description | Produces Ultrasound Scene Descriptors and Alert. |
| Visual Scene Description | Produces Visual Scene Descriptors and Alert. |
| Traffic Signalisation Description | Produces Traffic Signalisation Descriptors. |
| Spatial Attitude Generation | Computes the CAV Spatial Attitude from the CAV Centre using GNSS and Motion Actuation Subsystem information. |
| Environment Data Fusion | Receives Scene Descriptors and critical Environment Representation as Alert from the different ESTs. Produces Alert and the Basic Environment Representation (BER), which includes all available information from the ESS: 1) the Scene Descriptors generated by the different EST-specific Scene Description AIMs; 2) Environment Data; 3) the Spatial Attitude of the Ego CAV (Figure 6). |

Figure 6 – Spatial Attitude in a CAV
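The Environment Data Fusion behaviour described above can be sketched as follows; the dictionary layout of the BER and the function name are assumptions for illustration only.

```python
def environment_data_fusion(scene_descriptors, environment_data,
                            spatial_attitude, est_alerts):
    """Merge the per-EST Scene Descriptors (1), Environment Data (2), and
    the Ego CAV's Spatial Attitude (3) into a Basic Environment
    Representation, and re-emit any critical EST information as an Alert."""
    ber = {
        "scene_descriptors": dict(scene_descriptors),
        "environment_data": dict(environment_data),
        "spatial_attitude": spatial_attitude,
    }
    # Keep only the ESTs that actually raised an Alert.
    alerts = [a for a in est_alerts if a is not None]
    return ber, (alerts or None)
```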

5        I/O Data of Environment Sensing Subsystem’s AI Modules

For each AIM (1st column), Table 11 gives the input (2nd column) and the output data (3rd column) of the Environment Sensing Subsystem. Note that the Basic Environment Representation in column 2 refers to the previously produced BER.

Table 11 – I/O Data of Environment Sensing Subsystem’s AI Modules

| AIM | Input | Output |
| --- | --- | --- |
| Audio Scene Description | Audio Data; Basic Environment Representation | Alert; Audio Scene Descriptors |
| Visual Scene Description | Visual Data; Basic Environment Representation | Alert; Visual Scene Descriptors |
| LiDAR Scene Description | LiDAR Data; Basic Environment Representation | Alert; LiDAR Scene Descriptors |
| RADAR Scene Description | RADAR Data; Basic Environment Representation | Alert; RADAR Scene Descriptors |
| Ultrasound Scene Description | Ultrasound Data; Basic Environment Representation | Alert; Ultrasound Scene Descriptors |
| Map Scene Description | Offline Map Data; Basic Environment Representation | Alert; Map Scene Descriptors |
| Traffic Signalisation Description | Visual Data; Basic Environment Representation | Alert; Traffic Signalisation Descriptors |
| Spatial Attitude Generation | GNSS Data; Spatial Attitude from MAS | Spatial Attitude |
| Environment Data Fusion | RADAR Scene Descriptors; LiDAR Scene Descriptors; Visual Scene Descriptors; Traffic Signalisation Descriptors; Ultrasound Scene Descriptors; Map Scene Descriptors; Audio Scene Descriptors; Spatial Attitude; Other Environment Data | Alert; Basic Environment Representation |
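As one concrete reading of the Spatial Attitude Generation row, the Spatial Attitude from MAS and the Pose derived from GNSS Data might be fused per axis by variance weighting. This fusion rule and all names below are illustrative assumptions, not part of the specification.

```python
def fuse_position(mas_position, mas_var, gnss_position, gnss_var):
    """Variance-weighted, per-axis fusion of two position estimates.
    The estimate with the smaller variance receives the larger weight."""
    w_mas = gnss_var / (mas_var + gnss_var)   # weight on the MAS estimate
    return tuple(w_mas * m + (1.0 - w_mas) * g
                 for m, g in zip(mas_position, gnss_position))
```

With equal variances the result is the midpoint of the two estimates; as the MAS variance grows, the fused position moves toward the GNSS-derived one.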

6        Data Types

An initial version of the ESS Data Types is available.