This is the public page of the Connected Autonomous Vehicle (CAV) standard.  See the MPAI-CAV homepage.

MPAI-CAV is an MPAI project aiming at standardising all IT components required to implement a Connected Autonomous Vehicle (CAV), i.e., a system capable of moving autonomously – save for the exceptional intervention of a human – based on the analysis of the data produced by a range of sensors exploring the environment and the information transmitted by other sources in range, e.g., other CAVs and roadside units (RSU). Standardisation of CAV components, called AI Modules (AIM) in MPAI, will cover their functionality and input/output data formats.

Continue reading for a general overview, and review the MPAI-CAV Use Cases and Functional Requirements WD0.10 and the MPAI-CAV Environment Sensing Subsystem (ESS).

Fig. 1 – The 5 MPAI-CAV subsystems
MPAI-CAV addresses the main interacting CAV subsystems. HCI conveys instructions from humans to the CAV; sensors in ESS (and MAS) allow the CAV to create a Basic World Representation (BWR). BWRs are exchanged with CAVs in range and allow AMS to create the Full World Representation, based on which commands are issued to MAS.
Fig. 2 – Human-CAV Interaction (HCI)
Human-CAV Interaction (HCI) recognises the human CAV rights holder, responds to commands and queries from humans outside the CAV, provides an extended environment representation (Full World Representation) for humans to use, senses human activities during travel, and may activate other subsystems as required by humans. See the MMC-HCI Use Case and Functional Requirements WD0.5.
Fig. 3 – Environment Sensing Subsystem (ESS)
ESS receives information on electromagnetic (GPS, Lidar, Radar, Visual) and acoustic (Ultrasound and Audio) carriers from its sensors, and other types of information (e.g., temperature, velocity) from MAS. These sources of information are used to create the Basic World Representation, which is passed on to AMS. Information is also recorded in a local store. See MPAI-CAV Environment Sensing Subsystem (ESS).
Fig. 4 – Autonomous Motion Subsystem (AMS)
AMS is the “brain” of the CAV. It holds a dialogue with HCI (e.g., about moving the CAV to a waypoint), receives the BWR from ESS, exchanges BWRs with CAVs in range, computes the Full World Representation (FWR) from all available information, and sends commands to MAS. It also records its processes.
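The BWR-to-FWR flow described above can be sketched in Python. Note that MPAI-CAV has not yet standardised these data formats, so every class, field, and function name below is an illustrative assumption, not part of the standard:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Hypothetical sketch only: all names and fields are assumptions.

@dataclass
class DetectedObject:
    object_id: str
    position: Tuple[float, float, float]   # (x, y, z) in a shared frame
    velocity: Tuple[float, float, float]   # (vx, vy, vz)
    confidence: float                      # 0.0 .. 1.0

@dataclass
class BasicWorldRepresentation:
    """BWR: the environment as seen by one CAV's own sensors (ESS)."""
    source_cav: str
    timestamp: float
    objects: List[DetectedObject] = field(default_factory=list)

@dataclass
class FullWorldRepresentation:
    """FWR: AMS's fusion of its own BWR with BWRs from CAVs in range."""
    objects: List[DetectedObject] = field(default_factory=list)

def fuse(own_bwr: BasicWorldRepresentation,
         remote_bwrs: List[BasicWorldRepresentation]) -> FullWorldRepresentation:
    """Naive fusion: merge all reported objects, keeping the
    highest-confidence report per object_id. A real AMS would do
    far more (alignment, tracking, conflict resolution)."""
    best: Dict[str, DetectedObject] = {}
    for bwr in [own_bwr, *remote_bwrs]:
        for obj in bwr.objects:
            if obj.object_id not in best or obj.confidence > best[obj.object_id].confidence:
                best[obj.object_id] = obj
    return FullWorldRepresentation(objects=list(best.values()))
```

The point of the sketch is the data flow: each CAV produces a BWR from its own sensors, BWRs are exchanged, and AMS merges them into a single FWR richer than any one vehicle's view.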
Fig. 5 – Motion Actuation Subsystem (MAS)
MAS sends to ESS all necessary data from its sensors, e.g., spatial data (coordinates, velocity, acceleration) and environment data (temperature, humidity).

MAS receives from AMS commands to move CAV to a new state within an assigned period of time and sends feedback about the execution of the command.
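The command/feedback exchange between AMS and MAS might be modelled as follows. This is a minimal sketch; the message fields and the `MotionActuationSubsystem` class are assumptions for illustration, since the standard's data formats are still at the requirements stage:

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical sketch of the AMS -> MAS command and the MAS -> AMS
# feedback; all field names are illustrative assumptions.

@dataclass
class MotionCommand:
    target_position: Tuple[float, float]  # desired (x, y) in metres
    target_speed: float                   # m/s
    deadline_s: float                     # assigned period of time to reach the new state

@dataclass
class MotionFeedback:
    command_executed: bool
    elapsed_s: float
    note: str = ""

class MotionActuationSubsystem:
    """Toy MAS that 'executes' a command and reports feedback to AMS."""

    def execute(self, cmd: MotionCommand) -> MotionFeedback:
        # A real MAS would drive brakes, steering, and motor; here we
        # simply pretend execution takes half the allowed time.
        elapsed = cmd.deadline_s / 2
        return MotionFeedback(command_executed=elapsed <= cmd.deadline_s,
                              elapsed_s=elapsed)
```

Usage: `MotionActuationSubsystem().execute(MotionCommand((10.0, 0.0), 5.0, 4.0))` returns a `MotionFeedback`, mirroring the command/feedback loop the paragraph describes.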

The news publication plan is given below:

  1. Connected Autonomous Vehicles in MPAI
  2. Why an MPAI-CAV standard?
  3. An Introduction to the MPAI-CAV Subsystems
  4. Human-CAV interaction
  5. Environment Sensing Subsystem
  6. Autonomous Motion Subsystem
  7. Motion Actuation Subsystem

MPAI-CAV is at the level of Use Cases and Functional Requirements.

If you wish to participate in this work, you have the following options:

  1. Join MPAI
  2. Participate, by sending an email to the MPAI Secretariat, until the MPAI-CAV Functional Requirements are approved (after that, only MPAI members can participate).
  3. Keep an eye on this page.

Return to the MPAI-CAV page