This is the public page of the Connected Autonomous Vehicle (CAV) standard. See the MPAI-CAV homepage.
MPAI-CAV is an MPAI project aiming at standardising all the IT components required to implement a Connected Autonomous Vehicle (CAV), i.e., a system capable of moving autonomously – save for the exceptional intervention of a human – based on the analysis of the data produced by a range of sensors exploring the environment and on the information transmitted by other sources in range, e.g., other CAVs and roadside units (RSU). Standardisation of the CAV components, called AI Modules (AIMs) in MPAI, will cover their functionality and input/output data formats.
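To make the AIM notion concrete, the sketch below shows what an AIM interface could look like. It is a minimal illustration in Python; the class names, channel names and signatures are assumptions made for this page, not definitions from the standard.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict

class AIModule(ABC):
    """Illustrative AI Module (AIM): a component with a declared function
    and named input/output channels (hypothetical API)."""

    inputs: tuple = ()    # names of the input channels this AIM consumes
    outputs: tuple = ()   # names of the output channels this AIM produces

    @abstractmethod
    def process(self, data: Dict[str, Any]) -> Dict[str, Any]:
        """Map one set of input payloads to output payloads."""

class SpeechRecognition(AIModule):
    """Toy AIM: consumes speech samples, produces text."""
    inputs = ("speech",)
    outputs = ("text",)

    def process(self, data):
        # A real AIM would run a speech-to-text model here.
        return {"text": f"<transcript of {len(data['speech'])} samples>"}
```

Standardising the functionality and the input/output data formats of such modules is what would let AIMs from different vendors interoperate within one CAV.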
Continue reading for a general overview and review the MPAI-CAV Use Cases and Functional Requirements WD0.14.
Fig. 1 – The 5 MPAI-CAV subsystems
MPAI-CAV addresses four main interacting CAV subsystems: the Human-CAV Interaction (HCI) Subsystem gives instructions to the CAV; the sensors of the Environment Sensing Subsystem (ESS) allow the CAV to create a Basic World Representation (BWR), which is exchanged with CAVs in range; the Autonomous Motion Subsystem (AMS) uses the available BWRs to create the Full World Representation (FWR), based on which it issues commands to the Motion Actuation Subsystem (MAS). Read more.
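This dataflow can be sketched in a few lines of Python. The dataclasses and the drive_cycle function below are placeholders built on the assumption of simple object-list representations; the actual data formats are not yet fixed at this stage of the project.

```python
from dataclasses import dataclass, field

@dataclass
class BasicWorldRepresentation:
    """Environment as seen through one CAV's own sensors (illustrative)."""
    source_cav: str
    objects: list = field(default_factory=list)  # detected vehicles, obstacles, ...

@dataclass
class FullWorldRepresentation:
    """Fusion of the ego BWR with the BWRs received from CAVs in range."""
    objects: list = field(default_factory=list)

def drive_cycle(ego_bwr, remote_bwrs):
    """One cycle: fuse the BWRs into an FWR, then derive a MAS command."""
    fwr = FullWorldRepresentation(
        objects=ego_bwr.objects + [o for b in remote_bwrs for o in b.objects])
    command = {"type": "follow_path", "horizon_s": 1.0}  # placeholder command
    return fwr, command
```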
Fig. 2 – Human-CAV Interaction (HCI)
Human-CAV Interaction (HCI) performs the following functions:
1. Recognises the human CAV rights holder by their speech and face.
2. Responds to commands and queries from humans outside the CAV.
3. Instructs the Autonomous Motion Subsystem to find a route to the target destination.
4. Provides an extended environment representation (the Full World Representation) for humans to use in the cabin.
5. Senses human activities during the travel.
6. Converses with humans, manifesting itself as a speaking avatar whose speech and head-and-shoulders animation convey the HCI’s Personal Status, and may activate other Subsystems as required by the humans.

Read more.
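As an illustration of how function 3 might work, the sketch below turns a spoken destination request into an instruction for the AMS. The parser and the set_goal method are hypothetical stand-ins for the speech-understanding AIMs the HCI would actually use.

```python
import re

def parse_destination(utterance: str) -> str | None:
    """Naive stand-in for a speech-understanding AIM: extract a destination
    from a command like 'take me to the airport'."""
    m = re.search(r"(?:take me to|go to|drive to)\s+(.+)", utterance.lower())
    return m.group(1).strip() if m else None

class HCI:
    """Hypothetical HCI front end forwarding goals to the AMS."""
    def __init__(self, ams):
        self.ams = ams

    def handle(self, utterance: str) -> str:
        destination = parse_destination(utterance)
        if destination is None:
            return "Sorry, I did not understand the destination."
        self.ams.set_goal(destination)  # function 3: instruct the AMS
        return f"Setting route to {destination}."
```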
Fig. 3 – Environment Sensing Subsystem (ESS)
The ESS receives, through its sensors, information from electromagnetic (GPS, Lidar, Radar, Visual) and acoustic (Ultrasound and Audio) carriers, and other types of information (e.g., temperature, air pressure and spatial attitude) from the Motion Actuation Subsystem (MAS). These sources of information are used to create the Basic World Representation, which is passed on to the Autonomous Motion Subsystem (AMS). Read more.
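A minimal sketch of this fusion step follows, reusing the BasicWorldRepresentation dataclass from the overview above. The per-carrier detector is stubbed out, since MPAI-CAV standardises the interfaces between AIMs rather than their internals.

```python
def detect_objects(carrier: str, samples) -> list:
    """Stub for a per-carrier detection AIM (a real one would run a model)."""
    return [{"carrier": carrier, "id": i} for i in range(len(samples))]

def build_bwr(cav_id: str, sensor_data: dict, mas_data: dict):
    """Fuse per-carrier detections into one Basic World Representation.
    Illustrative only: real fusion would align time bases and coordinates."""
    objects = []
    for carrier, samples in sensor_data.items():  # gps, lidar, radar, ...
        objects += detect_objects(carrier, samples)
    bwr = BasicWorldRepresentation(source_cav=cav_id, objects=objects)
    bwr.ego_attitude = mas_data.get("spatial_attitude")  # data from the MAS
    return bwr
```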
Fig. 4 – Autonomous Motion Subsystem (AMS)
The AMS is the “brain” of the CAV. It performs the following functions:
1. Talks to the HCI to receive high-level instructions about the CAV’s motion (e.g., to move to a given waypoint).
2. Receives the Basic World Representation (BWR) from the ESS.
3. Sends its BWR to CAVs in range and receives theirs.
4. Computes the Full World Representation (FWR) using all available information.
5. Sends commands to the Motion Actuation Subsystem.
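The five functions can be read as one control loop. The sketch below, which reuses drive_cycle from the overview, is an assumption about how they might compose; v2v_link and mas are hypothetical stand-ins for the CAV-to-CAV link and the Motion Actuation Subsystem.

```python
import queue

def ams_loop(hci_goals: queue.Queue, ess_bwrs: queue.Queue, v2v_link, mas):
    """Illustrative AMS control loop, one iteration per incoming ego BWR."""
    goal = None
    while True:
        if not hci_goals.empty():
            goal = hci_goals.get()        # 1) high-level instruction from the HCI
        ego_bwr = ess_bwrs.get()          # 2) Basic World Representation from the ESS
        v2v_link.broadcast(ego_bwr)       # 3) send own BWR to CAVs in range
        remote_bwrs = v2v_link.collect()  #    ... and receive theirs
        fwr, command = drive_cycle(ego_bwr, remote_bwrs)  # 4) compute the FWR
        if goal is not None:
            command["goal"] = goal
        mas.execute(command)              # 5) command the MAS
```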
Fig. 5 – Motion Actuation Subsystem (MAS)
The MAS sends to the Environment Sensing Subsystem all useful data from its own sensors, e.g., spatial attitude and environmental data (temperature, humidity).
The MAS receives from the Autonomous Motion Subsystem commands to bring the CAV to a new state within an assigned period of time, and sends back feedback about the execution of those commands.
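The command/feedback exchange might look like the following; the message fields and the time-based completion check are invented for illustration, as the requirements do not yet define this format.

```python
import time
from dataclasses import dataclass

@dataclass
class MASCommand:
    target_speed_mps: float   # desired new state (fields are illustrative)
    target_heading_deg: float
    deadline_s: float         # the assigned period of time for execution

@dataclass
class MASFeedback:
    completed: bool           # was the commanded state reached in time?
    elapsed_s: float

def execute(command: MASCommand) -> MASFeedback:
    """Drive the actuators toward the commanded state, then report back."""
    start = time.monotonic()
    # ... actuate steering, traction and brakes here ...
    elapsed = time.monotonic() - start
    return MASFeedback(completed=elapsed <= command.deadline_s,
                       elapsed_s=elapsed)
```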
The publication plan is given below:
- Connected Autonomous Vehicles in MPAI
- Why an MPAI-CAV standard?
- An Introduction to the MPAI-CAV Subsystems
- Human-CAV Interaction
- Environment Sensing Subsystem
- Autonomous Motion Subsystem
- Motion Actuation Subsystem
The MPAI-CAV project is currently at the Use Cases and Functional Requirements stage.
If you wish to participate in this work, you have the following options:
- Join MPAI
- Participate by sending an email to the MPAI Secretariat; non-members can take part until the MPAI-CAV Functional Requirements are approved (after that, only MPAI members can participate).
- Keep an eye on this page.
Return to the MPAI-CAV page