Connected Autonomous Vehicles (MPAI-CAV)

1 Introduction
2 Scope of the MPAI-CAV Use Cases
3 Terms and definitions
4 References
4.1 Normative References
4.2 Informative References
5 Use Cases
6 Functional Requirements
7 Data Privacy
8 Annexes


 


1 Introduction

Moving Picture, Audio and Data Coding by Artificial Intelligence (MPAI) is an international Standards Developing Organisation with the mission to develop AI-enabled data coding standards. Research has shown that data coding with AI-based technologies is generally more efficient than with existing technologies. Compression and feature-based description are notable examples of coding.

In the following, Terms beginning with a capital letter are defined in Table 1 if they are specific to the MPAI-CAV Standard and in Table 17 if they are common to all MPAI Standards.

MPAI Application Standards enable the development of AI-based products, applications and services. The MPAI AI Framework (AIF) Standard (MPAI-AIF) [2] provides the foundation on which the technologies defined by MPAI Application Standards operate.

Figure 1 depicts the MPAI-AIF Reference Model. This Introduction only describes the basic processing elements called AI Modules (AIM) which make up an AI Workflow (AIW) executed in an AI Framework (AIF).

Figure 1 – The AI Framework (AIF) Reference Model and its Components

MPAI Application Standards normatively specify:

  1. For the AIMs: the Function, and the Semantics and Formats of the input and output data, but not the internal architecture, which may be based on AI or data processing and be implemented in software, hardware or hybrid software and hardware technologies.
  2. For the AIWs: the Function, the input and output data Semantics and Formats, and the Topology of the Connections between and among the AIMs.
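As an informal illustration of these two normative scopes, the following Python sketch shows how an AIW description might be captured as data. All names, formats, and fields are hypothetical and are not part of any MPAI specification.

```python
from dataclasses import dataclass, field

@dataclass
class AIM:
    """An AI Module: only Function and I/O data are standardised, not internals."""
    name: str
    function: str              # the normatively specified Function
    inputs: dict[str, str]     # input port name -> data Format identifier
    outputs: dict[str, str]    # output port name -> data Format identifier

@dataclass
class AIW:
    """An AI Workflow: Function, I/O data, and AIM Topology are standardised."""
    name: str
    function: str
    aims: list[AIM] = field(default_factory=list)
    # Topology: (source AIM, output port, destination AIM, input port)
    connections: list[tuple[str, str, str, str]] = field(default_factory=list)

# Hypothetical two-AIM workflow
asr = AIM("SpeechRecognition", "Convert speech to text",
          inputs={"speech": "AudioFormat"}, outputs={"text": "TextFormat"})
nlu = AIM("LanguageUnderstanding", "Extract meaning from text",
          inputs={"text": "TextFormat"}, outputs={"meaning": "MeaningFormat"})
aiw = AIW("SpeechFrontEnd", "Understand user speech", aims=[asr, nlu],
          connections=[("SpeechRecognition", "text", "LanguageUnderstanding", "text")])
```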

MPAI defines Interoperability as the ability to replace an AIW or an AIM Implementation with a functionally equivalent AIW or AIM Implementation. An AIW executed in an AIF may have one of the following MPAI-defined Interoperability Levels:

  1. Interoperability Level 1, if the AIW is proprietary and composed of AIMs with proprietary functions using any proprietary or standard data Format.
  2. Interoperability Level 2, if the AIW is composed of AIMs having all their Functions, Formats and Connections specified by an MPAI Application Standard.
  3. Interoperability Level 3, if the AIW has Interoperability Level 2, and the AIW and its AIMs are certified by an MPAI-appointed Assessor to hold the attributes of Reliability, Robustness, Replicability and Fairness – collectively called Performance.
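As a mnemonic only (the flags and helper function below are hypothetical and not defined by MPAI), the three Levels can be summarised in code:

```python
from enum import IntEnum

class InteroperabilityLevel(IntEnum):
    """The three MPAI-defined Interoperability Levels described above."""
    LEVEL_1 = 1  # proprietary AIW/AIMs, any proprietary or standard data Formats
    LEVEL_2 = 2  # Functions, Formats and Connections per an MPAI Application Standard
    LEVEL_3 = 3  # Level 2 plus Performance certified by an MPAI-appointed Assessor

def level_of(conforms_to_application_standard: bool,
             performance_certified: bool) -> InteroperabilityLevel:
    """Toy classifier mirroring the definitions; both flags are hypothetical."""
    if conforms_to_application_standard and performance_certified:
        return InteroperabilityLevel.LEVEL_3
    if conforms_to_application_standard:
        return InteroperabilityLevel.LEVEL_2
    return InteroperabilityLevel.LEVEL_1
```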

The MPAI Ecosystem [1] offers Users access to the promised benefits of AI with a guarantee of increased transparency, trust and reliability as the Interoperability Level of an Implementation moves from 1 to 3.

2 Scope of the MPAI-CAV Use Cases

Standardisation of Connected Autonomous Vehicle (CAV) components will be required because of: 1) the different nature of the interacting technologies making up a CAV, 2) the sheer size of the future CAV market, and 3) the need for users and regulators alike to be assured of CAV safety, reliability and explainability.

At this point in time, a traditional approach to standardisation might consider CAV standards premature, and some affected industries may not even be ready to consider them. CAVs, however, belong to an industry still being formed, one expected to produce economically affordable units in the hundreds of millions, with components supplied by disparate sources. A competitive market of standard components can reduce costs and help CAVs fulfil their promise of a major positive impact on the environment and society.

A CAV Reference Model (RM) identifying components and their interfaces can accelerate the definition of standard components. Progression from research to standardisation can unfold as a series of proposals from research suggesting components and interfaces for standardisation, with standardisation either requesting more results, refining the results, or adopting the proposals. Eventually, industry will receive a set of specifications for standard component functions and interfaces to be implemented as the best available technology allows. Implementation in products will rely, as a minimum, on the know-how of those who have driven the development of the specifications.

Connected Autonomous Vehicles (MPAI-CAV) is an MPAI standard project comprising several identified candidate Use Cases.

The MPAI-CAV Reference Model is subdivided into five subsystems:

  1. Human-CAV Interaction (HCI), handling interactions between humans and the CAV.
  2. Environment Sensing Subsystem (ESS), acquiring information from the physical environment via a variety of sensors.
  3. CAV to Everything (V2X), receiving information from external sources.
  4. Autonomous Motion Subsystem (AMS), issuing commands to drive the CAV to the intended destination.
  5. Motion Actuation Subsystem (MAS), providing environment information, and receiving/actuating motion commands in the physical world.

Each of the five subsystems is an instantiation of the MPAI-defined AI Framework (AIF).
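The following Python sketch illustrates this decomposition and the principal data flows between the subsystems. All class and method names are hypothetical, the HCI is omitted for brevity, and nothing here reflects a normative interface.

```python
from dataclasses import dataclass

# Hypothetical placeholder types for data exchanged between subsystems.
@dataclass
class BasicWorldRepresentation: ...    # produced by the ESS from its sensors
@dataclass
class FullWorldRepresentation: ...     # fused by the AMS from all available BWRs
@dataclass
class MotionCommand: ...               # issued by the AMS, executed by the MAS

class EnvironmentSensingSubsystem:
    def sense(self) -> BasicWorldRepresentation:
        """Acquire information from the physical Environment via sensors."""
        return BasicWorldRepresentation()

class V2XSubsystem:
    def receive(self) -> list[BasicWorldRepresentation]:
        """Collect information (e.g., BWRs) broadcast by external sources."""
        return []

class AutonomousMotionSubsystem:
    def fuse(self, own: BasicWorldRepresentation,
             remote: list[BasicWorldRepresentation]) -> FullWorldRepresentation:
        """Fuse all available BWRs into a Full World Representation."""
        return FullWorldRepresentation()

    def decide(self, fwr: FullWorldRepresentation) -> MotionCommand:
        """Plan towards the Goal and emit a command for the MAS."""
        return MotionCommand()

class MotionActuationSubsystem:
    def actuate(self, command: MotionCommand) -> None:
        """Execute the motion command in the physical world."""

# One step of the (highly simplified) sense-fuse-decide-actuate loop:
ess, v2x, ams, mas = (EnvironmentSensingSubsystem(), V2XSubsystem(),
                      AutonomousMotionSubsystem(), MotionActuationSubsystem())
mas.actuate(ams.decide(ams.fuse(ess.sense(), v2x.receive())))
```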

The Reference Model identifies and describes the requirements of the data types received or generated by the AIMs in each subsystem. It allows researchers to select data, define testing setups, propose updates to interfaces, conduct contests, consider the influence of external components, and subdivide the workload in a way that allows unambiguous comparison of results.

Unlike previously published papers (e.g., [10]), this document has the following features:

  1. A holistic approach that includes all IT components of a CAV;
  2. AIF-AIW-AIM as the unifying model used to determine the functionality and the data of all CAV components;
  3. Modules with functions and data that are being, or have already been, specified in other MPAI standards;
  4. A focus on the data formats exchanged between AIMs rather than on the modules, whose internals are not part of a standard and are left to proprietary implementations;
  5. A process in which research is seamlessly integrated with a subsequent standardisation process.

The purpose of this document is:

  1. To collect and describe the identified use cases.
  2. To identify the functions, and the input and output data of the AIWs that implement the Use Cases.
  3. To identify the connections of the AIMs making up the AIWs.
  4. To identify the functions, and the input and output data of the AIMs required to realise the AIWs.

Chapter 6 provides the functional requirements that the data formats identified in points 2 and 4 above, and the connections identified in point 3 above, should satisfy.

3 Terms and definitions

Table 1 defines the terms used in this document. Terms are organised by the CAV Subsystems identified in Figure 3. The general MPAI Terms are defined in Table 17.

Table 1 – Definition of Terms used in this document organised by Subsystems

Legend:
AMS Autonomous Motion Subsystem
CAV Connected Autonomous Vehicle
ESS Environment Sensing Subsystem
HCI Human-CAV Interaction
MAS Motion Actuation Subsystem
V2X CAV to Everything

 

SubS Term Definition
AMS Command High-level instructions whose execution allows a CAV to reach a Goal.
AMS Decision Horizon The estimated time between the current State and the Goal.
AMS Full World Representation A digital representation of the Environment obtained by fusing the CAV’s own Basic World Representation with the Basic World Representations received from other CAVs.
AMS Goal The planned State at the end of the Decision Horizon.
AMS Path A sequence of Poses pi = (xi, yi, zi, θi) in the Offline Map.
AMS Pose Coordinates and orientation of the CAV in the Offline Map: p = (x, y, z, θ).
AMS Route A sequence of Way Points.
AMS State CAV’s Pose, Velocity and Acceleration at a given time.
AMS Traffic Rules The digital representation of the traffic rules applying to a Pose.
AMS Way Point A point given as a coordinate pair (x, y) in an Offline Map.
CAV Connected Autonomous Vehicle A vehicle capable of autonomously reaching an assigned target by understanding human utterances, planning a route, sensing and interpreting the environment, exchanging information with other CAVs, and acting on the CAV’s motion subsystem.
CAV Health The condition, e.g., mechanical, of a Subsystem or an AIM.
CAV Reference Model The collection of the following resources: 1) the AIWs’ input and output data; 2) the Connections of the AIMs making up the AIWs; 3) the AIMs’ input and output data.
CAV Subsystem One of the five components making up the CAV.
ESS Basic World Representation A digital representation of the Environment created with information available from the CAV’s ESS and an Offline Map.
ESS Environment The portion of the world of interest to the CAV.
ESS Global Navigation Satellite System (GNSS) Includes GPS, Galileo, GLONASS, BeiDou, Quasi-Zenith Satellite System (QZSS) and the Indian Regional Navigation Satellite System (IRNSS).
ESS Inertial Measurement Unit An inertial positioning device, e.g., accelerometer, speedometer, gyroscope, odometer etc.
ESS Offline Map An offline-created map of a location and associated metadata.
HCI Command High-level instructions whose execution allows a CAV to reach a Goal.
HCI Feedback Responses or sentences autonomously generated by the CAV.
MAS
V2X
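To make the geometric definitions in Table 1 concrete, here is a minimal Python sketch directly mirroring the Pose, Way Point, Route, Path, and State entries. All names are hypothetical (θ is written as theta), and the scalar Velocity and Acceleration are a simplification.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Coordinates and orientation of the CAV in the Offline Map: p = (x, y, z, theta)."""
    x: float
    y: float
    z: float
    theta: float  # orientation (heading), e.g., in radians

@dataclass
class WayPoint:
    """A point given as a coordinate pair (x, y) in an Offline Map."""
    x: float
    y: float

Route = list[WayPoint]  # a Route is a sequence of Way Points
Path = list[Pose]       # a Path is a sequence of Poses in the Offline Map

@dataclass
class State:
    """CAV's Pose, Velocity and Acceleration at a given time (simplified to scalars)."""
    pose: Pose
    velocity: float       # m/s
    acceleration: float   # m/s^2

# Example: a two-pose Path
path: Path = [Pose(0.0, 0.0, 0.0, 0.0), Pose(5.0, 2.0, 0.0, 0.35)]
```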

4 References

4.1 Normative References

This document references the following normative documents:

  1. Technical Specification: The Governance of the MPAI Ecosystem V1.
  2. Technical Specification: AI Framework (MPAI-AIF), MPAI document N3
  3. Technical Specification: Multimodal Conversation (MPAI-MMC) V1; published at https://mpai.community/standards/resources/1.
  4. Draft Technical Specification: Context-based Audio Enhancement (MPAI-CAE) V1, to be published at https://mpai.community/standards/resources/.
  5. Universal Coded Character Set (UCS): ISO/IEC 10646; December 2020
  6. ISO/IEC 14496-10; Information technology – Coding of audio-visual objects – Part 10: Advanced Video Coding.
  7. ISO/IEC 23008-2; Information technology – High efficiency coding and media delivery in heterogeneous environments – Part 2: High Efficiency Video Coding.
  8. ISO/IEC 23094-1; Information technology – General video coding – Part 1: Essential Video Coding.

4.2 Informative References

This document references the following informative documents:

  1. SAE International Releases Updated Visual Chart for Its “Levels of Driving Automation” Standard for Self-Driving Vehicles, https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles
  2. Serban, Alexandru Constantin, Erik Poll, and Joost Visser. “A Standard Driven Software Architecture for Fully Autonomous Vehicles.” 2018 IEEE International Conference on Software Architecture Companion (ICSA-C). IEEE, 2018.
  3. ISO 8855: “Road vehicles — Vehicle dynamics and road-holding ability — Vocabulary”
  4. Rodolfo W. L. Coutinho and Azzedine Boukerche, Guidelines for the Design of Vehicular Cloud Infrastructures for Connected Autonomous Vehicles, IEEE Wireless Communications – August 2019
  5. Claudine Badue, Rânik Guidolini, Raphael Vivacqua Carneiro, Pedro Azevedo, Vinicius B. Cardoso, Avelino Forechi, Luan Jesus, Rodrigo Berriel, Thiago M. Paixão, Filipe Mutz, Lucas de Paula Veronese, Thiago Oliveira-Santos, Alberto F. De Souza; Self-driving cars: A survey; Expert Systems With Applications 165 (2021) 113816
  6. D. Cireşan, U. Meier, J. Masci, and J. Schmidhuber, “Multi-column deep neural network for traffic sign classification,” Neural Netw., vol. 32, pp. 333–338, Aug. 2012
  7. ETSI TR 103 562 V2.1.1 (2019-12), Analysis of the Collective Perception Service (CPS); Release 2.
  8. Gokulnath Thandavarayan, Miguel Sepulcre, and Javier Gozalvez; Generation of Cooperative Perception Messages for Connected and Automated Vehicles; IEEE Transactions on Vehicular Technology, Vol. 69, No. 12, December 2020
  9. CAR 2 CAR Communication Consortium, https://www.car-2-car.org/
  10. Usman Ali Khan and Sang Sun Lee; Distance-Based Resource Allocation for Vehicle-to-Pedestrian Safety Communication; https://www.mdpi.com/2079-9292/9/10/1640/pdf
  11. Pranav Kumar Singh, Sunit Kumar Nandi, Sukumar Nandi; A tutorial survey on vehicular communication state of the art, and future research directions; Vehicular Communications, Volume 18, August 2019, 100164
  12. https://phantom.ai/assets/uploads/PAI%20Renesas%20Partnership%20Announcement%20(1).pdf
  13. Charles R. Qi, Li Yi, Hao Su, Leonidas J. Guibas; PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space; https://arxiv.org/pdf/1706.02413.pdf
  14. Bernard, Olivier; Bradway, David; Hansen, Hendrik H.G.; Kruizinga, Pieter; Nair, Arun; Perdios, Dimitris; Ricci, Stefano; Rindal, Ole Marius Hoel; Rodriguez-Molares, Alfonso; Stuart, Matthias Bo; The Ultrasound File Format (UFF) – First draft, Proceedings of 2018 IEEE International Ultrasonics Symposium
  15. LAS (LASer) File Format, Version 1.4 , https://www.loc.gov/preservation/digital/formats/fdd/fdd000418.shtml
  16. https://www.lesliesikos.com/pcap/
  17. https://pointclouds.org/documentation/tutorials/hdl_grabber.html
  18. https://www.mathworks.com/help/vision/ref/velodynefilereader.html
  19. M. Heistermann, S. Jacobi and T. Pfaff, Technical Note: An open source library for processing weather radar data (wradlib), https://hess.copernicus.org/articles/17/863/2013/hess-17-863-2013.pdf
  20. https://pro.arcgis.com/en/pro-app/latest/help/data/data-interoperability/supported-formats-with-the-data-interoperability-extension.htm
  21. CDOT and Panasonic Take First Steps to Turn I-70 into Connected Roadway, https://www.codot.gov/news/2018/july/cdot-and-panasonic-take-first-steps-to-turn-i-70-into-connected-roadway
  22. Navigation Data Standards, https://nds-association.org/
  23. Matthias Schreier; Environment Representations for Automated On-Road Vehicles; https://www.researchgate.net/publication/323105152_Environment_Representations_for_Automated_On-Road_Vehicles
  24. Toh Chai Keong, Juan Carlos Cano, Carlos-Javier Fernandez-Laguia, Pietro Manzoni; Wireless Digital Traffic Signs of the Future; IET Networks, September 2018
  25. Vladimir Hahanov, Wajeb Gharibi, Eugenia Litvinova, Svitlana Chumachenko, Arthur Ziarmand, Irina Englesi, Igor Gritsuk, Vladimir Volkov, Anastasiia Khakhanova; Cloud-Driven Traffic Monitoring and Control Based on Smart Virtual Infrastructure; 2017-03-28
  26. SharedStreets; https://sharedstreets.io/
  27. Bojarski, Mariusz, et al. “End to end learning for self-driving cars.” arXiv preprint arXiv:1604.07316 (2016).
  28. James S. Albus; The NIST Real-time Control System (RCS) An Approach to Intelligent Systems Research; https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=820528
  29. Integrated Public Alert & Warning System; https://www.fema.gov/gu/emergency-managers/practitioners/integrated-public-alert-warning-system/public
  30. OASIS Standard; Common Alerting Protocol Version 1.2; https://docs.oasis-open.org/emergency/cap/v1.2/CAP-v1.2-os.pdf
  31. Dorais G., Kortenkamp D. (2001) Designing Human-Centered Autonomous Agents. Lecture Notes in Computer Science, vol 2112. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45408-X_32
  32. https://youtu.be/nCaomvjPIok