Moving Picture, Audio and Data Coding
by Artificial Intelligence


MPAI-AIF V1 Framework Licence

1        Coverage

The MPAI AI Framework (MPAI-AIF) standard as it will be defined in document Nxyz of Moving Picture, Audio and Data Coding by Artificial Intelligence (MPAI).

MPAI-AIF specifies a generic execution environment possibly integrating Machine Learning, Artificial Intelligence and legacy Data Processing components implementing application areas such as

  1. Context-based Audio Enhancement (MPAI-CAE)
  2. Integrative Genomic/Sensor Analysis (MPAI-GSA)
  3. AI-Enhanced Video Coding (MPAI-EVC)
  4. Server-based Predictive Multiplayer Gaming (MPAI-SPG)
  5. Multi-Modal Conversation (MPAI-MMC)
  6. Compression and Understanding of Industrial data (MPAI-CUI)

The six application areas are expected to become MPAI standards.

2        Definitions

Term Definition
Data Any digital representation of a real or computer-generated entity, such as moving pictures, audio, point cloud, computer graphics, sensor and actuator. Data includes, but is not restricted to, media, manufacturing, automotive, health and generic data.
Development Rights Licence to use MPAI-AIF Essential IPRs to develop Implementations
Enterprise Any commercial entity that develops or implements the MPAI-AIF standard
Essential IPR Any Proprietary Rights (such as patents) without which it is not possible, on technical (but not commercial) grounds, to make, sell, lease, otherwise dispose of, repair, use or operate Implementations without infringing those Proprietary Rights
Framework Licence A document, developed in compliance with the generally accepted principles of competition law, which contains the conditions of use of the Licence without the values, e.g., currency, percent, dates etc.
Implementation A hardware and/or software reification of the MPAI-AIF standard serving the needs of a professional or consumer user directly or through a service
Implementation Rights Licence to reify the MPAI-AIF standard
Licence This Framework Licence to which values, e.g., currency, percent, dates etc., related to a specific Intellectual Property will be added. In this Framework Licence, the word Licence will be used as singular. However, multiple Licences from different IPR holders may be issued
Profile A particular subset of the technologies that are used in MPAI-AIF standard and, where applicable, the classes, subsets, options and parameters relevant to the subset

3        Conditions of use of the Licence

  1. The Licence will be in compliance with generally accepted principles of competition law and the MPAI Statutes
  2. The Licence will cover all of Licensor’s claims to Essential IPR practised by a Licensee of the MPAI-AIF standard.
  3. The Licence will cover Development Rights and Implementation Rights
  4. The Licence will apply to a baseline MPAI-AIF profile and to other profiles containing additional technologies
  5. Access to Essential IPRs of the MPAI-AIF standard will be granted in a non-discriminatory fashion.
  6. The scope of the Licence will be subject to legal, bias, ethical and moral limitations
  7. Royalties will apply to Implementations that are based on the MPAI-AIF standard
  8. Royalties will be based neither on computational time nor on the number of API calls
  9. Royalties will apply on a worldwide basis
  10. Royalties will apply to any Implementation
  11. An MPAI-AIF Implementation may use other IPR to extend the MPAI-AIF Implementation or to provide additional functionalities
  12. The Licence may be granted free of charge for particular uses if so decided by the licensors
  13. The Licences will specify
    1. a threshold below which a Licence will be granted free of charge and/or
    2. a grace period during which a Licence will be granted free of charge and/or
    3. an annual in-compliance royalty cap applying to total royalties due on worldwide revenues for a single Enterprise
  14. A preference will be expressed on the entity that should administer the patent pool of holders of Patents Essential to the MPAI-AIF standard
  15. The total cost of the Licences issued by IPR holders will be in line with the total cost of the licences for similar technologies standardised in the context of Standard Development Organisations
  16. The total cost of the Licences will take into account the value on the market of the AI Framework technology Standardised by MPAI.

MPAI-AIF V2 Framework Licence

The MPAI-AIF V2 Framework Licence is also available as a Word document.

1        Coverage

This Framework Licence applies to the MPAI AI Framework Version 2 (MPAI-AIF) Technical Specification as it will be defined in document Nxyz of Moving Picture, Audio and Data Coding by Artificial Intelligence (MPAI). All contributors to MPAI-AIF V2 shall confirm in writing their intention to make available a Licence for their Essential IPR based on the Conditions of use of the Licence in Section 3.

2        Definitions

Term Definition
Conforming Implementation An implementation that has passed the Conformance Testing process of the MPAI-AIF V2 Technical Specification.
Data Any digital representation of a real or computer-generated entity, such as moving pictures, audio, point cloud, computer graphics, sensor, and actuator. Data includes, but is not restricted to, media, manufacturing, automotive, health and generic data.
Development Rights Licence to use MPAI-AIF V2 Essential IPRs to develop Implementations.
Enterprise Any entity contributing to the development of or implementing the MPAI-AIF V2 Technical Specification.
Essential IPR Any Proprietary Rights (such as patents) without which it is not possible, on technical (but not commercial) grounds, to make, sell, lease, otherwise dispose of, repair, use or operate Implementations without infringing those Proprietary Rights
Framework Licence A document, developed in compliance with the generally accepted principles of competition law, which contains the conditions of use of the Licence without the values, e.g., currency, percent, dates, etc.
General Availability The result of the release of a product to the general public. When a product reaches General Availability, it becomes available through the company’s general sales channel — as opposed to a limited release, or beta version, used primarily for testing and user feedback purposes.
Implementation A hardware and/or software reification of the MPAI-AIF V2 Technical Specification serving the needs of a professional or consumer user directly or through a service
Implementation Rights Licence to reify the MPAI-AIF V2 Technical Specification
Licence The Framework Licence to which values, e.g., currency, percent, dates etc., related to a Standard Essential IPR will be added. In the Framework Licence, the word Licence will be used as singular. However, multiple Licences from different IPR holders may be issued.
Profile A particular subset of the technologies that are used in MPAI-AIF V2 Technical Specification and, where applicable, the classes, subsets, options, and parameters relevant to the subset
Standard Essential IPR Essential IPR for the MPAI-AIF V2 Technical Specification standardised by MPAI.
Technical Specification The document specifying how to make a Conforming Implementation of MPAI-AIF V2.
Trial A release of a product or service to a limited number of participants in the Trial.

3        Conditions of use of the Licence

The Standard Essential IPR holders commit themselves to issue a Licence with the following conditions:

  1. The Licence will be in compliance with generally accepted principles of competition law and the MPAI Statutes
  2. The Licence will cover all of Licensor’s claims to Essential IPR practised by a Licensee of the MPAI-AIF V2 Technical Specification.
  3. The Licence will cover Development Rights and Implementation Rights.
  4. The Licence for Development and Implementation Rights, to the extent the standard is developed and implemented only for the purpose of evaluation, demo solutions, or technical Trials, will be free of charge.
  5. The Licence will apply to a baseline MPAI-AIF V2 profile and to other profiles containing additional technologies.
  6. Access to Essential IPRs of the MPAI-AIF V2 Technical Specification will be granted in a non-discriminatory fashion.
  7. The scope of the Licence will be subject to legal, bias, ethical and moral limitations.
  8. Royalties will apply to Implementations that are based on the MPAI-AIF V2 Technical Specification.
  9. Royalties will apply on a worldwide basis.
  10. Royalties will apply to any Implementation, with the exclusion of the types of implementations specified in clause 4.
  11. An MPAI-AIF V2 Implementation may use other IPR to extend the MPAI-AIF V2 Implementation or to provide additional functionalities.
  12. The Licence may be granted free of charge for particular uses if so decided by the licensors.
  13. A licence free of charge for a limited time and a limited amount of forfeited royalties will be granted on request.
  14. A preference will be expressed on the entity that should administer the patent pool of holders of Patents Essential to the MPAI-AIF V2 Technical Specification.
  15. The Licence will be issued before commercial implementations of the MPAI-AIF V2 Technical Specification become available on the market. Commercial implementation implies General Availability to any users and does not include trials.
  16. The total cost of the Licences issued by IPR holders will be in line with the total cost of the Licences for similar technologies standardised in the context of Standard Development Organisations.
  17. The total cost of the Licences will take into account the value on the market of the Standard Essential IPR.

MPAI-AIF References


1       Normative references

MPAI-AIF normatively references the following documents:

  1. MPAI; The MPAI Statutes; https://mpai.community/statutes/
  2. MPAI; The MPAI Patent Policy; https://mpai.community/about/the-mpai-patent-policy/.
  3. MPAI; Technical Specification: Governance of the MPAI Ecosystem; https://mpai.community/standards/mpai-gme/
  4. GIT protocol, https://git-scm.com/book/en/v2/Git-on-the-Server-The-Protocols.
  5. ZIP format, https://pkware.cachefly.net/webdocs/casestudies/APPNOTE.TXT.
  6. Date and Time in the Internet: Timestamps; IETF RFC 3339; July 2002.
  7. Uniform Resource Identifiers (URI): Generic Syntax, IETF RFC 2396, August 1998.
  8. The JavaScript Object Notation (JSON) Data Interchange Format; https://datatracker.ietf.org/doc/html/rfc8259; IETF rfc8259; December 2017
  9. JSON Schema; https://json-schema.org/.
  10. BNF Notation for syntax; https://www.w3.org/Notation.html
  11. MPAI; The MPAI Ontology; https://mpai.community/standards/mpai-aif/mpai-ontology/
  12. Framework Licence of the Artificial Intelligence Framework Technical Specification (MPAI-AIF); https://mpai.community/standards/mpai-aif/framework-licence/
  13. Bormann, C. and P. Hoffman, Concise Binary Object Representation (CBOR), December 2020. https://rfc-editor.org/info/std94
  14. Schaad, J., CBOR Object Signing and Encryption (COSE): Structures and Process, August 2022. https://rfc-editor.org/info/std96
  15. IETF Entity Attestation Token (EAT), Draft. https://datatracker.ietf.org/doc/draft-ietf-rats-eat
  16. IEEE, 1619-2018 — IEEE Standard for Cryptographic Protection of Data on Block-Oriented Storage Devices, January 2019. https://ieeexplore.ieee.org/servlet/opac?punumber=8637986
  17. IETF, The MD5 Message-Digest Algorithm, April 1992. https://tools.ietf.org/html/rfc1321.html
  18. [RFC6979] IETF, Deterministic Usage of the Digital Signature Algorithm (DSA) and Elliptic Curve Digital Signature Algorithm (ECDSA), August 2013. https://tools.ietf.org/html/rfc6979.html
  19. [RFC7539] IETF, ChaCha20 and Poly1305 for IETF Protocols, May 2015. https://tools.ietf.org/html/rfc7539.html
  20. [RFC7919] IETF, Negotiated Finite Field Diffie-Hellman Ephemeral Parameters for Transport Layer Security (TLS), August 2016. https://tools.ietf.org/html/rfc7919.html
  21. [RFC8017] IETF, PKCS #1: RSA Cryptography Specifications Version 2.2, November 2016. https://tools.ietf.org/html/rfc8017.html
  22. [RFC8032] IRTF, Edwards-Curve Digital Signature Algorithm (EdDSA), January 2017. https://tools.ietf.org/html/rfc8032.html
  23. Standards for Efficient Cryptography, SEC 1: Elliptic Curve Cryptography, May 2009. https://www.secg.org/sec1-v2.pdf
  24. NIST, FIPS Publication 202: SHA-3 Standard: Permutation-Based Hash and Extendable-Output Functions, August 2015. https://doi.org/10.6028/NIST.FIPS.202
  25. NIST, NIST Special Publication 800-38A: Recommendation for Block Cipher Modes of Operation: Methods and Techniques, December 2001. https://doi.org/10.6028/NIST.SP.800-38A
  26. NIST, NIST Special Publication 800-38D: Recommendation for Block Cipher Modes of Operation: Galois/Counter Mode (GCM) and GMAC, November 2007. https://doi.org/10.6028/NIST.SP.800-38D

2       Informative references

  1. Message Passing Interface (MPI), https://www.mcs.anl.gov/research/projects/mpi/
  2. Rose, Scott; Borchert, Oliver; Mitchell, Stu; Connelly, Sean; “Zero Trust Architecture”; https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-207.pdf
  3. MPAI Technical Specification: Context-based Audio Enhancement (MPAI-CAE) V2; https://mpai.community/standards/mpai-cae/.
  4. MPAI Technical Specification: Connected Autonomous Vehicle – Architecture (MPAI-CAV) V1; https://mpai.community/standards/mpai-cav/.
  5. MPAI Technical Specification: Compression and Understanding of Industrial Data (MPAI-CUI) V1.1; https://mpai.community/standards/mpai-cui/.
  6. MPAI Technical Specification: Multimodal Conversation (MPAI-MMC) V2; https://mpai.community/standards/mpai-mmc/.
  7. MPAI Technical Specification: Neural Network Watermarking (MPAI-NNW) V1; https://mpai.community/standards/mpai-nnw/.
  8. MPAI Technical Specification: Portable Avatar Format (MPAI-PAF) V1; https://mpai.community/standards/mpai-paf/.
  9. Wang, J. Gao, M. Zhang, S. Wang, G. Chen, T. K. Ng, B. C. Ooi, J. Shao, and M. Reyad, “Rafiki: machine learning as an analytics service system,” Proceedings of the VLDB Endowment, vol. 12, no. 2, pp. 128–140, 2018.
  10. Lee, A. Scolari, B.-G. Chun, M. D. Santambrogio, M. Weimer, and M. Interlandi; PRETZEL: Opening the black box of machine learning prediction serving systems; in 13th USENIX Symposium on Operating Systems Design and Implementation (OSDI18), pp. 611–626, 2018.
  11. ML.NET [Online]; https://dotnet.microsoft.com/apps/machinelearning-ai/ml-dotnet.
  12. Crankshaw, X. Wang, G. Zhou, M. J. Franklin, J. E. Gonzalez, and I. Stoica; Clipper: A low-latency online prediction serving system; in NSDI, pp. 613–627, 2017.
  13. Zhao, M. Talasila, G. Jacobson, C. Borcea, S. A. Aftab, and J. F. Murray; Packaging and sharing machine learning models via the acumos ai open platform; in 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), pp. 841–846, IEEE, 2018.
  14. Apache Prediction I/O; https://predictionio.apache.org/.
  15. Sculley, G. Holt, D. Golovin, E. Davydov, T. Phillips, D. Ebner, V. Chaudhary, M. Young, J. Crespo, D. Dennison; Hidden technical debt in Machine learning systems Share; on NIPS’15: Proceedings of the 28th International Conference on Neural Information Processing Systems – Volume 2; December 2015 Pages 2503–2511
  16. Arm; “PSA Certified Crypto API 1.1,” IHI 0086, issue 2,23/03/2022, https://arm-software.github.io/psa-api/crypto/1.1/
  17. Arm; “PSA Certified Secure Storage API 1.0,” IHI 0087, issue 2, 23/03/2023, https://arm-software.github.io/psa-api/storage/1.0/
  18. Arm; “PSA Certified Attestation API 1.0,” IHI 0085, issue 3, 17/10/2022, https://arm-software.github.io/psa-api/attestation/1.0/


 


Technical Specification – AI Framework (MPAI-AIF) WD for Community Comments

This document is a working draft of Version 2 of Technical Specification: AI Framework (MPAI-AIF). It is published with a request for Community Comments. Comments should be sent to the MPAI Secretariat by 2023/09/24T23:59 UTC.

An online presentation of the AI Framework V2 WD will be held on September 04 at 08:00 and 15:00 UTC.

 

WARNING

 

Use of the technologies described in this Technical Specification may infringe patents, copyrights or intellectual property rights of MPAI Members or non-members.

 

MPAI and its Members accept no responsibility whatsoever for damages or liability, direct or consequential, which may result from the use of this Technical Specification.

 

Readers are invited to review Annex 2 – Notices and Disclaimers.

 

 

 

 

 

© Copyright MPAI 2021-2023.  All rights reserved.

AI Framework

V2 (Under development)

 

1        Introduction (Informative)
2        Scope of Standard
3        Terms and Definitions
4        References
4.1        Normative references
4.2        Informative references
5        Architecture of the AI Framework
5.1        AI Framework Components
5.1.1        Components for Basic Functionalities
5.1.2        Components for Security Functionalities
5.2        AI Framework Implementations
5.3        AIMs
5.3.1        Implementation types
5.3.2        Combination
5.3.3        Hardware-software compatibility
5.3.4        Actual implementations
6        Metadata
6.1        Communication channels and their data types
6.1.1        Type system
6.1.2        Mapping the type to buffer contents
6.2        AIF Metadata
6.3        AIW/AIM Metadata
7        Common features of MPAI-AIF API
7.1        General
7.2        Conventions
7.2.1        API types
7.2.2        Return codes
7.2.3        High-priority Messages
8        Basic API
8.1        Store API called by Controller
8.1.1        Get and parse archive
8.2        Controller API called by User Agent
8.2.1        General
8.2.2        Start/Pause/Resume/Stop Messages to other AIWs
8.2.3        Inquire about state of AIWs and AIMs
8.2.4        Management of Shared and AIM Storage for AIWs
8.2.5        Communication management
8.2.6        Resource allocation management
8.3        Controller API called by AIMs
8.3.1        General
8.3.2        Resource allocation management
8.3.3        Register/deregister AIMs with the Controller
8.3.4        Send Start/Pause/Resume/Stop Messages to other AIMs
8.3.5        Register Connections between AIMs
8.3.6        Using Ports
8.3.7        Operations on messages
8.3.8        Functions specific to machine learning
8.3.9        Controller API called by Controller
9        Security API
9.1        Data characterization structure
9.2        API called by User Agent
9.3        API to access Secure Storage
9.3.1        User Agent initialises Secure Storage API
9.3.2        User Agent writes Secure Storage API
9.3.3        User Agent reads Secure Storage API
9.3.4        User Agent gets info from Secure Storage API
9.3.5        User Agent deletes a p_data in Secure Storage API
9.4        API to access Attestation
9.5        API to access cryptographic functions
9.5.1        Hashing
9.5.2        Key management
9.5.3        Key exchange
9.5.4        Message Authentication Code
9.5.5        Cyphers
9.5.6        Authenticated encryption with associated data (AEAD)
9.5.7        Signature
9.5.8        Asymmetric Encryption
10       Profiles
10.1     Basic Profile
10.2     Secure Profile
11       Examples (Informative)
11.1     AIF Implementations
11.1.1     Resource-constrained implementation
11.1.2     Non-resource-constrained implementation
11.2     Examples of types
11.3     Examples of Metadata
11.3.1     Metadata of Enhanced Audioconference Experience AIF
11.3.2     Metadata of Enhanced Audioconference Experience AIW
11.3.3     Metadata of CAE-EAE Analysis Transform AIM
11.3.4     Metadata of Enhanced Audioconference Experience AIW
11.3.5     Metadata of CAE-EAE Analysis Transform AIM
11.3.6     Metadata of CAE-EAE Sound Field Description AIM
11.3.7     Metadata of CAE-EAE Speech Detection and Separation AIM
11.3.8     Metadata of CAE-EAE Noise Cancellation AIM
11.3.9     Metadata of CAE-EAE Synthesis Transform AIM
11.3.10   Metadata of CAE-EAE Packager AIM
Annex 1      MPAI-wide terms and definitions
Annex 2      Notices and Disclaimers Concerning MPAI Standards (Informative)
Annex 3      The Governance of the MPAI Ecosystem (Informative)
Annex 4      Applications (Informative)
Annex 5      Patent declarations
Annex 6      Threat Models
Annex 7      Use Cases
1        Secure communication via Network Security (TLS)
1.1        Secure Storage
1.2        Network Credentials (authentication)
1.3        Attestation
1.4        MPAI Store Provisioning
2        Workflow

 

1          Introduction (Informative)

In recent years, Artificial Intelligence (AI) and related technologies have been introduced in a broad range of applications, have started affecting the life of millions of people and are expected to do so even more in the future. As digital media standards have positively influenced industry and billions of people, so AI-based data coding standards are expected to have a similar positive impact. Indeed, research has shown that data coding with AI-based technologies is generally more efficient than with existing technologies for, e.g., compression and feature-based description.

 

However, some AI technologies may carry inherent risks, e.g., in terms of bias toward some classes of users. Therefore, the need for standardisation is more important and urgent than ever.

 

The international, unaffiliated, not-for-profit MPAI – Moving Picture, Audio and Data Coding by Artificial Intelligence Standards Developing Organisation has the mission to develop AI-enabled data coding standards. MPAI Application Standards enable the development of AI-based products, applications, and services.

 

As a rule, MPAI standards include four documents: Technical Specification, Reference Software Specifications, Conformance Testing Specifications, and Performance Assessment Specifications. Sometimes Technical Reports are produced to provide informative guidance in specific areas for which the development of standards is premature.

 

Performance Assessment Specifications include standard operating procedures to enable users of MPAI Implementations to make informed decisions about their applicability based on the notion of Performance, defined as a set of attributes characterising a reliable and trustworthy implementation.

In the following, Terms beginning with a capital letter are defined in Table 1 if they are specific to this Standard and in Table 2 if they are common to all MPAI Standards.

 

In general, MPAI Application Standards are defined as aggregations – called AI Workflows (AIW) – of processing elements – called AI Modules (AIM) – executed in an AI Framework (AIF). MPAI defines Interoperability as the ability to replace an AIW or an AIM Implementation with a functionally equivalent Implementation.
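The aggregation and substitution model described above can be sketched in a few lines of Python. All class and method names here are illustrative, not part of the standard; the sketch only shows that an AIW is an aggregation of AIMs and that Interoperability means a functionally equivalent AIM can be swapped in without touching the rest of the workflow:

```python
# Illustrative sketch of the AIW/AIM aggregation model.
# Names and the pipeline execution order are hypothetical,
# not defined by MPAI-AIF.

class AIM:
    """A processing element with a Function mapping inputs to outputs."""
    def __init__(self, name, function):
        self.name = name
        self.function = function

    def process(self, data):
        return self.function(data)

class AIW:
    """A structured aggregation of AIMs implementing a Use Case."""
    def __init__(self, aims):
        self.aims = aims  # executed here as a simple pipeline

    def process(self, data):
        for aim in self.aims:
            data = aim.process(data)
        return data

# Interoperability: an AIM Implementation is replaced by a
# functionally equivalent one without changing the AIW.
denoise_v1 = AIM("Denoiser", lambda x: [v for v in x if abs(v) > 0.1])
denoise_v2 = AIM("Denoiser", lambda x: list(filter(lambda v: abs(v) > 0.1, x)))

workflow = AIW([denoise_v1])
workflow.aims[0] = denoise_v2               # drop-in replacement
print(workflow.process([0.05, 0.5, -0.7]))  # [0.5, -0.7]
```

Both implementations produce the same outputs for the same inputs, which is exactly the property the Interoperability definition relies on.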

 

MPAI also defines three Interoperability Levels for an AIW executed in an AIF and for its AIMs:

Level 1 – Implementer-specific and satisfying the MPAI-AIF Standard.

Level 2 – Specified by an MPAI Application Standard.

Level 3 – Specified by an MPAI Application Standard and certified by a Performance Assessor.

 

MPAI offers Users access to the promised benefits of AI with a guarantee of increased transparency, trust and reliability as the Interoperability Level of an Implementation moves from 1 to 3.

 

Figure 1 depicts the MPAI-AIF Reference Model under which Implementations of MPAI Application Standards and user-defined MPAI-AIF Conforming applications operate.

MPAI Application Standards normatively specify the Syntax and Semantics of the input and output data and the Function of the AIW and the AIMs, and the Connections between and among the AIMs of an AIW.
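As an informal illustration of what such a normative specification pins down, a toy description of an AIW, listing its AIMs and the Connections between their Ports, might look as follows. The field names are invented for illustration only; the normative Metadata format is the one defined in the Metadata chapter of this Technical Specification (the AIM names are borrowed from the CAE-EAE examples):

```python
import json

# Toy description of an AIW topology: AIMs plus the Channels connecting
# an output Port of one AIM to an input Port of another.  All field
# names are illustrative, not the normative MPAI-AIF Metadata schema.
aiw = {
    "AIW": "ExampleWorkflow",
    "AIMs": ["AnalysisTransform", "NoiseCancellation", "SynthesisTransform"],
    "Topology": [
        {"from": "AnalysisTransform.out", "to": "NoiseCancellation.in"},
        {"from": "NoiseCancellation.out", "to": "SynthesisTransform.in"},
    ],
}

def topology_is_consistent(desc):
    """Every Channel endpoint must name a declared AIM."""
    aims = set(desc["AIMs"])
    return all(
        ch["from"].split(".")[0] in aims and ch["to"].split(".")[0] in aims
        for ch in desc["Topology"]
    )

print(topology_is_consistent(aiw))  # True
print(json.dumps(aiw, indent=2))    # serialisable, as JSON-based Metadata is
```

A consistency check of this kind is the sort of property a Conformance Testing process can verify mechanically once syntax and topology are normatively specified.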

 

Figure 1 – The AI Framework (AIF) Reference Model and its Components

 

In particular, an AIM is defined by its Function and data, but not by its internal architecture, which may be based on AI or data processing, and implemented in software, hardware or hybrid software and hardware technologies.

 

MPAI Standards are designed to enable a User to obtain, via standard protocols, an Implementation of an AIW and of the set of corresponding AIMs and execute it in an AIF Implementation. The MPAI Store in Figure 1 is an entity from which Implementations are downloaded. MPAI Standards assume that the AIF, AIW, and AIM Implementations may have been developed by independent implementers. A necessary condition for this to be possible is that any AIF, AIW, and AIM Implementations be uniquely identified. MPAI has appointed an ImplementerID Registration Authority (IIDRA) to assign unique ImplementerIDs (IID) to Implementers.[1]

 

A necessary condition to make possible the operations described in the paragraph above is the existence of an ecosystem composed of Conformance Testers, Performance Assessors, an instance of the IIDRA and of the MPAI Store. Reference [26] provides an informative example of such an ecosystem.

 

The chapters and the annexes of this Technical Specification are Normative unless they are labelled as Informative.

 

2          Scope of Standard

The MPAI AI Framework (MPAI-AIF) Technical Specification specifies the architecture, interfaces, protocols, and Application Programming Interfaces (API) of an AI Framework specially designed for execution of AI-based implementations, but also suitable for mixed AI and traditional data processing workflows.

MPAI-AIF provides the following main features, organised in two sets:

Basic functionalities:

  • Independent of the Operating System.
  • Component-based modular architecture with specified interfaces.
  • Interfaces encapsulate Components to abstract them from the development environment.
  • Interface with the Store enabling access to validated Components.
  • Components can be implemented as:
    • Software only, from MCUs to HPC.
    • Hardware only.
    • Hybrid hardware-software.
  • Component system features are:
    • Execution in local and distributed Zero-Trust architectures [28].
    • Possibility to interact with other Implementations operating in proximity.
    • Direct support to Machine Learning functionalities.
  • The AIF can download an AIW whose identifier has been specified by the User Agent or by a configuration parameter.
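The last bullet, the Controller resolving an AIW identifier against the Store, could be pictured as follows. The Store interface, the identifier format, and all names are purely illustrative assumptions, not the Store API specified later in this document:

```python
# Illustrative sketch: the Controller fetches an AIW by identifier from
# the Store.  The Store representation and "AIW:..." identifier scheme
# are hypothetical, not part of MPAI-AIF.
STORE = {
    "AIW:CAE-EAE": {"aims": ["AnalysisTransform", "Packager"]},
}

class Controller:
    def __init__(self, store):
        self.store = store
        self.loaded = {}

    def download_aiw(self, aiw_id):
        """Resolve an identifier (from the User Agent or a configuration
        parameter) and retain the downloaded AIW for execution."""
        if aiw_id not in self.store:
            raise KeyError(f"AIW {aiw_id!r} not found in Store")
        self.loaded[aiw_id] = self.store[aiw_id]
        return self.loaded[aiw_id]

ctrl = Controller(STORE)
aiw = ctrl.download_aiw("AIW:CAE-EAE")  # identifier supplied by User Agent
print(aiw["aims"])  # ['AnalysisTransform', 'Packager']
```

The point of the sketch is only the control flow: the identifier comes from outside the AIF (User Agent or configuration), and the Controller is the Component that performs the retrieval.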

 

Secure functionalities:

  • The Framework provides access to the following Trusted Services:
    • A selected range of cyphering algorithms.
    • A basic attestation function.
    • Secure storage (RAM, internal/external flash, or internal/external/remote disk).
    • Certificate-based secure communication.
  • The AIF can execute only one AIW containing only one AIM. The AIM has the following features:
    • The AIM may be a Composite AIM.
    • The AIMs of the Composite AIM cannot access the Secure API.
  • The AIF Trusted Services may rely on hardware and OS security features already existing in the hardware and software of the environment in which the AIF is implemented.
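As a rough illustration of one of the Trusted Services listed above, the toy model below shows integrity-protected secure storage using only Python's standard library. It is a sketch of the idea, not the Secure Storage API of Section 9: a conforming AIF would rely on the platform's security features (e.g., a PSA-style secure storage service), and the key handling here is deliberately simplistic:

```python
import hmac
import hashlib

# Toy integrity-protected storage: each value is stored together with an
# HMAC tag so tampering is detected on read.  Purely illustrative; real
# secure storage would also encrypt data and bind keys to the device.
class SecureStorage:
    def __init__(self, key: bytes):
        self._key = key
        self._data = {}

    def _tag(self, blob: bytes) -> bytes:
        return hmac.new(self._key, blob, hashlib.sha256).digest()

    def write(self, uid: str, blob: bytes):
        self._data[uid] = (blob, self._tag(blob))

    def read(self, uid: str) -> bytes:
        blob, tag = self._data[uid]
        if not hmac.compare_digest(tag, self._tag(blob)):
            raise ValueError("storage integrity check failed")
        return blob

store = SecureStorage(key=b"device-bound-secret")
store.write("model-weights", b"\x01\x02\x03")
print(store.read("model-weights"))  # b'\x01\x02\x03'
```

Confining such services behind a dedicated API is what allows the AIMs of a Composite AIM to be denied direct access to the Secure API, as stated above.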

 

The current version of the MPAI-AIF Technical Specification has been developed by the MPAI AI Framework Development Committee (AIF-DC). Future Versions may revise and/or extend the Scope of the Standard.

 

3          Terms and Definitions

The terms used in this standard that begin with a capital letter have the meaning defined in Table 1.

 

Table 1 – Table of terms and definitions

 

Term Definition
Access Static or slowly changing data that are required by an application such as domain knowledge data, data models, etc.
AI Framework (AIF) The environment where AIWs are executed.
AI Module (AIM) A processing element receiving AIM-specific Inputs and producing AIM-specific Outputs according to its Function. An AIM may be an aggregation of AIMs. AIMs operate in the Trusted Zone.
AI Workflow (AIW) A structured aggregation of AIMs implementing a Use Case receiving AIM-specific inputs and producing AIM-specific outputs according to its Function. AIWs operate in the Trusted Zone.
AIF Metadata The data set describing the capabilities of an AIF set by the AIF Implementer.
AIM Metadata The data set describing the capabilities of an AIM set by the AIM Implementer.
AIM Storage A Component to store data of individual AIMs. An AIM may only access its own data. The AIM Storage is part of the Trusted Zone.
AIW Metadata The data set describing the capabilities of an AIW set by the AIW Implementer.
Channel A physical or logical connection between an output Port of an AIM and an input Port of an AIM. The term “connection” is also used as a synonym. Channels are part of the Trusted Zone.
Communication The infrastructure that implements message passing between AIMs. Communication operates in the Trusted Zone.
Component One of the 9 AIF elements: Access, AI Module, AI Workflow, Communication, Controller, AIM Storage, Shared Storage, Store, and User Agent.
Composite AIM An AIM aggregating more than one AIM.
Controller A Component that manages and controls the AIMs in the AIWs, so that they execute in the correct order and at the time when they are needed. The Controller operates in the Trusted Zone.
Data Type An instance of the Data Types defined by 6.1.1.
Device A hardware and/or software entity running at least one instance of an AIF.
Event An occurrence acted on by an Implementation.
External Port An input or output Port simulating communication with an external Controller.
Knowledge Base Structured and/or unstructured information made accessible to AIMs via MPAI-specified interfaces.
Message A sequence of Records.
MPAI Ontology A dynamic collection of terms with a defined semantics managed by MPAI.
MPAI Server A remote machine executing one or more AIMs.
Remote Port A Port number associated with a specific remote AIM.
Store The repository of Implementations.
Port A physical or logical communication interface of an AIM.
Record Data with a specified Format.
Resource policy The set of conditions under which specific actions may be applied.
Security Abstraction Layer (SAL) The set of Trusted Services that provide security functionalities to AIF.
Shared Storage A Component to store data shared among AIMs. The Shared Storage is part of the Trusted Zone.
Status The set of parameters characterising a Component.
Structure A composition of Records
Swarm Element An AIF in a proximity-based scenario.
Time Base The protocol specifying how Components can access timing information. The Time Base is part of the Trusted Zone.
Topology The set of Channels connecting AIMs in an AIW.
Trusted Zone An environment that contains only trusted objects, i.e., objects that do not require further authentication.
User Agent The Component interfacing the user with an AIF through the Controller
Zero Trust A cybersecurity model primarily focused on data and service protection that assumes no implicit trust [28].
Security Abstraction Layer A layer acting as a bridge between the AIMs and the Controller on one side and the security functions on the other.

 

4          References

4.1        Normative references

MPAI-AIF normatively references the following documents:

  1. GIT protocol, https://git-scm.com/book/en/v2/Git-on-the-Server-The-Protocols.
  2. ZIP format, https://pkware.cachefly.net/webdocs/casestudies/APPNOTE.TXT.
  3. Date and Time in the Internet: Timestamps; IETF RFC 3339; July 2002.
  4. Uniform Resource Identifiers (URI): Generic Syntax, IETF RFC 2396, August 1998.
  5. The JavaScript Object Notation (JSON) Data Interchange Format; https://datatracker.ietf.org/doc/html/rfc8259; IETF RFC 8259; December 2017.
  6. JSON Schema; https://json-schema.org/.
  7. BNF Notation for syntax; https://www.w3.org/Notation.html
  8. MPAI; The MPAI Ontology; https://mpai.community/standards/mpai-aif/mpai-ontology/
  9. MPAI; The MPAI Statutes; https://mpai.community/statutes/
  10. MPAI; The MPAI Patent Policy; https://mpai.community/about/the-mpai-patent-policy/.
  11. Framework Licence of the Artificial Intelligence Framework Technical Specification (MPAI-AIF); https://mpai.community/standards/mpai-aif/framework-licence/
  12. Bormann, C. and P. Hoffman, Concise Binary Object Representation (CBOR), December 2020. https://rfc-editor.org/info/std94
  13. Schaad, J., CBOR Object Signing and Encryption (COSE): Structures and Process, August 2022. https://rfc-editor.org/info/std96
  14. IETF Entity Attestation Token (EAT), Draft. https://datatracker.ietf.org/doc/draft-ietf-rats-eat
  15. IEEE 1619-2018, IEEE Standard for Cryptographic Protection of Data on Block-Oriented Storage Devices, January 2019. https://ieeexplore.ieee.org/servlet/opac?punumber=8637986
  16. IETF, The MD5 Message-Digest Algorithm, April 1992. https://tools.ietf.org/html/rfc1321.html
  17. [RFC6979] IETF, Deterministic Usage of the Digital Signature Algorithm (DSA) and Elliptic Curve Digital Signature Algorithm (ECDSA), August 2013. https://tools.ietf.org/html/rfc6979.html
  18. [RFC7539] IETF, ChaCha20 and Poly1305 for IETF Protocols, May 2015. https://tools.ietf.org/html/rfc7539.html
  19. [RFC7919] IETF, Negotiated Finite Field Diffie-Hellman Ephemeral Parameters for Transport Layer Security (TLS), August 2016. https://tools.ietf.org/html/rfc7919.html
  20. [RFC8017] IETF, PKCS #1: RSA Cryptography Specifications Version 2.2, November 2016. https://tools.ietf.org/html/rfc8017.html
  21. [RFC8032] IRTF, Edwards-Curve Digital Signature Algorithm (EdDSA), January 2017. https://tools.ietf.org/html/rfc8032.html
  22. Standards for Efficient Cryptography, SEC 1: Elliptic Curve Cryptography, May 2009. https://www.secg.org/sec1-v2.pdf
  23. NIST, FIPS Publication 202: SHA-3 Standard: Permutation-Based Hash and Extendable-Output Functions, August 2015. https://doi.org/10.6028/NIST.FIPS.202
  24. NIST, NIST Special Publication 800-38A: Recommendation for Block Cipher Modes of Operation: Methods and Techniques, December 2001. https://doi.org/10.6028/NIST.SP.800-38A
  25. NIST, NIST Special Publication 800-38D: Recommendation for Block Cipher Modes of Operation: Galois/Counter Mode (GCM) and GMAC, November 2007. https://doi.org/10.6028/NIST.SP.800-38D

4.2        Informative references

  1. Technical Specification: The Governance of the MPAI Ecosystem V1, 2021; https://mpai.community/standards/mpai-gme/
  2. Message Passing Interface (MPI), https://www.mcs.anl.gov/research/projects/mpi/
  3. Rose, Scott; Borchert, Oliver; Mitchell, Stu; Connelly, Sean; “Zero Trust Architecture”; https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-207.pdf
  4. MPAI Technical Specification; Context-based Audio Enhancement (MPAI-CAE) V1; https://mpai.community/standards/resources/.
  5. MPAI Technical Specification; Compression and Understanding of Industrial Data (MPAI-CUI) V1; https://mpai.community/standards/resources/.
  6. MPAI Technical Specification; Multimodal Conversation (MPAI-MMC) V1; https://mpai.community/standards/resources/.
  7. Wang, J. Gao, M. Zhang, S. Wang, G. Chen, T. K. Ng, B. C. Ooi, J. Shao, and M. Reyad, “Rafiki: machine learning as an analytics service system,” Proceedings of the VLDB Endowment, vol. 12, no. 2, pp. 128–140, 2018.
  8. Lee, A. Scolari, B.-G. Chun, M. D. Santambrogio, M. Weimer, and M. Interlandi; PRETZEL: Opening the black box of machine learning prediction serving systems; in 13th USENIX Symposium on Operating Systems Design and Implementation (OSDI18), pp. 611–626, 2018.
  9. NET [ONLINE]; https://dotnet.microsoft.com/apps/machinelearning-ai/ml-dotnet.
  10. Crankshaw, X. Wang, G. Zhou, M. J. Franklin, J. E. Gonzalez, and I. Stoica; Clipper: A low-latency online prediction serving system; in NSDI, pp. 613–627, 2017.
  11. Zhao, M. Talasila, G. Jacobson, C. Borcea, S. A. Aftab, and J. F. Murray; Packaging and sharing machine learning models via the acumos ai open platform; in 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), pp. 841–846, IEEE, 2018.
  12. Apache Prediction I/O; https://predictionio.apache.org/.
  13. Sculley, G. Holt, D. Golovin, E. Davydov, T. Phillips, D. Ebner, V. Chaudhary, M. Young, J. Crespo, D. Dennison; Hidden technical debt in Machine learning systems Share; on NIPS’15: Proceedings of the 28th International Conference on Neural Information Processing Systems – Volume 2; December 2015 Pages 2503–2511
  14. Arm, “PSA Certified Crypto API 1.1,” IHI 0086, issue 2,23/03/2022, https://arm-software.github.io/psa-api/crypto/1.1/
  15. Arm, “PSA Certified Secure Storage API 1.0,” IHI 0087, issue 2, 23/03/2023, https://arm-software.github.io/psa-api/storage/1.0/
  16. Arm, “PSA Certified Attestation API 1.0,” IHI 0085, issue 3, 17/10/2022, https://arm-software.github.io/psa-api/attestation/1.0/

 

5          Architecture of the AI Framework

5.1        AI Framework Components

This Version of MPAI-AIF adds a Secure Profile, providing Security functionalities on top of the Basic Profile of Version 1.1, with the following restrictions:

  • There is only one AIW containing only one AIM – which may be a Composite AIM.
  • The AIM implementer guarantees the security of the AIM by calling the security API.
  • The AIF application developer cannot securely access the Composite AIM internals.

5.1.1        Components for Basic Functionalities

Figure 2 specifies the MPAI-AIF Components supported by Version 1.1.

 

Figure 2 – The MPAI-AIF V1 Reference Model

The specific functions of the Components are:

  1. Controller:
    • Provides basic functionalities such as scheduling, communication between AIMs and with AIF Components such as AIM Storage and Global Storage.
    • Acts as a resource manager, according to instructions given by the User through the User Agent.
    • Can interact by default with all the AIMs in a given AIF.
    • Activates/suspends/resumes/deactivates AIWs based on User’s or other inputs.
    • May support complex application scenarios by balancing load and resources.
    • Accesses the MPAI Store APIs to download AIWs and AIMs.
    • Exposes three APIs:
      • AIM APIs enable AIMs to communicate with it (register themselves, communicate and access the rest of the AIF environment). An AIW is an AIM with additional metadata. Therefore, an AIW uses the same AIM API.
      • User APIs enable User or other Controllers to perform high-level tasks (e.g., switch the Controller on and off, give inputs to the AIW through the Controller).
      • Controller-to-Controller API enables interactions among Controllers.
    • May run an AIW on different computing platforms and may run more than one AIW.
    • May communicate with other Controllers.
  2. Communication: connects the AIF Components via Events or Channels connecting an output Port of an AIM with an input Port of another AIM. Communication has the following characteristics:
    • The Communication Component is turned on jointly with the Controller.
    • The Communication Component need not be persistent.
    • Channels are unicast and may be physical or logical.
    • Messages are transmitted via Channels. They are composed of sequences of Records and may be of two types:
      • High-Priority Messages expressed as up to 16-bit integers.
      • Normal-Priority Messages expressed as MPAI-AIF defined types (1.1).
    • Messages may be communicated through Channels or Events.
  3. AI Module (AIM): a data processing element with a specified Function, receiving AIM-specific inputs and producing AIM-specific outputs, having the following characteristics:
    • Communicates with other Components through Ports or Events.
    • Includes at least one input Port and one output Port.
    • May incorporate other AIMs.
    • May be hot-pluggable, and dynamically register and disconnect itself on the fly.
    • May be executed:
      • Locally, e.g., it encapsulates hardware physically accessible to the Controller.
      • On different computing platforms, e.g., in the cloud or on swarms of drones, and encapsulates communication with a remote Controller.
  4. AI Workflow (AIW): an organised aggregation of AIMs receiving AIM-specific inputs and producing AIM-specific outputs according to its Function, implementing a Use Case that is either proprietary or specified by an MPAI Application Standard.
  5. Global Storage: stores data shared by AIMs.
  6. AIM Storage: stores data of individual AIMs.
  7. User Agent: interfaces the User with an AIF through the Controller.
  8. Access: provides access to static or slowly changing data required by AIMs, such as domain knowledge data, data models, etc.
  9. MPAI Store: stores Implementations for users to download via secure protocols.
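The two Message types handled by the Communication Component can be modelled as a tagged union. The sketch below is illustrative only; the type and function names (aif_message_t, make_high_priority) are assumptions, not part of the normative API.

```c
#include <stdint.h>
#include <stddef.h>
#include <assert.h>

/* Illustrative sketch of the two Message types of the Communication
   Component. All names are hypothetical. */

typedef enum {
    MSG_HIGH_PRIORITY,   /* expressed as up to 16-bit integers */
    MSG_NORMAL_PRIORITY  /* composed of a sequence of Records */
} msg_priority_t;

typedef struct {
    msg_priority_t priority;
    union {
        uint16_t signal;          /* High-Priority payload */
        struct {
            const void *records;  /* sequence of Records */
            size_t      count;
        } body;                   /* Normal-Priority payload */
    } payload;
} aif_message_t;

/* Build a High-Priority Message carrying a 16-bit signal. */
static aif_message_t make_high_priority(uint16_t signal) {
    aif_message_t m = {0};
    m.priority = MSG_HIGH_PRIORITY;
    m.payload.signal = signal;
    return m;
}
```

A High-Priority Message is then just a small integer travelling on a Channel, while a Normal-Priority Message carries a Record sequence as defined in 6.1.1.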

5.1.2        Components for Security Functionalities

The AIF Components have the following features:

  1. The AIW
    • The AIMs in the AIW trust each other and communicate without special security concerns.
    • Communication among AIMs in the Composite AIM is non-secure.
  2. The Controller
    • Communicates securely with the MPAI-Store and the User Agent (Authentication, Attestation, and Encryption).
    • Accesses Communication, Global Storage, Access and MPAI Store via Trusted Services API.
    • Is split in two parts:
      • Secure Controller accesses Secure Communication and Secure Storage.
      • Non-Secure Controller can access the non-secure parts of the AIF.
    • Interfaces with the User Agent in the area where non-secure code is executed.
    • Interfaces with the Composite AIM in the area where secure code is executed.
  3. AIM/AIW Storage
    • Secure Storage functionality is provided through key exchange.
    • Non-secure functionality is provided without reference to secure API calls.
  4. The AIW/AIMs call the Security Abstraction Layer via API.
  5. The AIMs of a Composite AIM shall run on the same computing platform.

 

Figure 3 specifies the MPAI-AIF Components operating in the secure environment created by the Security Abstraction Layer.

 

Figure 3 – The MPAI-AIF V2 Reference Model

5.2        AI Framework Implementations

MPAI-AIF enables a wide variety of Implementations:

  1. AIF Implementations can be tailored to different execution environments, e.g., High-Performance Computing systems or resource-constrained computing boards. For instance, the Controller might be a process on a HPC system or a library function on a computing board.
  2. There is always a Controller even if the AIF is a lightweight Implementation.
  3. The API may have different MPAI-defined Profiles to allow for Implementations:
    1. To run on different computing platforms and different programming languages.
    2. To be based on different hardware and resources available.
  4. AIMs may be Implemented in hardware, software and mixed-hardware and software.
  5. Interoperability between AIMs is ensured by the way communication between AIMs is defined, irrespective of whether they are implemented in hardware or software.
  6. Use of Ports and Channels ensures that compatible AIM Ports may be connected together irrespective of the AIMs’ implementation technology.
  7. Message generation and Event management is implementation independent.

5.3        AIMs

5.3.1        Implementation types

AIMs can be implemented in either hardware or software, keeping the same interfaces regardless of the implementation technology. However, the nature of the AIM might impose constraints on the specific values of certain API parameters, and different Profiles may impose different constraints. For instance, Events are easy to accommodate in software but less so in hardware, while persistent Channels are easy to realise in hardware but less so in software.

While software-software and hardware-hardware connections are homogeneous, a hybrid hardware-software scenario is inherently heterogeneous and requires the specification of additional communication protocols, which are used to wrap the hardware part and connect it to software. A list of such protocols is provided by the MPAI Ontology [8].

Examples of supported architectures are:

  • CPU-based devices running an operating system.
  • Memory-mapped devices (FPGAs, GPUs, TPUs) which are presented as accelerators.
  • Cloud-based frameworks.
  • Naked hardware devices (i.e., IP in FPGAs) that communicate through hardware Ports.
  • Encapsulated blocks of a hardware design (i.e., IP in FPGAs) that communicate through a memory-mapped bus. In this case, the Metadata associated with the AIM (see 3) shall also specify the low-level communication protocol used by the Ports.

5.3.2        Combination

MPAI-AIF supports the following ways of combining AIMs:

  • Software AIMs connected to other software AIMs resulting in a software AIM.
  • Non-encapsulated hardware blocks connected to other non-encapsulated hardware blocks, resulting in a larger, non-encapsulated hardware AIM.
  • Encapsulated hardware blocks connected to either other encapsulated hardware blocks or other software blocks, resulting in a larger software AIM.

Connection between a non-encapsulated hardware AIM and a software AIM is not supported as in such a case direct communication between the AIMs cannot be defined in any meaningful way.

5.3.3        Hardware-software compatibility

To achieve communication among AIMs irrespective of their implementation technology, the requirements of the following two cases should be satisfied:

  1. Hardware AIM to Hardware AIM: Each named type in a Structure is transmitted as a separate channel. Vector types are implemented as two channels, one transmitting the size and the second transmitting the data.
  2. All other combinations: Fill out a Structure by recursively traversing the definition (breadth-first). Sub-fields are laid down according to their type, in little-endian order.
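Case 2 can be illustrated with helpers that lay down 16- and 32-bit sub-fields in little-endian order regardless of the host platform's endianness. The helper names are assumptions, not part of the normative API.

```c
#include <stdint.h>
#include <stddef.h>
#include <assert.h>

/* Illustrative sketch: laying down Record sub-fields in little-endian
   order, independently of host endianness. Names are hypothetical. */

/* Write a 16-bit value at offset off; return the next free offset. */
static size_t put_u16_le(uint8_t *buf, size_t off, uint16_t v) {
    buf[off]     = (uint8_t)(v & 0xFF);
    buf[off + 1] = (uint8_t)(v >> 8);
    return off + 2;
}

/* Write a 32-bit value at offset off; return the next free offset. */
static size_t put_u32_le(uint8_t *buf, size_t off, uint32_t v) {
    buf[off]     = (uint8_t)(v & 0xFF);
    buf[off + 1] = (uint8_t)((v >> 8) & 0xFF);
    buf[off + 2] = (uint8_t)((v >> 16) & 0xFF);
    buf[off + 3] = (uint8_t)(v >> 24);
    return off + 4;
}
```

Because the byte order is fixed by explicit shifts rather than by memory copies, a hardware receiver and a software sender produce and consume identical buffers.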

5.3.4        Actual implementations

5.3.4.1       Hardware

Metadata ensures that hardware blocks can be directly connected to other hardware/software blocks, provided the specification platforms for the two blocks have compatible interfaces, i.e., they have compatible Ports and Channels.

5.3.4.2       Software

Software Implementations shall ensure that Communication among different constituent AIMs, and with other AIMs outside the block, is performed correctly.

In addition, AIM software Implementations shall contain a number of well-defined steps so as to ensure that the Controller is correctly initialised and remains in a consistent internal state, i.e.:

  1. Code registering the different AIMs used by the AIW. The registration operation specifies where the AIMs will be executed, either locally or remotely. The AIM Implementations are archives downloaded from the Store containing source code, binary code and hardware designs executed on a local machine/HPC cluster/MPC machine or a remote machine.
  2. Code starting/stopping the AIMs.
  3. Code registering the input/output Ports for the AIM.
  4. Code instantiating unicast channels between AIM Ports belonging to AIMs used by the AIW, and connections from/to the AIM being defined to/from remote AIMs.
  5. Registering Ports and connecting them may result in a number of steps performed by the Controller – some suitable data structure (including, for instance, data buffers) will be allocated for each Port or Channel, in order to support the functions specified by the Controller API called by the AIM (3).
  6. Explicitly write/read data to/from any of the existing Ports.
  7. In general, arbitrary functionality can be added to a software AIM. For instance, depending on the AIM Function, one would typically link libraries that allow a GPU or FPGA to be managed through Direct Memory Access (DMA), or link and use high-level libraries (e.g., Tensor­Flow) that implement AI-related functionality.
  8. The API implementation depends on the architecture the Implementation is designed for.

6          Metadata

Metadata specifies static properties pertaining to the interaction between:

  1. A Controller and its hosting hardware.
  2. An AIW and its hosting Controller.
  3. An AIM and the AIW it belongs to.

Metadata specified in the following Sections is represented in JSON Schema.

6.1        Communication channels and their data types

This Section specifies how Metadata pertaining to a communication Channel is defined.

6.1.1        Type system

The data interchange happening through buffers involves the exchange of structured data.

Message data types exchanged through Ports and communication Channels are defined by the following Backus–Naur Form (BNF) specification [7]. Words in bold typeface are keywords; capitalised words such as NAME are tokens.

 

fifo_type :=
  |                                  /* the empty type */
  | base_type NAME

recursive_type :=
  | recursive_base_type NAME

base_type :=
  | toplevel_base_type
  | recursive_base_type
  | ( base_type )

toplevel_base_type :=
  | array_type
  | toplevel_struct_type
  | toplevel_variant_type

array_type :=
  | recursive_base_type []

toplevel_struct_type :=
  | { one_or_more_fifo_types_struct }

one_or_more_fifo_types_struct :=
  | fifo_type
  | fifo_type ; one_or_more_fifo_types_struct

toplevel_variant_type :=
  | { one_or_more_fifo_types_variant }

one_or_more_fifo_types_variant :=
  | fifo_type | fifo_type
  | fifo_type | one_or_more_fifo_types_variant

recursive_base_type :=
  | signed_type
  | unsigned_type
  | float_type
  | struct_type
  | variant_type

signed_type :=
  | int8
  | int16
  | int32
  | int64

unsigned_type :=
  | uint8 | byte
  | uint16
  | uint32
  | uint64

float_type :=
  | float32
  | float64

struct_type :=
  | { one_or_more_recursive_types_struct }

one_or_more_recursive_types_struct :=
  | recursive_type
  | recursive_type ; one_or_more_recursive_types_struct

variant_type :=
  | { one_or_more_recursive_types_variant }

one_or_more_recursive_types_variant :=
  | recursive_type | recursive_type
  | recursive_type | one_or_more_recursive_types_variant

Valid types for FIFOs are those defined by the production fifo_type.

Although this syntax allows specifying types that have a fixed length, the general Record type written to, or read from, a Port will not have a fixed length. If an AIM implemented in hardware receives data from an AIM implemented in software, the data format should be harmonised with the limitations of the hardware AIM.

6.1.2        Mapping the type to buffer contents

The Type definition makes it possible to derive an automated way of filling and transmitting buffers for both hardware and software implementations. Data structures are turned into low-level memory buffers, filled out by recursively traversing the definition (breadth-first). Sub-fields are laid down according to their type, in little-endian order.

For instance, a definition for transmitting a video frame through a FIFO might be:

{int32 frameNumber; int16 x; int16 y; byte[] frame} frame_t

and the corresponding memory layout would be

[32 bits: frameNumber | 16 bits: x | 16 bits: y | 32 bits: size(frame) | 8*size(frame) bits: frame].
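As a sketch, the frame_t layout above could be produced as follows; the C struct and the pack_frame helper are assumptions made for illustration, not part of the normative API.

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>
#include <assert.h>

/* Illustrative sketch: packing the frame_t example
   {int32 frameNumber; int16 x; int16 y; byte[] frame}
   into a little-endian buffer with the layout given above. */

typedef struct {
    int32_t        frameNumber;
    int16_t        x;
    int16_t        y;
    const uint8_t *frame;
    uint32_t       frame_size;   /* size(frame), sent before the data */
} frame_t;

/* Returns the number of bytes written into buf. */
static size_t pack_frame(uint8_t *buf, const frame_t *f) {
    size_t off = 0;
    uint32_t n  = (uint32_t)f->frameNumber;
    uint16_t vx = (uint16_t)f->x, vy = (uint16_t)f->y;

    for (int i = 0; i < 4; i++) buf[off++] = (uint8_t)(n  >> (8 * i));
    for (int i = 0; i < 2; i++) buf[off++] = (uint8_t)(vx >> (8 * i));
    for (int i = 0; i < 2; i++) buf[off++] = (uint8_t)(vy >> (8 * i));
    for (int i = 0; i < 4; i++) buf[off++] = (uint8_t)(f->frame_size >> (8 * i));
    memcpy(buf + off, f->frame, f->frame_size);
    return off + f->frame_size;
}
```

The vector field is prefixed by its 32-bit size, so a receiver can parse the buffer without out-of-band length information.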

API functions are provided to parse the content of raw memory buffers in a platform- and implementation-independent fashion (see Subsection 8.3.7).

6.2        AIF Metadata

AIF Metadata is specified in terms of JSON Schema [6] definition.

 

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$id": "https://mpai.community/standards/MPAI-AIF/V1/AIF-metadata.schema.json",
  "title": "MPAI-AIF V1 AIF metadata",
  "type": "object",
  "properties": {
    "ImplementerID": {
      "description": "String assigned by IIDRA",
      "type": "string"
    },
    "Version": {
      "description": "Provided by the Implementer. Replaced by '*' in technical specifications",
      "type": "string"
    },
    "APIProfile": {
      "description": "Provided in Chapter 10. Selected by the Implementer",
      "type": "string",
      "enum": ["Basic", "Secure"]
    },
    "Resources": {
      "ComputingPolicies": {
        "description": "A set of policies describing computing resources made available to AIWs",
        "type": "array",
        "items": {
          "description": "A policy describing computing resources made available to AIWs",
          "type": "object",
          "properties": {
            "Name": {
              "description": "An entry in the MPAI-specified Ontology",
              "type": "string"
            },
            "Minimum": {
              "description": "An entry in the MPAI-specified Ontology",
              "type": "string"
            },
            "Maximum": {
              "description": "An entry in the MPAI-specified Ontology",
              "type": "string"
            }
          },
          "required": ["Name"]
        }
      },
      "Storage": {
        "description": "An entry in the MPAI-specified Ontology",
        "type": "string"
      },
      "Controller": {
        "description": "An entry in the MPAI-specified Ontology",
        "type": "string"
      },
      "Extension": {
        "description": "An entry in the MPAI-specified Ontology",
        "type": "string"
      }
    },
    "Services": {
      "Communication": {
        "description": "An entry in the MPAI-specified Ontology",
        "type": "string"
      },
      "Trusted Services": {
        "Communication": {
          "description": "An entry in the MPAI-specified Ontology",
          "type": "string"
        },
        "Authentication": {
          "description": "An entry in the MPAI-specified Ontology",
          "type": "string"
        },
        "Encryption": {
          "description": "An entry in the MPAI-specified Ontology",
          "type": "string"
        },
        "Attestation": {
          "description": "An entry in the MPAI-specified Ontology",
          "type": "string"
        },
        "Extension": {
          "description": "An entry in the MPAI-specified Ontology",
          "type": "string"
        }
      },
      "TimeBase": {
        "description": "A protocol providing a time base. If absent, timestamps are interpreted according to the host time clock (absolute time with the appropriate time-scale conversion)",
        "type": "string",
        "enum": ["NTP", "RTP", "RTCP"]
      }
    }
  },
  "required": ["ImplementerID", "Version", "Authentication"]
}

6.3        AIW/AIM Metadata

AIM Metadata specifies static, abstract properties pertaining to one or more AIM implementations, and how the AIM will interact with the Controller.

AIW/AIM Metadata is specified in terms of JSON Schema [6] definition.

 

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$id": "https://mpai.community/standards/MPAI-AIF/V1/AIW-AIM-metadata.schema.json",
  "id": "#root",
  "title": "MPAI-AIF V1 AIW/AIM metadata",
  "type": "object",
  "properties": {
    "Identifier": {
      "id": "#identifier",
      "description": "Information uniquely identifying an AIW/AIM implementation",
      "type": "object",
      "properties": {
        "ImplementerID": {
          "description": "String assigned by IIDRA",
          "type": "string"
        },
        "Specification": {
          "oneOf": [
            {
              "description": "An AIW/AIM defined by an MPAI standard",
              "type": "object",
              "properties": {
                "Standard": {
                  "description": "Defined by the Standard",
                  "type": "string"
                },
                "AIW": {
                  "description": "Defined by the Standard",
                  "type": "string"
                },
                "AIM": {
                  "description": "Same as AIW when the Metadata being defined describes the AIW, otherwise the name of the AIM as defined by the Standard",
                  "type": "string"
                },
                "Version": {
                  "description": "Defined by the Standard",
                  "type": "string"
                },
                "Profile": {
                  "description": "Provided by MPAI. Selected by the Implementer",
                  "type": "array",
                  "items": {
                    "type": "string",
                    "enum": ["Base", "Main", "High"]
                  }
                }
              },
              "required": ["Standard", "AIW", "AIM", "Version"]
            },
            {
              "description": "An AIW/AIM defined by an Implementer",
              "type": "object",
              "properties": {
                "Name": {
                  "description": "Provided by the Implementer",
                  "type": "string"
                },
                "Version": {
                  "description": "Provided by the Implementer",
                  "type": "string"
                }
              },
              "required": ["Name", "Version"]
            }
          ]
        }
      },
      "required": ["ImplementerID", "Specification"]
    },
    "APIProfile": {
      "description": "Provided by MPAI. Selected by the Implementer",
      "type": "string",
      "enum": ["Basic", "Secure"]
    },
    "Description": {
      "description": "Free text describing the AIM",
      "type": "string"
    },
    "Types": {
      "description": "A list of shorthands for Channel data types, defined according to 6.1.1",
      "type": "array",
      "items": {
        "description": "A shorthand for a Channel data type, defined according to 6.1.1",
        "type": "object",
        "properties": {
          "Name": {
            "description": "The unique shorthand used for a Channel data type",
            "type": "string"
          },
          "Type": {
            "description": "A Channel data type, defined according to 6.1.1",
            "type": "string"
          }
        },
        "required": ["Name", "Type"]
      }
    },
    "Ports": {
      "description": "A list of AIM Ports",
      "type": "array",
      "items": {
        "description": "A Port, i.e., a physical or logical interface through which the AIM communicates",
        "type": "object",
        "properties": {
          "Name": {
            "description": "Implementer-defined name",
            "type": "string"
          },
          "Direction": {
            "description": "The direction of the communication flow",
            "type": "string",
            "enum": ["OutputInput", "InputOutput"]
          },
          "RecordType": {
            "description": "Port data type defined either in the dictionary Types, or according to Section 6.1.1",
            "type": "string"
          },
          "Technology": {
            "description": "Whether the Port is implemented in hardware or software",
            "type": "string",
            "enum": ["Hardware", "Software"]
          },
          "Protocol": {
            "description": "An entry in the MPAI-specified Ontology",
            "type": "string"
          },
          "IsRemote": {
            "description": "Boolean specifying whether the port is remote",
            "type": "boolean"
          }
        },
        "required": ["Name", "Direction", "RecordType", "Technology", "Protocol", "IsRemote"]
      }
    },
    "SubAIMs": {
      "description": "A list of AIMs in terms of which the current AIM is defined",
      "type": "array",
      "items": {
        "description": "One of the AIMs in terms of which the current AIM is defined",
        "type": "object",
        "properties": {
          "Name": {
            "description": "A unique shorthand for the AIM in terms of which the current AIM is defined",
            "type": "string"
          },
          "Identifier": {
            "$ref": "#identifier"
          }
        },
        "required": ["Name", "Identifier"]
      }
    },
    "Topology": {
      "description": "A list of Channels connecting one Output to one Input Port",
      "type": "array",
      "items": {
        "description": "A Channel connecting one Output to one Input Port",
        "type": "object",
        "properties": {
          "Output": {
            "id": "#portID",
            "description": "A Port identifier",
            "type": "object",
            "properties": {
              "AIMName": {
                "description": "The unique shorthand for a SubAIM",
                "type": "string"
              },
              "PortName": {
                "description": "The unique shorthand for one of the SubAIM Ports",
                "type": "string"
              }
            },
            "required": ["AIMName", "PortName"]
          },
          "Input": {
            "$ref": "#portID"
          }
        },
        "required": ["Output", "Input"]
      }
    },
    "Implementations": {
      "description": "A list of Implementations for the AIM being defined",
      "type": "array",
      "items": {
        "description": "An Implementation for the AIM being defined",
        "type": "object",
        "properties": {
          "BinaryName": {
            "description": "Specifies an entry in the archive containing the Implementation downloaded from the Store",
            "type": "string"
          },
          "Architecture": {
            "description": "An entry in the MPAI-specified Ontology",
            "type": "string"
          },
          "OperatingSystem": {
            "description": "An entry in the MPAI-specified Ontology",
            "type": "string"
          },
          "Version": {
            "description": "An entry in the MPAI-specified Ontology",
            "type": "string"
          },
          "Source": {
            "description": "Where the AIM Implementation should be found",
            "type": "string",
            "enum": ["AIMStorage", "MPAIStore"]
          },
          "Destination": {
            "description": "If empty, the Implementation is executed locally. Otherwise, the string shall be a valid URI of an MPAI Server",
            "type": "string"
          }
        },
        "required": ["BinaryName", "Architecture", "OperatingSystem", "Version", "Source", "Destination"]
      }
    },
    "ResourcePolicies": {
      "description": "A set of policies describing computing resources needed by the AIW/AIF being defined",
      "type": "array",
      "items": {
        "description": "A policy describing computing resources needed by the AIW/AIF being defined",
        "type": "object",
        "properties": {
          "Name": {
            "description": "An entry in the MPAI-specified Ontology",
            "type": "string"
          },
          "Minimum": {
            "description": "An entry in the MPAI-specified Ontology",
            "type": "string"
          },
          "Maximum": {
            "description": "An entry in the MPAI-specified Ontology",
            "type": "string"
          },
          "Request": {
            "description": "An entry in the MPAI-specified Ontology",
            "type": "string"
          }
        },
        "required": ["Name"]
      }
    },
    "Documentation": {
      "description": "A list of references to documents specifying information relevant to the design, implementation and usage of the AIM being defined",
      "type": "array",
      "items": {
        "description": "A reference to a document specifying information relevant to the design, implementation and usage of the AIM being defined",
        "type": "object",
        "properties": {
          "Type": {
            "description": "The type of the document",
            "type": "string",
            "enum": ["Specification", "Manual", "Tutorial", "Video"]
          },
          "URI": {
            "description": "A valid URI for the document",
            "type": "string"
          }
        }
      }
    }
  },
  "required": ["Identifier", "Ports", "SubAIMs", "Topology", "Implementations"]
}

 

7          Common features of MPAI-AIF API

7.1        General

This Chapter specifies the API of the software library supporting this Technical Specification.

MPAI-AIF specifies the following API:

  1. Store API called by a Controller.
  2. Controller API called by a User Agent.
  3. Controller API called by an AIM.
  4. Controller API called by other Controllers.

7.2        Conventions

The API is written in a C-like fashion. However, the specification should be understood as a definition for a general programming language.

Note that namespaces for modules, ports and communication channels (strings belonging to which are indicated in the next sections with names such as module_name, port_name, and channel_name, respectively) are all independent.

7.2.1        API types

We assume that the implementation defines several types, as follows:

message_t   the type of messages being passed through communication ports and channels
parser_t    the type of parsed message datatypes (a.k.a. "the high-level protocol")
error_t     the type of return codes, defined in 7.2.2

The actual types are opaque, and their exact definition is left to the Implementer. The only meaningful way to operate on library types with defined results is by using library functions.

On the other hand, the type of AIM Implementations, module_t, is always defined as:

typedef error_t *(module_t)()

across all implementations, in order to ensure cross-compatibility.

Types such as void, size_t, char, int, float are regular C types.
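As an illustration of these conventions, the sketch below shows an AIM entry point compatible with the module_t function type and a Controller-side call through it. The _sketch names are placeholders, and error_t is shown as an int-based stand-in only so that the sketch is self-contained; real implementations keep the type opaque.

```c
#include <assert.h>

/* Sketch of the opaque-type convention. error_t_sketch stands in for
   the opaque error_t of the library. */

typedef int error_t_sketch;
typedef error_t_sketch *(module_t_sketch)(); /* mirrors: typedef error_t *(module_t)() */

static error_t_sketch last_status = 0;       /* 0 = MPAI_AIF_OK */

/* An AIM entry point compatible with the module_t function type. */
static error_t_sketch *my_aim_entry(void) {
    last_status = 0;                         /* the AIM body would run here */
    return &last_status;
}

/* Controller-side sketch: invoke an AIM through a module_t pointer. */
static error_t_sketch run_module(module_t_sketch *m) {
    return *m();
}
```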

7.2.2        Return codes

Valid return codes are:

Code Numeric value
MPAI_AIM_ALIVE 1
MPAI_AIM_DEAD 2
MPAI_AIF_OK 0

 

Valid error codes are:

Code Semantic value
MPAI_ERROR A generic error code
MPAI_ERROR_MEM_ALLOC Memory allocation error
MPAI_ERROR_MODULE_NOT_FOUND The operation requested of a module cannot be executed since the module has not been found
MPAI_ERROR_INIT The AIW cannot be initialied
MPAI_ERROR_TERM The AIW cannot be properly terminated
MPAI_ERROR_MODULE_CREATION_FAILED A new AIM cannot be created
MPAI_ERROR_PORT_CREATION_FAILED A new AIM Port cannot be created
MPAI_ERROR_CHANNEL_CREATION_FAILED A new Channel between AIMs could not be created.
MPAI_ERROR_WRITE A generic message writing error
MPAI_ERROR_TOO_MANY_PENDING_MESSAGES A message writing operation failed because there are too many pending messages waiting to be delivered
MPAI_ERROR_PORT_NOT_FOUND One or both ports of a connection has (or have) been removed
MPAI_ERROR_READ A generic message reading error
MPAI_ERROR_OP_FAILED The requested operation failed
MPAI_ERROR_EXTERNAL_CHANNEL_CREATION_FAILED A new Channel between Controllers could not be created
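The return codes and error codes can be captured as C enumerations. Only MPAI_AIF_OK, MPAI_AIM_ALIVE and MPAI_AIM_DEAD carry normative numeric values in this section; the negative values assigned to the error codes below are illustrative assumptions of one possible Implementation.

```c
#include <assert.h>

/* Return codes with normative values (table above). */
enum {
    MPAI_AIF_OK    = 0,
    MPAI_AIM_ALIVE = 1,
    MPAI_AIM_DEAD  = 2,
};

/* Error codes; the negative encodings are illustrative assumptions. */
enum {
    MPAI_ERROR                                   = -1,
    MPAI_ERROR_MEM_ALLOC                         = -2,
    MPAI_ERROR_MODULE_NOT_FOUND                  = -3,
    MPAI_ERROR_INIT                              = -4,
    MPAI_ERROR_TERM                              = -5,
    MPAI_ERROR_MODULE_CREATION_FAILED            = -6,
    MPAI_ERROR_PORT_CREATION_FAILED              = -7,
    MPAI_ERROR_CHANNEL_CREATION_FAILED           = -8,
    MPAI_ERROR_WRITE                             = -9,
    MPAI_ERROR_TOO_MANY_PENDING_MESSAGES         = -10,
    MPAI_ERROR_PORT_NOT_FOUND                    = -11,
    MPAI_ERROR_READ                              = -12,
    MPAI_ERROR_OP_FAILED                         = -13,
    MPAI_ERROR_EXTERNAL_CHANNEL_CREATION_FAILED  = -14,
};
```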

7.2.3        High-priority Messages

 

Valid High-priority Message codes are:

Code Numeric value
MPAI_AIM_SIGNAL_START 1
MPAI_AIM_SIGNAL_STOP 2
MPAI_AIM_SIGNAL_RESUME 3
MPAI_AIM_SIGNAL_PAUSE 4

 

8          Basic API

8.1        Store API called by Controller

It is assumed that all communication between the Controller and the Store occurs via the HTTPS protocol. Thus, the APIs reported here refer to the HTTP secure protocol functions (e.g., GET, POST). The Store supports the GIT protocol [1].

The Controller implements the functions related to file retrieval as described in 8.1.1.

8.1.1        Get and parse archive

Get and parse an archive from the Store.

8.1.1.1       MPAI_AIFS_GetAndParseArchive

error_t MPAI_AIFS_GetAndParseArchive(const char* filename)

The default file format is tar.gz. Options are tar.gz, tar.bz2, tbz, tbz2, tb2, bz2, tar, and zip. For example, specifying archive.zip would send an archive in ZIP format [2]. The archive shall include one AIW Metadata file and one or more binary files. The parsing of JSON Metadata and the creation of the corresponding data structure is left to the Implementer.
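A hypothetical helper (not part of the normative API) could check a filename against the archive suffixes listed above before issuing the request; the function name below is an illustrative assumption.

```c
#include <assert.h>
#include <string.h>

/* Returns 1 if filename ends with one of the archive formats listed in
   this section, 0 otherwise. Purely illustrative; not a normative API. */
static int mpai_archive_format_supported(const char *filename)
{
    static const char *suffixes[] = {
        ".tar.gz", ".tar.bz2", ".tbz", ".tbz2", ".tb2",
        ".bz2", ".tar", ".zip",
    };
    size_t n = strlen(filename);
    for (size_t i = 0; i < sizeof suffixes / sizeof suffixes[0]; i++) {
        size_t s = strlen(suffixes[i]);
        if (n >= s && strcmp(filename + n - s, suffixes[i]) == 0)
            return 1;               /* recognised archive suffix */
    }
    return 0;                       /* unsupported format */
}
```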

All archives downloaded from the Store shall not leave the Trusted Zone.

8.2        Controller API called by User Agent

8.2.1        General

This section specifies functions executed by the User Agent when interacting with the Controller. In particular:

  1. Initialise all the Components of the AIF.
  2. Start/Stop/Suspend/Resume AIWs.
  3. Manage Resource Allocation.

8.2.1.1       MPAI_AIFU_Controller_Initialize

error_t MPAI_AIFU_Controller_Initialize()

This function, called by the User Agent, switches on and initialises the Controller, in particular the Communication Component.

8.2.1.2       MPAI_AIFU_Controller_Destroy

error_t MPAI_AIFU_Controller_Destroy()

This function, called by the User Agent, switches off the Controller, after data structures related to running AIWs have been disposed of.

8.2.2        Start/Pause/Resume/Stop Messages to other AIWs

These functions can be used by the User Agent to send messages from the Controller to AIWs.

Errors encountered while transmitting/receiving these Messages are non-recoverable – i.e., they terminate the entire AIW. AIWs can communicate with other AIWs and the Controller uses this API to Start/Pause/Resume/Stop the AIWs.

8.2.2.1       MPAI_AIFU_AIW_Start

error_t MPAI_AIFU_AIW_Start(const char* name, int* AIW_ID)

This function, called by the User Agent, registers with the Controller and starts an instance of the AIW named name. The AIW Metadata for name shall have been previously parsed. The AIW ID is returned in the variable AIW_ID. If the operation succeeds, it has immediate effect.

8.2.2.2       MPAI_AIFU_AIW_Pause

error_t MPAI_AIFU_AIW_Pause(int AIW_ID)

With this function the User Agent asks the Controller to pause the AIW with ID AIW_ID. If the operation succeeds, it has immediate effect.

8.2.2.3       MPAI_AIFU_AIW_Resume

error_t MPAI_AIFU_AIW_Resume(int AIW_ID)

With this function the User Agent asks the Controller to resume the AIW with ID AIW_ID. If the operation succeeds, it has immediate effect.

8.2.2.4       MPAI_AIFU_AIW_Stop

error_t MPAI_AIFU_AIW_Stop(int AIW_ID)

This function, called by the User Agent, deregisters and stops the AIW with ID AIW_ID from the Controller. If the operation succeeds, it has immediate effect.
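The Start/Pause/Resume/Stop sequence can be sketched with stub Controller functions. The bodies below are illustrative placeholders (the real behaviour is Implementation-defined); only the function names and signatures come from this section.

```c
#include <assert.h>

typedef int error_t;               /* as in 7.2.1 */
#define MPAI_AIF_OK 0

static int g_next_id = 1;          /* toy AIW ID allocator */
static int g_running = 0;          /* toy state: one AIW at a time */

error_t MPAI_AIFU_Controller_Initialize(void) { return MPAI_AIF_OK; }

error_t MPAI_AIFU_AIW_Start(const char *name, int *AIW_ID)
{
    (void)name;                    /* Metadata lookup omitted in the stub */
    *AIW_ID = g_next_id++;
    g_running = 1;
    return MPAI_AIF_OK;
}

error_t MPAI_AIFU_AIW_Pause(int AIW_ID)  { (void)AIW_ID; g_running = 0; return MPAI_AIF_OK; }
error_t MPAI_AIFU_AIW_Resume(int AIW_ID) { (void)AIW_ID; g_running = 1; return MPAI_AIF_OK; }
error_t MPAI_AIFU_AIW_Stop(int AIW_ID)   { (void)AIW_ID; g_running = 0; return MPAI_AIF_OK; }

/* Typical User Agent sequence: initialise, start, pause/resume, stop. */
static error_t user_agent_session(int *id)
{
    error_t rc = MPAI_AIFU_Controller_Initialize();
    if (rc != MPAI_AIF_OK) return rc;
    rc = MPAI_AIFU_AIW_Start("MyAIW", id);   /* "MyAIW" is a made-up name */
    if (rc != MPAI_AIF_OK) return rc;
    MPAI_AIFU_AIW_Pause(*id);
    MPAI_AIFU_AIW_Resume(*id);
    return MPAI_AIFU_AIW_Stop(*id);
}
```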

8.2.3        Inquire about state of AIWs and AIMs

8.2.3.1       MPAI_AIFU_AIM_GetStatus

error_t MPAI_AIFU_AIM_GetStatus(int AIW_ID, const char* name, int* status)

With this function the User Agent inquires about the current status of the AIM named name belonging to AIW with ID AIW_ID. The status is returned in status. Admissible values are: MPAI_AIM_ALIVE, MPAI_AIM_DEAD.

8.2.4        Management of Shared and AIM Storage for AIWs

8.2.4.1       MPAI_AIFU_SharedStorage_Init

error_t MPAI_AIFU_SharedStorage_Init(int AIW_ID)

With this function the User Agent initialises the Shared Storage interface for the AIW with ID AIW_ID.

8.2.4.2       MPAI_AIFU_AIMStorage_Init

error_t MPAI_AIFU_AIMStorage_Init(int AIM_ID)

With this function the User Agent initialises the AIM Storage interface for the AIM with ID AIM_ID.

8.2.5        Communication management

Communication takes place with Messages communicated via Events or via Ports and Channels. Their actual implementation and signal type depend on the MPAI-AIF Implementation (and hence on the specific platform, operating system and programming language the Implementation is developed for). Events are defined AIF-wide, while Ports, Channels and Messages are specific to the AIM and thus part of the AIM API.

8.2.5.1       MPAI_AIFU_Communication_Event

error_t MPAI_AIFU_Communication_Event(const char* event)

With this function the User Agent initialises the event handling for Event named event.

8.2.6        Resource allocation management

8.2.6.1       MPAI_AIFU_Resource_GetGlobal

error_t MPAI_AIFU_Resource_GetGlobal(const char* key, const char* min_value, const char* max_value, const char* requested_value)

With this function the User Agent interrogates the resource allocation for one AIF Metadata entry.

8.2.6.2       MPAI_AIFU_Resource_SetGlobal

error_t MPAI_AIFU_Resource_SetGlobal(const char* key, const char* min_value, const char* max_value, const char* requested_value)

With this function the User Agent initialises the resource allocation for one AIF Metadata entry.

8.2.6.3       MPAI_AIFU_Resource_GetAIW

error_t MPAI_AIFU_Resource_GetAIW(int AIW_ID, const char* key, const char* min_value, const char* max_value, const char* requested_value)

With this function the User Agent interrogates the resource allocation for one AIM Metadata entry for the AIW with AIW ID AIW_ID.

8.2.6.4       MPAI_AIFU_Resource_SetAIW

error_t MPAI_AIFU_Resource_SetAIW(int AIW_ID, const char* key, const char* min_value, const char* max_value, const char* requested_value)

With this function the User Agent initialises the resource allocation for one AIM Metadata entry for the AIW with AIW ID AIW_ID.

8.3        Controller API called by AIMs

8.3.1        General

The following APIs have been defined in Version 1.1. They specify how AIWs:

  1. Define the topology and connections of AIMs in the AIW.
  2. Define the Time base.
  3. Define the Resource Policy.

8.3.2        Resource allocation management

8.3.2.1       MPAI_AIFM_Resource_GetGlobal

error_t MPAI_AIFM_Resource_GetGlobal(const char* key, const char* min_value, const char* max_value, const char* requested_value)

With this function the AIM interrogates the resource allocation for one AIF Metadata entry.

8.3.2.2       MPAI_AIFM_Resource_SetGlobal

error_t MPAI_AIFM_Resource_SetGlobal(const char* key, const char* min_value, const char* max_value, const char* requested_value)

With this function the AIM initialises the resource allocation for one AIF Metadata entry.

8.3.2.3       MPAI_AIFM_Resource_GetAIW

error_t MPAI_AIFM_Resource_GetAIW(int AIW_ID, const char* key, const char* min_value, const char* max_value, const char* requested_value)

With this function the AIM interrogates the resource allocation for one AIM Metadata entry for the AIW with AIW ID AIW_ID.

8.3.2.4       MPAI_AIFM_Resource_SetAIW

error_t MPAI_AIFM_Resource_SetAIW(int AIW_ID, const char* key, const char* min_value, const char* max_value, const char* requested_value)

With this function the AIM initialises the resource allocation for one AIM Metadata entry for the AIW with AIW ID AIW_ID.

8.3.3        Register/deregister AIMs with the Controller

8.3.3.1       MPAI_AIFM_AIM_Register_Local

error_t MPAI_AIFM_AIM_Register_Local(const char* name)

With this function the AIM registers the AIM named name with the Controller. The AIM shall be defined in the AIM Metadata. An Implementation that can be run on the Controller shall have been downloaded from the Store together with the Metadata, or be already available in the AIM Storage after a previous download from the Store together with the Metadata.

8.3.3.2       MPAI_AIFM_AIM_Register_Remote

error_t MPAI_AIFM_AIM_Register_Remote(const char* name, const char* uri)

With this function the AIM registers the AIM named name with the Controller. The AIM shall be defined in the AIM Metadata. An implementation that can be run on the Controller shall have been downloaded from the Store together with the Metadata or be available locally. The AIM will be run remotely on the MPAI Server identified by uri.

8.3.3.3       MPAI_AIFM_AIM_Deregister

error_t MPAI_AIFM_AIM_Deregister(const char* name)

The AIW deregisters the AIM named name from the Controller.

8.3.4        Send Start/Pause/Resume/Stop Messages to other AIMs

AIMs can send Messages to the AIMs defined in their Metadata.

Errors encountered while transmitting/receiving these Messages are non-recoverable – i.e., they terminate the entire AIM. AIMs can communicate with other AIMs and the Controller uses this API to Start/Pause/Resume/Stop the AIMs.

8.3.4.1       MPAI_AIFM_AIM_Start

error_t MPAI_AIFM_AIM_Start(const char* name)

With this function the AIM asks the Controller to start the AIM named name. If the operation succeeds, it has immediate effect.

8.3.4.2       MPAI_AIFM_AIM_Pause

error_t MPAI_AIFM_AIM_Pause(const char* name)

With this function the AIM asks the Controller to pause the AIM named name. If the operation succeeds, it has immediate effect.

8.3.4.3       MPAI_AIFM_AIM_Resume

error_t MPAI_AIFM_AIM_Resume(const char* name)

With this function the AIM asks the Controller to resume the AIM named name. If the operation succeeds, it has immediate effect.

8.3.4.4       MPAI_AIFM_AIM_Stop

error_t MPAI_AIFM_AIM_Stop(const char* name)

With this function the AIM asks the Controller to stop the AIM named name. If the operation succeeds, it has immediate effect.

8.3.4.5       MPAI_AIFM_AIM_EventHandler

error_t MPAI_AIFM_AIM_EventHandler(const char* name)

The AIF creates an EventHandler for the AIW with the given name name. If the operation succeeds, it has immediate effect.

8.3.5        Register Connections between AIMs

8.3.5.1       MPAI_AIFM_Channel_Create

error_t MPAI_AIFM_Channel_Create(const char* name, const char* out_AIM_name, const char* out_port_name, const char* in_AIM_name, const char* in_port_name)

With this function the AIM asks the Controller to create a new interconnecting channel between an output port and an input port. AIM and port names are specified with the name used when constructed.

8.3.5.2       MPAI_AIFM_Channel_Destroy

error_t MPAI_AIFM_Channel_Destroy(const char* name)

With this function the AIM asks the Controller to destroy the channel with name name. This API Call closes all Ports related to the Channel.

8.3.6        Using Ports

8.3.6.1       MPAI_AIFM_Port_Output_Read

message_t* MPAI_AIFM_Port_Output_Read(
const char* AIM_name, const char* port_name)

This function reads a message from the Port identified by (AIM_name, port_name). The read is blocking. Hence, in order to avoid deadlocks, the Implementation should first probe the Port with MPAI_AIFM_Port_Probe. It returns a copy of the original Message.

8.3.6.2       MPAI_AIFM_Port_Input_Write

error_t MPAI_AIFM_Port_Input_Write(
const char* AIM_name, const char* port_name, message_t* message)

This function writes a message message to the Port identified by (AIM_name, port_name). The write is blocking. Hence, in order to avoid deadlocks, the Implementation should first probe the Port with MPAI_AIFM_Port_Probe. The Message being transmitted shall remain available until the function returns, or the behaviour will be undefined.

8.3.6.3       MPAI_AIFM_Port_Reset

error_t MPAI_AIFM_Port_Reset(const char* AIM_name, const char* port_name)

This function resets an input or output Port identified by (AIM_name,port_name) by deleting all the pending Messages associated with it.

8.3.6.4       MPAI_AIFM_Port_CountPendingMessages

size_t MPAI_AIFM_Port_CountPendingMessages(
const char* AIM_name, const char* port_name)

This function returns the number of pending messages on an input or output Port identified by (AIM_name, port_name).

8.3.6.5       MPAI_AIFM_Port_Probe

error_t MPAI_AIFM_Port_Probe(const char* port_name, message_t* message)

This function returns MPAI_AIF_OK if either the Port is a FIFO input port and an AIM can write to it, or the Port is a FIFO output Port and data is available to be read from it.

8.3.6.6       MPAI_AIFM_Port_Output_Select

int MPAI_AIFM_Port_Output_Select(
const char* AIM_name_1,const char* port_name_1,…)

Given a list of output Ports, this function returns the index of one Port for which data has become available in the meantime. The call is blocking to address potential race conditions.
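The probe-before-block pattern can be illustrated with a toy FIFO port. The toy_ names, state and bodies below are assumptions for illustration, not part of the normative API.

```c
#include <assert.h>
#include <stddef.h>

typedef int error_t;
#define MPAI_AIF_OK       0
#define MPAI_ERROR_READ  -1      /* assumed numeric value */

typedef struct {
    int pending;                 /* messages waiting on the port */
} toy_port_t;

/* Probe succeeds when an output port has data available to read,
   mirroring the MPAI_AIFM_Port_Probe contract. */
static error_t toy_port_probe(const toy_port_t *p)
{
    return p->pending > 0 ? MPAI_AIF_OK : MPAI_ERROR_READ;
}

/* Select: return the index of the first port with available data,
   or -1 when no port is ready (a real Select would block instead). */
static int toy_port_select(const toy_port_t *ports, size_t n)
{
    for (size_t i = 0; i < n; i++)
        if (ports[i].pending > 0)
            return (int)i;
    return -1;
}
```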

8.3.7        Operations on messages

All Implementations shall provide a common Message-passing functionality, which is abstracted by the following functions.

8.3.7.1       MPAI_AIFM_Message_Copy

message_t* MPAI_AIFM_Message_Copy(message_t* message)

This function makes a copy of a Message structure message.

8.3.7.2       MPAI_AIFM_Message_Delete

message_t* MPAI_AIFM_Message_Delete(message_t* message)

This function deletes a Message message and its allocated memory. The format of each Message passing through a Channel is defined by the Metadata for that Channel.

8.3.7.3       MPAI_AIFM_Message_GetBuffer

void* MPAI_AIFM_Message_GetBuffer(message_t* message)

This function gets access to the low-level memory buffer associated with a message structure message.

8.3.7.4       MPAI_AIFM_Message_GetBufferLength

size_t MPAI_AIFM_Message_GetBufferLength(message_t* message)

This function gets the size in bits of the low-level memory buffer associated with a message structure message.

8.3.7.5       MPAI_AIFM_Message_Parse

parser_t* MPAI_AIFM_Message_Parse (const char* type)

This function creates a parsed representation of the data type defined in type according to the Metadata syntax defined in Subsection 6.1.1 Type system, to facilitate the successive parsing of raw memory buffers associated with message structures (see functions below).

8.3.7.6       MPAI_AIFM_Message_Parse_Get_StructField

void* MPAI_AIFM_Message_Parse_Get_StructField(
parser_t* parser, void* buffer, const char* field_name)

This function assumes that the low-level memory buffer buffer contains data of type struct_type, whose complete parsed type definition (specified according to the Metadata syntax defined in Subsection 6.1.1 Type system) can be found in parser. It fetches the element of the struct_type named field_name and returns it in a freshly allocated low-level memory buffer. If an element with that name does not exist, it returns NULL.

8.3.7.7       MPAI_AIFM_Message_Parse_Get_VariantType

void* MPAI_AIFM_Message_Parse_Get_VariantType(
parser_t* parser, void* buffer, const char* type_name)

This function assumes that the low-level memory buffer buffer contains data of type variant_type, whose complete parsed type definition (specified according to the Metadata syntax defined in Subsection 6.1.1 Type system) can be found in parser. It fetches the member of the variant_type named type_name and returns it in a freshly allocated low-level memory buffer. If a member with that name does not exist, it returns NULL.

8.3.7.8       MPAI_AIFM_Message_Parse_Get_ArrayLength

int MPAI_AIFM_Message_Parse_Get_ArrayLength(parser_t* parser, void* buffer)

This function assumes that the low-level memory buffer buffer contains data of type array_type, whose complete parsed type definition (specified according to the Metadata syntax defined in Subsection 6.1.1 Type system) can be found in parser. It retrieves the length of the array. If the buffer does not contain an array, it returns -1.

8.3.7.9       MPAI_AIFM_Message_Parse_Get_ArrayField

void* MPAI_AIFM_Message_Parse_Get_ArrayField(
parser_t* parser, void* buffer, const int field_num)

This function assumes that the low-level memory buffer buffer contains data of type array_type, whose complete parsed type definition (specified according to the Metadata syntax defined in Subsection 6.1.1 Type system) can be found in parser. It fetches the element of the array_type at index field_num and returns it in a freshly allocated low-level memory buffer. If no such element exists, it returns NULL.

8.3.7.10   MPAI_AIFM_Message_Parse_Delete

void MPAI_AIFM_Message_Parse_Delete(parser_t* parser)

This function deletes the parsed representation of a data type defined by parser, and deallocates all memory associated to it.
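The parsing flow can be illustrated with a toy parser that records field offsets and returns a requested struct field in a freshly allocated buffer, mirroring the MPAI_AIFM_Message_Parse_Get_StructField contract. All names and layouts below are illustrative assumptions.

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Toy parsed type definition: name, offset and size of each field. */
typedef struct {
    const char *name;
    size_t      offset;
    size_t      size;
} toy_field_t;

typedef struct {
    const toy_field_t *fields;
    size_t             n_fields;
} toy_parser_t;

/* Fetch a named field from a packed struct buffer into a freshly
   allocated buffer (caller frees), or NULL if the field is unknown. */
static void *toy_get_struct_field(const toy_parser_t *parser,
                                  const void *buffer,
                                  const char *field_name)
{
    for (size_t i = 0; i < parser->n_fields; i++) {
        const toy_field_t *f = &parser->fields[i];
        if (strcmp(f->name, field_name) == 0) {
            void *out = malloc(f->size);
            if (out)
                memcpy(out, (const char *)buffer + f->offset, f->size);
            return out;
        }
    }
    return NULL;                   /* unknown field */
}
```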

8.3.8        Functions specific to machine learning

The two key functionalities supported by the Framework are reliable update of AIMs with Machine Learning functionality and hooks for Explainability.

8.3.8.1       Support for model update

The following API supports AIM ML model update. Such update occurs via the Store by using the Store specific APIs or via Shared (SharedStorage) or AIM-specific (AIMStorage) storage by using the specified APIs.

error_t MPAI_AIFM_Model_Update(const char* model_name)

The URI model_name points to the updated model. In some cases, the update needs to happen in a highly available way so as not to impact the operation of the system. How this is effected is left to the Implementer.

8.3.8.2       Support for model drift

With this function the Controller detects possible degradation in ML operation caused by the characteristics of input data being significantly different from those used in training.

float MPAI_AIFM_Model_Drift(const char* name)

8.3.9        Controller API called by Controller

This Section specifies functions used by an AIM to communicate through a Remote Port with an AIM running on another Controller. The local and remote AIMs shall belong to the same type of AIW.

8.3.9.1       MPAI_AIFM_External_List

error_t MPAI_AIFM_External_List(int* num_in_range, const char** controllers_metadata)

This function returns the number num_in_range of in-range Controllers with which it is possible to establish communication and running the same type of AIW, and a vector controllers_metadata containing AIW Metadata for each reachable Controller specified according to the JSON format defined in Section 6.3. In case more than one AIW of the same type is running on the same remote Controller, each such AIW is presented as a separate vector element.

8.3.9.2       MPAI_AIFM_External_Output_Read

message_t* MPAI_AIFM_External_Output_Read(int controllerID, const char* AIM_name, const char* port_name)

This function attempts to read a message from the External Port identified by (controllerID, AIM_name, port_name). The read is blocking. Hence, in order to avoid deadlocks, the Implementation should first probe the Port with MPAI_AIFM_Port_Probe. It returns a copy of the original Message. This function attempts to establish a connection between the Controller and the external in-range Controller identified with a previous call to MPAI_AIFM_External_List. The call might fail due to the Controller not being in range anymore or other communication-related issues.

8.3.9.3       MPAI_AIFM_External_Input_Write

error_t MPAI_AIFM_External_Input_Write(int controllerID, const char* AIM_name, const char* port_name, message_t* message)

This function attempts to write a message message to the External Port identified by (controllerID, AIM_name, port_name). The write is blocking. Hence, in order to avoid deadlocks, the Implementation should first probe the Port with MPAI_AIFM_Port_Probe. The Message being transmitted shall remain available until the function returns, or the behaviour will be undefined. This function attempts to establish a connection between the Controller and the external in-range Controller identified with a previous call to MPAI_AIFM_External_List. The call might fail due to the Controller not being in range anymore or other communication-related issues.

 

9          Security API

9.1        Data characterization structure

These APIs are intended to support developers who need a secure environment. They are divided into two parts: the first part includes APIs whose calls are executed in the non-secure area; the second part includes APIs whose calls are executed in the secure area.

 

Data, independently of its usage (as a key, an encrypted payload, plain text, etc.), is passed to/from the APIs through the data_t structure.

 

The data_t structure shall include the following fields:

  • data_location_t location

the identifier of the location of the data (see data_location_t below).

  • void* data

the pointer (within the location specified above) to the start of the data.

  • size_t size

the size of the data (in bytes).

  • data_flags_t flags

other flags characterizing data.

 

The data_location_t type is a uint32_t and can take one of the following symbolic values:

  • DATA_LOC_RAM
  • DATA_LOC_EXT_FLASH
  • DATA_LOC_INT_FLASH
  • DATA_LOC_LOCAL_DISK
  • DATA_LOC_REMOTE_DISK

 

The data_flags_t type is a uint32_t and can take one of the following symbolic values:

  • DATA_FLAG_Encrypted
  • DATA_FLAG_plain
  • DATA_FLAG_UNKNOWN
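Under these definitions, the structure can be written out in C as follows. The numeric encoding of the symbolic values is not fixed by this section and is chosen here only for illustration.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* data_t assembled from the fields listed above; the enum values are
   symbolic and their numeric encoding is an illustrative assumption. */
typedef uint32_t data_location_t;
enum {
    DATA_LOC_RAM,
    DATA_LOC_EXT_FLASH,
    DATA_LOC_INT_FLASH,
    DATA_LOC_LOCAL_DISK,
    DATA_LOC_REMOTE_DISK,
};

typedef uint32_t data_flags_t;
enum {
    DATA_FLAG_Encrypted,
    DATA_FLAG_plain,
    DATA_FLAG_UNKNOWN,
};

typedef struct {
    data_location_t location;   /* where the data lives */
    void           *data;       /* start of the data within that location */
    size_t          size;       /* size of the data, in bytes */
    data_flags_t    flags;      /* other flags characterizing the data */
} data_t;
```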

9.2        API called by User Agent

The User Agent calls the Connect to Controller API:

error_t MPAI_AIFU_Controller_Initialize_Secure(bool useAttestation)

This function, called by the User Agent, switches on and initialises the Controller, in particular the Secure Communication Component.

The secure versions of the AIW control operations are then available to the User Agent:

  • Start AIW
  • Suspend
  • Resume
  • Stop

9.3        API to access Secure Storage

In the following, stringname is a symbolic name of the secure memory area.

9.3.1        User Agent initialises Secure Storage API

Error_t MPAI_AIFSS_Storage_Init(string_t stringname, size_t data_length, const p_data_t data, flags_t flags)

Flags specify the initialisation behaviour.

9.3.2        User Agent writes Secure Storage API

Error_t MPAI_AIFSS_Storage_Write(string_t stringname, size_t data_length, const p_data_t data, flags_t flags)

Flags specify the write behaviour.

9.3.3        User Agent reads Secure Storage API

Error_t MPAI_AIFSS_Storage_Read(string_t stringname, size_t data_length, const p_data_t data, flags_t flags)

Flags specify the read behaviour.

9.3.4        User Agent gets info from Secure Storage API

Error_t MPAI_AIFSS_Storage_Getinfo(string_t stringname, struct storage_info_t * p_info)

9.3.5        User Agent deletes a p_data in Secure Storage API

Error_t MPAI_AIFSS_Storage_Delete(string_t stringname)

We assume that there is a mechanism that takes a stringname of type string and maps it to a numeric uid.
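One possible realisation of such a mechanism, shown purely for illustration, is a 32-bit FNV-1a hash of the stringname; a real Implementation would also need to handle collisions and persistence of the mapping.

```c
#include <assert.h>
#include <stdint.h>

/* Map a symbolic stringname to a numeric uid with 32-bit FNV-1a.
   Illustrative assumption only; not part of the normative API. */
static uint32_t stringname_to_uid(const char *stringname)
{
    uint32_t uid = 2166136261u;            /* FNV offset basis */
    for (const char *p = stringname; *p; p++) {
        uid ^= (uint8_t)*p;
        uid *= 16777619u;                  /* FNV prime */
    }
    return uid;
}
```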

9.4        API to access Attestation

Controller Trusted Service Attestation call (part of the Trusted Services API)

 

Error_t MPAI_AIFAT_Get_Token(uint8_t *token_buf, size_t token_buf_size,size_t *token_size)

The token buffer and token size are managed by the code of the API implementation.

The token format is based on the CBOR [12], COSE [13] and EAT [14] standards.

9.5        API to access cryptographic functions

9.5.1        Hashing

There are many different hashing algorithms in use today, but some of the most common ones include:

  • SHA (Secure Hash Algorithm) [23]: A family of hash functions developed by the US National Security Agency (NSA). The most widely used members of this family are SHA-1 and SHA-256, both of which are commonly used to generate digital signatures and verify data integrity.
  • MD5 (Message-Digest Algorithm 5) [16]: A widely used hash function that produces 128-bit hash values. Although it is widely used, it is not considered secure and has been replaced by more secure hash functions in many applications.

We plan to start supporting these two algorithms and extend support to more algorithms in future versions of the MPAI-AIF standard.

 

Hash_state_t state object type

Implementation dependent

 

Error_t MPAI_AIFCR_Hash(Hash_state_t * state, algorithm_t alg, const uint8_t * hash, size_t * hash_length, size_t hash_size, const uint8_t * input, size_t input_length)

Perform a hash operation on an input data buffer, producing the resulting hash in an output buffer. The hashing engine supports data of arbitrary size by processing it either in one chunk or in multiple chunks. Implementation note: the hashing engine should be efficient and release control to the rest of the system on a regular basis (e.g., at the end of a chunk computation).

 

Error_t MPAI_AIFCR_Hash_verify(Hash_state_t * state, const uint8_t * hash, size_t hash_length, const uint8_t * input, size_t input_length)

Perform a hash verification operation, checking the hash against an input buffer.

 

Error_t MPAI_AIFCR_Hash_abort(Hash_state_t * state)

Abort the hash operation and release internal resources.

9.5.2        Key management

Key management is designed so that:

  • applications access keys indirectly via an identifier;
  • operations using a key are performed without accessing the key material.

 

If a key is provided externally, it shall be mapped to the format below.

 

The key data is organised in a data structure that includes identifiers, the data itself, and the type of data as indicated below. The p_data structure includes information regarding the location where the key is stored.

9.5.2.1       MPAI_AIFKM_attributes_t structure

  • identifier (number)
  • p_data (structure)
  • type:
    • RAW_DATA (none)
    • HMAC (hash)
    • DERIVE
    • PASSWORD (key derivation)
    • AES
    • DES
    • RSA (asymmetric RSA cipher)
    • ECC
    • DH (asymmetric DH key exchange).
  • lifetime
    • persistence level
    • volatile keys → lifetime AIF_KEY_LIFETIME_VOLATILE, stored in RAM
    • persistent keys → lifetime AIF_KEY_LIFETIME_PERSISTENT, stored in primary local storage or primary secure element.
  • policy
    • set of usage flags + permitted algorithm
    • permitted algorithms → restrict to a single algorithm, types: NONE or specific algorithm
    • usage flags → EXPORT, COPY, CACHE, ENCRYPT, DECRYPT, SIGN_MESSAGE, VERIFY_MESSAGE, SIGN_HASH, VERIFY_HASH, DERIVE, VERIFY_DERIVATION
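The attribute list above can be sketched as a C structure. The enum encodings, bit-mask values and field names below are illustrative assumptions; only the attribute names and the two lifetime symbols come from this section.

```c
#include <assert.h>
#include <stdint.h>

typedef enum {                       /* key types listed above */
    KEY_TYPE_RAW_DATA, KEY_TYPE_HMAC, KEY_TYPE_DERIVE, KEY_TYPE_PASSWORD,
    KEY_TYPE_AES, KEY_TYPE_DES, KEY_TYPE_RSA, KEY_TYPE_ECC, KEY_TYPE_DH
} key_type_t;

typedef enum {                       /* persistence level */
    AIF_KEY_LIFETIME_VOLATILE,       /* stored in RAM */
    AIF_KEY_LIFETIME_PERSISTENT      /* local storage or secure element */
} key_lifetime_t;

/* Usage flags combined as a bit mask (subset shown for brevity). */
enum {
    KEY_USAGE_EXPORT         = 1u << 0,
    KEY_USAGE_COPY           = 1u << 1,
    KEY_USAGE_ENCRYPT        = 1u << 2,
    KEY_USAGE_DECRYPT        = 1u << 3,
    KEY_USAGE_SIGN_MESSAGE   = 1u << 4,
    KEY_USAGE_VERIFY_MESSAGE = 1u << 5
};

typedef struct {
    uint32_t        identifier;      /* internally generated on import */
    void           *p_data;          /* key material + storage location */
    key_type_t      type;
    key_lifetime_t  lifetime;
    uint32_t        usage_flags;     /* policy: usage flags */
    uint32_t        permitted_alg;   /* policy: single permitted algorithm */
} MPAI_AIFKM_attributes_t;
```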

 

Error_t MPAI_AIFKM_import_key(const key_attributes_t * attributes,    const uint8_t * data, size_t data_length, key_id_t * key)

When importing a key as a simple binary value, it is the responsibility of the programmer to fill in the attributes data structure. The identifier inside the attributes data structure will be internally generated as a response to the API call.

 

Error_t MPAI_AIFKM_generate_key(const attributes_t * attributes, key_id_t * key)

Generate a key randomly.

 

Error_t MPAI_AIFKM_copy_key(key_id_t source_key, const key_attributes_t * attributes, key_id_t * target_key)

Copy a key.

 

Error_t MPAI_AIFKM_destroy_key(key_id_t key)

Destroy a key.

 

Error_t MPAI_AIFKM_export_key(key_id_t key, uint8_t * data, size_t data_size, size_t * data_length)

Export a key to an output buffer.

 

Error_t MPAI_AIFKM_export_public_key(key_id_t key, uint8_t * data, size_t data_size, size_t * data_length)

Export a public key to an output buffer.

9.5.3        Key exchange

algorithms: FFDH (finite-field Diffie-Hellman) [19], ECDH (elliptic curve Diffie-Hellman) [22]

 

Error_t MPAI_AIFKX_raw_key_agreement(algorithm_t alg, key_id_t private_key, const uint8_t * peer_key, size_t peer_key_length, uint8_t * output, size_t output_size, size_t * output_length)

Return the raw shared secret.

 

Error_t MPAI_AIFKX_key_derivation_key_agreement(key_derivation_operation_t * operation, key_derivation_step_t step, key_id_t private_key, const uint8_t * peer_key, size_t peer_key_length)

Perform the key agreement and use the shared secret as input to a key derivation.

9.5.4        Message Authentication Code

The Message Authentication Code is a cryptographic checksum on data, computed with a session key, whose goal is to detect any modification of the data. It requires the data and a shared session key known to the data originator and the recipients. The cryptographic algorithms of algorithm_t are the same as defined above.

 

mac_state_t

Implementation dependent.

error_t MPAI_AIFMAC_sign_setup(mac_state_t * state, key_id_t key, algorithm_t alg)

Set up the MAC sign operation.

 

error_t MPAI_AIFMAC_verify_setup(mac_state_t * state, key_id_t key, algorithm_t alg)

Set up the MAC verify operation.

 

error_t MPAI_AIFMAC_update(mac_state_t * state, const uint8_t * input, size_t input_length)

Compute the MAC for a chunk of data (this can be repeated several times until the data is complete).

 

error_t MPAI_AIFMAC_mac_sign_finish(mac_state_t * state, uint8_t * mac, size_t mac_size, size_t * mac_length)

Finish the MAC sign operation.

 

error_t MPAI_AIFMAC_mac_verify_finish(mac_state_t * state,  const uint8_t * mac, size_t mac_length)

Finish the MAC verify operation at the receiver side.

 

error_t MPAI_AIFMAC_mac_abort(mac_state_t * state)

Abort the MAC operation.
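The setup, update and finish flow can be illustrated with stub functions. The "_stub" suffix marks them as illustrative placeholders, and the toy XOR checksum stands in for a real MAC algorithm; nothing below is cryptographically meaningful.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

typedef int error_t;
typedef int key_id_t;
typedef int algorithm_t;
typedef struct { uint8_t acc; } mac_state_t;  /* toy state */

/* Setup: mix in the (toy) key. */
static error_t MPAI_AIFMAC_sign_setup_stub(mac_state_t *s, key_id_t key,
                                           algorithm_t alg)
{
    (void)alg;
    s->acc = (uint8_t)key;
    return 0;
}

/* Update: absorb one chunk; may be called repeatedly until all the
   data has been processed. */
static error_t MPAI_AIFMAC_update_stub(mac_state_t *s,
                                       const uint8_t *input, size_t len)
{
    for (size_t i = 0; i < len; i++)
        s->acc ^= input[i];
    return 0;
}

/* Finish: emit the (toy) MAC value. */
static error_t MPAI_AIFMAC_sign_finish_stub(mac_state_t *s, uint8_t *mac)
{
    *mac = s->acc;
    return 0;
}
```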

9.5.5        Cyphers

The specification will support two or three cyphers, selected among the most relevant ones. It is structured to enable the easy addition of cyphers, e.g., through an enumeration whose elements are initially few but can be extended in later versions. Some cyphers may be used by default. Support for multi-sourced AIMs is also a focus.

This specification assumes that, in case a multi-block cipher is used, the developer shall manage the IV parameter by explicitly generating the IV (i.e., not relying on a service doing that for them), securely communicating the IV to the parties receiving the message and, if the IV is not disposed of, storing the IV in the Secure Storage.

 

algorithms: AIF_ALG_XTS [15], AIF_ALG_ECB_NO_PADDING [24], AIF_ALG_CBC_NO_PADDING [24], AIF_ALG_CBC_PKCS7 [24]

 

In the following API calls, the IV parameter and IV size shall be set to NULL if not needed by the specific call. An IV shall be securely generated by the API implementation in case the encryption algorithm needs an IV and NULL is passed to the API.

 

cipher_state_t

state object type (implementation dependent). In future versions the state type may be defined.

Error_t MPAI_AIFCIP_Encrypt(cipher_state_t * state, key_id_t key, algorithm_t alg, uint8_t * iv, size_t iv_size, size_t * iv_length)

Set up symmetric encryption.

 

Error_t MPAI_AIFCIP_Decrypt(cipher_state_t * state, key_id_t key, algorithm_t alg, uint8_t * iv, size_t iv_size, size_t * iv_length)

Set up symmetric decryption.

 

Error_t MPAI_AIFCIP_Abort(cipher_state_t * state)

Abort symmetric encryption/decryption.

9.5.6        Authenticated encryption with associated data (AEAD)

algorithms: ALG_GCM [25], ALG_CHACHA20_POLY1305 [18]

ALG_GCM requires a nonce of at least 1 byte in length.

 

aead_state_t

state object type (implementation dependent). In future versions the state type may be defined.

 

Error_t MPAI_AIFAEAD_Encrypt(aead_state_t * state, key_id_t key, algorithm_t alg, const uint8_t * nonce, size_t nonce_length, const uint8_t * additional_data, size_t additional_data_length, const uint8_t * plaintext, size_t plaintext_length, uint8_t * ciphertext, size_t ciphertext_size, size_t * ciphertext_length)

 

Error_t MPAI_AIFAEAD_Decrypt(aead_state_t * state, key_id_t key, algorithm_t alg, const uint8_t * nonce, size_t nonce_length, const uint8_t * additional_data, size_t additional_data_length, const uint8_t * ciphertext, size_t ciphertext_length, uint8_t * plaintext, size_t plaintext_size, size_t * plaintext_length)

 

Error_t MPAI_AIFAEAD_Abort(aead_state_t * state)

9.5.7        Signature

algorithms: RSA_PKCS1V15_SIGN [20], RSA_PSS [20], ECDSA [17], PURE_EDDSA [21]

 

sign_state_t

state object type (implementation dependent). In future versions the state type may be defined.

 

Error_t MPAI_AIFSIGN_sign_message(sign_state_t * state, key_id_t key, algorithm_t alg, const uint8_t * input, size_t input_length, uint8_t * signature, size_t signature_size, size_t *signature_length)

Sign a message with a private key (for hash-and-sign algorithms, this includes the hashing step).

 

Error_t MPAI_AIFSIGN_verify_message(sign_state_t * state, key_id_t key, algorithm_t alg, const uint8_t * input, size_t input_length, const uint8_t * signature, size_t signature_length)

Verify a signature with a public key (for hash-and-sign algorithms, this includes the hashing step).

 

psa_status_t psa_sign_hash(psa_key_id_t key, psa_algorithm_t alg, const uint8_t * hash, size_t hash_length, uint8_t * signature, size_t signature_size, size_t * signature_length)

sign an already-calculated hash with a private key

 

psa_status_t psa_verify_hash(psa_key_id_t key, psa_algorithm_t alg, const uint8_t * hash, size_t hash_length, const uint8_t * signature, size_t signature_length)

verify the signature of a hash
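The relationship between the message-level and hash-level calls can be illustrated with a small sketch: signing a message is equivalent to signing the hash of that message. Python's standard library has no RSA/ECDSA, so an HMAC stands in for the private-key operation here; `mock_sign_hash` and `mock_sign_message` are hypothetical names, not part of any API.

```python
import hashlib
import hmac

SECRET = b"demo-key"  # stand-in for a private key; HMAC is not a real signature

def mock_sign_hash(digest: bytes) -> bytes:
    # analogous to psa_sign_hash: operates on an already-computed hash
    return hmac.new(SECRET, digest, hashlib.sha256).digest()

def mock_sign_message(message: bytes) -> bytes:
    # analogous to MPAI_AIFSIGN_sign_message: performs the hashing step itself
    return mock_sign_hash(hashlib.sha256(message).digest())

# for hash-and-sign algorithms the two paths agree
assert mock_sign_message(b"hello") == mock_sign_hash(hashlib.sha256(b"hello").digest())
```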

9.5.8        Asymmetric Encryption

algorithms: RSA_PKCS1V15_CRYPT [20], RSA_OAEP [20]

 

psa_status_t psa_asymmetric_encrypt(psa_key_id_t key, psa_algorithm_t alg, const uint8_t * input, size_t input_length, const uint8_t * salt, size_t salt_length, uint8_t * output, size_t output_size, size_t * output_length)

encrypt a short message with a public key

 

psa_status_t psa_asymmetric_decrypt(psa_key_id_t key, psa_algorithm_t alg, const uint8_t * input, size_t input_length, const uint8_t * salt, size_t salt_length, uint8_t * output, size_t output_size, size_t * output_length)

decrypt a short message with a private key

9.6        API to enable secure communication

An implementer should rely on the CoAP and HTTPS support provided by secure transport libraries for the different programming languages.
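For example, with Python's standard library the baseline is a default SSL context, which enables certificate verification and hostname checking out of the box (a sketch of the recommended defaults, not a normative requirement):

```python
import ssl

# create_default_context() enables certificate verification and hostname
# checking by default; an AIF implementation that delegates HTTPS to the
# platform's transport library gets this baseline for free.
ctx = ssl.create_default_context()

assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True
```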

 

10     Profiles

10.1    Basic Profile

The Basic Profile utilises:

  1. Non-Secure Controller.
  2. Non-Secure Storage.
  3. Secure Communication enabled by secure communication libraries.
  4. Basic API.

10.2    Secure Profile

Uses all the technologies in this Technical Specification.

11     Examples (Informative)

11.1    AIF Implementations

This Chapter contains informative examples of high-level descriptions of possible AIF operations. It will continue to be developed in subsequent Versions of this Technical Specification by adding more examples.

11.1.1    Resource-constrained implementation

  1. Controller is a single process that implements the AIW and operates based on interrupt call-backs
  2. AIF is instantiated via a secure communication interface
  3. AIMs can be local or instantiated through a secure communication interface
  4. Controller initialises the AIF
  5. AIF asks the AIMs to be instantiated
  6. Controller manages the Events and Messages
  7. User Agent can act on the AIWs at the request of the user.

11.1.2    Non-resource-constrained implementation

  1. Controller and AIW are two independent processes
  2. Controller manages the Events and Messages
  3. AIW contacts Controller via Communication and authenticates itself
  4. Controller requests AIW configuration metadata
  5. AIW sends Controller the configuration metadata
  6. The implementation of the AIW can be local or can be downloaded from the Store
  7. Controller authenticates itself with the Store and requests implementations for the needed AIMs listed in the metadata from the Store
  8. The Store sends the requested AIM implementations and the configuration metadata
  9. Controller
    1. Instantiates the AIMs specified in the AIW metadata
    2. Manages their communication and resources by sending Messages to AIMs.
  10. User Agent can gain control of AIWs running on the Controller via a specific Controller API, e.g., User Agent can test conformance of an AIW with an MPAI standard through a dedicated API call.
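Steps 7–9 above can be sketched as follows, assuming a hypothetical fetch_from_store() helper standing in for the authenticated Store request; the AIM names come from the EAE example:

```python
import json

# AIW metadata listing the AIMs the Controller must obtain (EAE subset)
aiw_metadata = json.loads(
    '{"SubAIMs": [{"Name": "AnalysisTransform"}, {"Name": "SoundFieldDescription"}]}'
)

def fetch_from_store(aim_name: str) -> dict:
    # hypothetical stand-in for an authenticated request to the MPAI Store
    return {"Name": aim_name, "Implementation": aim_name + ".bin"}

# the Controller instantiates each AIM listed in the AIW metadata
instantiated = [fetch_from_store(a["Name"]) for a in aiw_metadata["SubAIMs"]]
```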

11.2    Examples of types

byte[] bitstream_t

An array of bytes, with variable length.

{int32 frameNumber; int16 x; int16 y; byte[] frame} frame_t

A struct_type with 4 members named frameNumber, x, y, and frame — they are an int32, an int16, an int16, and an array of bytes with variable length, respectively.

{int32 i32 | int64 i64} variant_t

A variant_type that can be either an int32 or an int64.
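The three example types map naturally onto most languages' type systems; a rough Python rendering (the class names mirror the type names above, and the tagged encoding of the variant is one possible choice, not prescribed by the standard):

```python
from dataclasses import dataclass

bitstream_t = bytes   # byte[]: an array of bytes with variable length

@dataclass
class frame_t:        # struct_type with 4 members
    frameNumber: int  # int32
    x: int            # int16
    y: int            # int16
    frame: bytes      # byte[]

@dataclass
class variant_t:      # {int32 i32 | int64 i64}: holds one alternative at a time
    tag: str          # "i32" or "i64"
    value: int
```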

11.3    Examples of Metadata

This section contains the AIF, AIW and AIM Metadata of the Enhanced Audioconference Experience Use Case.

11.3.1    Metadata of Enhanced Audioconference Experience AIF

 

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$id": "https://mpai.community/standards/resources/MPAI-AIF/V1/AIF-metadata.schema.json",
  "title": "MPAI-AIF V1 AIF metadata",
  "ImplementerID": "/* String assigned by IIDRA */",
  "Version": "v0.1",
  "APIProfile": "Main",
  "ResourcePolicies": [
    {"Name": "Memory", "Minimum": "50000", "Maximum": "100000", "Request": "75000"},
    {"Name": "CPUNumber", "Minimum": "1", "Maximum": "2", "Request": "1"},
    {"Name": "CPU:Class", "Minimum": "Low", "Maximum": "High", "Request": "Medium"},
    {"Name": "GPU:CUDA:FrameBuffer", "Minimum": "11GB_GDDR5X", "Maximum": "8GB_GDDR6X", "Request": "11GB_GDDR6"},
    {"Name": "GPU:CUDA:MemorySpeed", "Minimum": "1.60GHz", "Maximum": "1.77GHz", "Request": "1.71GHz"},
    {"Name": "GPU:CUDA:Class", "Minimum": "SM61", "Maximum": "SM86", "Request": "SM75"},
    {"Name": "GPU:Number", "Minimum": "1", "Maximum": "1", "Request": "1"}
  ],
  "Authentication": "admin",
  "TimeBase": "NTP"
}
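A Controller could sanity-check the numeric ResourcePolicies above (Minimum ≤ Request ≤ Maximum). This checker is a sketch, not part of the standard, and skips non-numeric classes such as CPU:Class:

```python
def policy_ok(policy: dict) -> bool:
    """Check Minimum <= Request <= Maximum for all-numeric policy values."""
    values = (policy["Minimum"], policy["Request"], policy["Maximum"])
    if not all(v.isdigit() for v in values):
        return True  # classes like "Low"/"SM75" need their own ordering rules
    lo, req, hi = (int(v) for v in values)
    return lo <= req <= hi

policies = [
    {"Name": "Memory", "Minimum": "50000", "Maximum": "100000", "Request": "75000"},
    {"Name": "CPUNumber", "Minimum": "1", "Maximum": "2", "Request": "1"},
]
assert all(policy_ok(p) for p in policies)
```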

11.3.2    Metadata of Enhanced Audioconference Experience AIW

 

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$id": "https://mpai.community/standards/resources/MPAI-AIF/V1/AIW-AIM-metadata.schema.json",
  "title": "EAE AIF v1 AIW/AIM metadata",
  "Identifier": {
    "ImplementerID": "/* String assigned by IIDRA */",
    "Specification": {"Standard": "MPAI-CAE", "AIW": "CAE-EAE", "AIM": "CAE-EAE", "Version": "1"}
  },
  "APIProfile": "Main",
  "Description": "This AIF is used to call the AIW of EAE",
  "Types": [
    {"Name": "Audio_t", "Type": "uint16[]"},
    {"Name": "Array_Audio_t", "Type": "Audio_t[]"},
    {"Name": "TransformArray_Audio_t", "Type": "Array_Audio_t[]"},
    {"Name": "Text_t", "Type": "uint8[]"}
  ],
  "Ports": [
    {"Name": "MicrophoneArrayAudio", "Direction": "InputOutput", "RecordType": "Array_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "TransformMultichannelAudio", "Direction": "OutputInput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "TransformMultichannelAudio", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "MicrophoneArrayGeometry", "Direction": "InputOutput", "RecordType": "Text_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "SphericalHarmonicsDecomposition", "Direction": "OutputInput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "SphericalHarmonicsDecomposition", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "TransformSpeech", "Direction": "OutputInput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "AudioSceneGeometry", "Direction": "OutputInput", "RecordType": "Text_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "SphericalHarmonicsDecomposition", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "TransformSpeech", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "AudioSceneGeometry", "Direction": "InputOutput", "RecordType": "Text_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "DenoisedTransformSpeech", "Direction": "OutputInput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "DenoisedTransformSpeech", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "DenoisedSpeech", "Direction": "OutputInput", "RecordType": "Array_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false}
  ],
  "SubAIMs": [
    {"Name": "AnalysisTransform", "Identifier": {"ImplementerID": "/* String assigned by IIDRA */", "Specification": {"Standard": "MPAI-CAE", "AIW": "CAE-EAE", "AIM": "AnalysisTransform", "Version": "1"}}},
    {"Name": "SoundFieldDescription", "Identifier": {"ImplementerID": "/* String assigned by IIDRA */", "Specification": {"Standard": "MPAI-CAE", "AIW": "CAE-EAE", "AIM": "SoundFieldDescription", "Version": "1"}}},
    {"Name": "SpeechDetectionandSeparation", "Identifier": {"ImplementerID": "/* String assigned by IIDRA */", "Specification": {"Standard": "MPAI-CAE", "AIW": "CAE-EAE", "AIM": "SpeechDetectionandSeparation", "Version": "1"}}},
    {"Name": "NoiseCancellation", "Identifier": {"ImplementerID": "/* String assigned by IIDRA */", "Specification": {"Standard": "MPAI-CAE", "AIW": "CAE-EAE", "AIM": "NoiseCancellation", "Version": "1"}}},
    {"Name": "SynthesisTransform", "Identifier": {"ImplementerID": "/* String assigned by IIDRA */", "Specification": {"Standard": "MPAI-CAE", "AIW": "CAE-EAE", "AIM": "SynthesisTransform", "Version": "1"}}},
    {"Name": "Packager", "Identifier": {"ImplementerID": "/* String assigned by IIDRA */", "Specification": {"Standard": "MPAI-CAE", "AIW": "CAE-EAE", "AIM": "Packager", "Version": "1"}}}
  ],
  "Topology": [
    {"Output": {"AIMName": "", "PortName": "MicrophoneArrayAudio"}, "Input": {"AIMName": "AnalysisTransform", "PortName": "MicrophoneArrayAudio"}},
    {"Output": {"AIMName": "", "PortName": "MicrophoneArrayGeometry_1"}, "Input": {"AIMName": "SoundFieldDescription", "PortName": "MicrophoneArrayGeometry_1"}},
    {"Output": {"AIMName": "", "PortName": "MicrophoneArrayGeometry_2"}, "Input": {"AIMName": "Packager", "PortName": "MicrophoneArrayGeometry_2"}},
    {"Output": {"AIMName": "AnalysisTransform", "PortName": "TransformMultiChannelAudio"}, "Input": {"AIMName": "SoundFieldDescription", "PortName": "TransformMultiChannelAudio"}},
    {"Output": {"AIMName": "SoundFieldDescription", "PortName": "SphericalHarmonicsDecomposition_1"}, "Input": {"AIMName": "SpeechDetectionandSeparation", "PortName": "SphericalHarmonicsDecomposition_1"}},
    {"Output": {"AIMName": "SoundFieldDescription", "PortName": "SphericalHarmonicsDecomposition_2"}, "Input": {"AIMName": "SpeechDetectionandSeparation", "PortName": "SphericalHarmonicsDecomposition_2"}},
    {"Output": {"AIMName": "SpeechDetectionandSeparation", "PortName": "TransformSpeech"}, "Input": {"AIMName": "NoiseCancellation", "PortName": "TransformSpeech"}},
    {"Output": {"AIMName": "SpeechDetectionandSeparation", "PortName": "AudioSceneGeometry_1"}, "Input": {"AIMName": "NoiseCancellation", "PortName": "AudioSceneGeometry_1"}},
    {"Output": {"AIMName": "SpeechDetectionandSeparation", "PortName": "AudioSceneGeometry_2"}, "Input": {"AIMName": "Packager", "PortName": "AudioSceneGeometry_2"}},
    {"Output": {"AIMName": "NoiseCancellation", "PortName": "DenoisedTransformSpeech"}, "Input": {"AIMName": "SynthesisTransform", "PortName": "DenoisedTransformSpeech"}},
    {"Output": {"AIMName": "SynthesisTransform", "PortName": "DenoisedSpeech"}, "Input": {"AIMName": "Packager", "PortName": "DenoisedSpeech"}}
  ],
  "Implementations": [
    {"BinaryName": "eae.exe", "Architecture": "x64", "OperatingSystem": "Windows", "Version": "v0.1", "Source": "AIMStorage", "Destination": ""}
  ],
  "ResourcePolicies": [
    {"Name": "Memory", "Minimum": "50000", "Maximum": "100000", "Request": "75000"},
    {"Name": "CPUNumber", "Minimum": "1", "Maximum": "2", "Request": "1"},
    {"Name": "CPU:Class", "Minimum": "Low", "Maximum": "High", "Request": "Medium"},
    {"Name": "GPU:CUDA:FrameBuffer", "Minimum": "11GB_GDDR5X", "Maximum": "8GB_GDDR6X", "Request": "11GB_GDDR6"},
    {"Name": "GPU:CUDA:MemorySpeed", "Minimum": "1.60GHz", "Maximum": "1.77GHz", "Request": "1.71GHz"},
    {"Name": "GPU:CUDA:Class", "Minimum": "SM61", "Maximum": "SM86", "Request": "SM75"},
    {"Name": "GPU:Number", "Minimum": "1", "Maximum": "1", "Request": "1"}
  ],
  "Documentation": [
    {"Type": "Tutorial", "URI": "https://mpai.community/standards/mpai-cae/"}
  ]
}
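The Topology array above is effectively an edge list: each entry wires one AIM's output port to another AIM's input port, with an empty AIMName denoting the AIW boundary. A sketch over a two-entry subset of the metadata:

```python
# subset of the EAE Topology; "" as AIMName marks the AIW boundary
topology = [
    {"Output": {"AIMName": "", "PortName": "MicrophoneArrayAudio"},
     "Input": {"AIMName": "AnalysisTransform", "PortName": "MicrophoneArrayAudio"}},
    {"Output": {"AIMName": "AnalysisTransform", "PortName": "TransformMultiChannelAudio"},
     "Input": {"AIMName": "SoundFieldDescription", "PortName": "TransformMultiChannelAudio"}},
]

# derive producer -> consumer edges and check port names match end to end
edges = [(c["Output"]["AIMName"], c["Input"]["AIMName"]) for c in topology]
assert ("AnalysisTransform", "SoundFieldDescription") in edges
assert all(c["Output"]["PortName"] == c["Input"]["PortName"] for c in topology)
```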

11.3.3    Metadata of CAE-EAE Analysis Transform AIM

 

{
  "Identifier": {
    "ImplementerID": "/* String assigned by IIDRA */",
    "Specification": {"Name": "CAE", "AIW": "EAE", "AIM": "AnalysisTransform", "Version": "1"}
  },
  "Description": "This AIM implements analysis transform function for CAE-EAE that converts microphone array audio into transform multichannel audio.",
  "Types": [
    {"Name": "Audio_t", "Type": "uint16[]"},
    {"Name": "Array_Audio_t", "Type": "Audio_t[]"},
    {"Name": "Transform_Array_Audio_t", "Type": "Array_Audio_t[]"}
  ],
  "Ports": [
    {"Name": "MicrophoneArrayAudio", "Direction": "InputOutput", "RecordType": "Array_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
    {"Name": "TransformMultichannelAudio", "Direction": "OutputInput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false}
  ],
  "SubAIMs": [],
  "Topology": [],
  "Implementations": [],
  "Documentation": [
    {"Type": "Tutorial", "URI": "https://mpai.community/standards/mpai-cae/"}
  ]
}


 

11.3.6    Metadata of CAE-EAE Sound Field Description AIM

 

{
  "AIM": {
    "ImplementerID": "/* String assigned by IIDRA */",
    "Standard": {"Name": "CAE", "AIW": "EAE", "AIM": "SoundFieldDescription", "Version": "1"},
    "Description": "This AIM implements sound field description function for CAE-EAE that converts transform multichannel audio into spherical harmonics decomposition.",
    "Types": [
      {"Name": "Text_t", "Type": "uint8[]"},
      {"Name": "Audio_t", "Type": "uint16[]"},
      {"Name": "Array_Audio_t", "Type": "Audio_t[]"},
      {"Name": "Transform_Array_Audio_t", "Type": "Array_Audio_t[]"}
    ],
    "Ports": [
      {"Name": "TransformMultichannelAudio", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "MicrophoneArrayGeometry", "Direction": "InputOutput", "RecordType": "Text_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "SphericalHarmonicsDecomposition", "Direction": "OutputInput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false}
    ],
    "SubAIMs": [],
    "Topology": [],
    "Documentation": [
      {"Type": "tutorial", "URI": "https://mpai.community/standards/mpai-cae/"}
    ]
  }
}

 

11.3.7    Metadata of CAE-EAE Speech Detection and Separation AIM

 

{
  "AIM": {
    "ImplementerID": "/* String assigned by IIDRA */",
    "Standard": {"Name": "CAE", "AIW": "EAE", "AIM": "SpeechDetectionandSeparation", "Version": "1"},
    "Description": "This AIM implements speech detection and separation function for CAE-EAE that converts spherical harmonics coefficients into transform speech and Audio Scene Geometry.",
    "Types": [
      {"Name": "Text_t", "Type": "uint8[]"},
      {"Name": "Audio_t", "Type": "uint16[]"},
      {"Name": "Array_Audio_t", "Type": "Audio_t[]"},
      {"Name": "Transform_Array_Audio_t", "Type": "Array_Audio_t[]"}
    ],
    "Ports": [
      {"Name": "SphericalHarmonicsDecomposition", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "TransformSpeech", "Direction": "OutputInput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "AudioSceneGeometry", "Direction": "OutputInput", "RecordType": "Text_t", "Technology": "Software", "Protocol": "", "IsRemote": false}
    ],
    "AIMs": [],
    "Topology": [],
    "Documentation": [
      {"Type": "tutorial", "URI": "https://mpai.community/standards/mpai-cae/"}
    ]
  }
}

 

11.3.8    Metadata of CAE-EAE Noise Cancellation AIM

{
  "AIM": {
    "ImplementerID": "/* String assigned by IIDRA */",
    "Standard": {"Name": "CAE", "AIW": "EAE", "AIM": "NoiseCancellation", "Version": "1"},
    "Description": "This AIM implements noise cancellation function for CAE-EAE that converts transform speech into denoised transform speech.",
    "Types": [
      {"Name": "Text_t", "Type": "uint8[]"},
      {"Name": "Audio_t", "Type": "uint16[]"},
      {"Name": "Array_Audio_t", "Type": "Audio_t[]"},
      {"Name": "Transform_Array_Audio_t", "Type": "Array_Audio_t[]"}
    ],
    "Ports": [
      {"Name": "SphericalHarmonicsDecomposition", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "TransformSpeech", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "AudioSceneGeometry", "Direction": "InputOutput", "RecordType": "Text_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "DenoisedTransformSpeech", "Direction": "OutputInput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false}
    ],
    "AIMs": [],
    "Topology": [],
    "Documentation": [
      {"Type": "tutorial", "URI": "https://mpai.community/standards/mpai-cae/"}
    ]
  }
}

 

11.3.9    Metadata of CAE-EAE Synthesis Transform AIM

 

{
  "AIM": {
    "ImplementerID": "/* String assigned by IIDRA */",
    "Standard": {"Name": "CAE", "AIW": "EAE", "AIM": "SynthesisTransform", "Version": "1"},
    "Description": "This AIM implements synthesis transform function for CAE-EAE that converts denoised transform speech into denoised speech.",
    "Types": [
      {"Name": "Audio_t", "Type": "uint16[]"},
      {"Name": "Array_Audio_t", "Type": "Audio_t[]"},
      {"Name": "Transform_Array_Audio_t", "Type": "Array_Audio_t[]"}
    ],
    "Ports": [
      {"Name": "DenoisedTransformSpeech", "Direction": "InputOutput", "RecordType": "TransformArray_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "DenoisedSpeech", "Direction": "OutputInput", "RecordType": "Array_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false}
    ],
    "AIMs": [],
    "Topology": [],
    "Documentation": [
      {"Type": "tutorial", "URI": "https://mpai.community/standards/mpai-cae/"}
    ]
  }
}

 

11.3.10    Metadata of CAE-EAE Packager AIM

 

{
  "AIM": {
    "ImplementerID": "/* String assigned by IIDRA */",
    "Standard": {"Name": "CAE", "AIW": "EAE", "AIM": "Packager", "Version": "1"},
    "Description": "This AIM implements packager function for CAE-EAE that converts denoised speech into Multichannel Audio + Audio Scene Geometry.",
    "Types": [
      {"Name": "Text_t", "Type": "uint8[]"},
      {"Name": "Audio_t", "Type": "uint16[]"},
      {"Name": "Array_Audio_t", "Type": "Audio_t[]"}
    ],
    "Ports": [
      {"Name": "DenoisedSpeech", "Direction": "InputOutput", "RecordType": "Array_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "AudioSceneGeometry", "Direction": "InputOutput", "RecordType": "Text_t", "Technology": "Software", "Protocol": "", "IsRemote": false},
      {"Name": "MultichannelAudioandAudioSceneGeometry", "Direction": "OutputInput", "RecordType": "Array_Audio_t", "Technology": "Software", "Protocol": "", "IsRemote": false}
    ],
    "AIMs": [],
    "Topology": [],
    "Documentation": [
      {"Type": "tutorial", "URI": "https://mpai.community/standards/mpai-cae/"}
    ]
  }
}

Annex 1    MPAI-wide terms and definitions

The Terms used in this standard whose first letter is capitalised and that are not already included in Table 1 are defined in Table 2.

 

Table 2 – MPAI-wide Terms

 

Term Definition
Access Static or slowly changing data that are required by an application such as domain knowledge data, data models, etc.
AI Framework (AIF) The environment where AIWs are executed.
AI Module (AIM) A data processing element receiving AIM-specific Inputs and producing AIM-specific Outputs according to its Function. An AIM may be an aggregation of AIMs.
AI Workflow (AIW) A structured aggregation of AIMs implementing a Use Case receiving AIW-specific inputs and producing AIW-specific outputs according to the AIW Function.
Application Standard An MPAI Standard designed to enable a particular application domain.
Channel A connection between an output port of an AIM and an input port of an AIM. The term “connection” is also used as a synonym.
Communication The infrastructure that implements message passing between AIMs
Component One of the 7 AIF elements: Access, Communication, Controller, Internal Storage, Global Storage, Store, and User Agent
Conformance The attribute of an Implementation of being a correct technical Implementation of a Technical Specification.
Conformance Tester An entity Testing the Conformance of an Implementation.
Conformance Testing The normative document specifying the Means to Test the Conformance of an Implementation.
Conformance Testing Means Procedures, tools, data sets and/or data set characteristics to Test the Conformance of an Implementation.
Connection A channel connecting an output port of an AIM and an input port of an AIM.
Controller A Component that manages and controls the AIMs in the AIF, so that they execute in the correct order and at the time when they are needed
Data Format The standard digital representation of data.
Data Semantics The meaning of data.
Ecosystem The ensemble of actors making it possible for a User to execute an application composed of an AIF, one or more AIWs, each with one or more AIMs potentially sourced from independent implementers.
Explainability The ability to trace the output of an Implementation back to the inputs that have produced it.
Fairness The attribute of an Implementation whose extent of applicability can be assessed by making the training set and/or network open to testing for bias and unanticipated results.
Function The operations effected by an AIW or an AIM on input data.
Global Storage A Component to store data shared by AIMs.
Internal Storage A Component to store data of the individual AIMs.
Identifier A name that uniquely identifies an Implementation.
Implementation 1.      An embodiment of the MPAI-AIF Technical Specification, or

2.      An AIW or AIM of a particular Level (1-2-3) conforming with a Use Case of an MPAI Application Standard.

Implementer A legal entity implementing MPAI Technical Specifications.
ImplementerID (IID) A unique name assigned by the ImplementerID Registration Authority to an Implementer.
ImplementerID Registration Authority (IIDRA) The entity appointed by MPAI to assign ImplementerIDs to Implementers.
Interoperability The ability to functionally replace an AIM with another AIM having the same Interoperability Level.
Interoperability Level The attribute of an AIW and its AIMs to be executable in an AIF Implementation and to:

1.      Be proprietary (Level 1)

2.      Pass the Conformance Testing (Level 2) of an Application Standard

3.      Pass the Performance Testing (Level 3) of an Application Standard.

Knowledge Base Structured and/or unstructured information made accessible to AIMs via MPAI-specified interfaces
Message A sequence of Records transported by Communication through Channels.
Normativity The set of attributes of a technology or a set of technologies specified by the applicable parts of an MPAI standard.
Performance The attribute of an Implementation of being Reliable, Robust, Fair and Replicable.
Performance Assessment The normative document specifying the Means to Assess the Grade of Performance of an Implementation.
Performance Assessment Means Procedures, tools, data sets and/or data set characteristics to Assess the Performance of an Implementation.
Performance Assessor An entity Assessing the Performance of an Implementation.
Profile A particular subset of the technologies used in MPAI-AIF or an AIW of an Application Standard and, where applicable, the classes, other subsets, options and parameters relevant to that subset.
Record A data structure with a specified structure.
Reference Model The AIMs and their Connections in an AIW.
Reference Software A technically correct software implementation of a Technical Specific­ation containing source code, or source and compiled code.
Reliability The attribute of an Implementation that performs as specified by the Application Standard, profile and version the Implementation refers to, e.g., within the application scope, stated limitations, and for the period of time specified by the Implementer.
Replicability The attribute of an Implementation whose Performance, as Assessed by a Performance Assessor, can be replicated, within an agreed level, by another Performance Assessor.
Robustness The attribute of an Implementation that copes with data outside of the stated application scope with an estimated degree of confidence.
Scope The domain of applicability of an MPAI Application Standard.
Service Provider An entrepreneur who offers an Implementation as a service (e.g., a recommendation service) to Users.
Standard The ensemble of Technical Specification, Reference Software, Conformance Testing and Performance Assessment of an MPAI Application Standard.
Technical Specification (Framework) the normative specification of the AIF.

(Application) the normative specification of the set of AIWs belonging to an application domain along with the AIMs required to Implement the AIWs that includes:

1.      The formats of the Input/Output data of the AIWs implementing the Use Cases.

2.      The Connections of the AIMs of the AIW.

3.      The formats of the Input/Output data of the AIMs belonging to the AIW.

Testing Laboratory A laboratory accredited to Assess the Grade of Performance of Implementations.
Time Base The protocol specifying how Components can access timing information.
Topology The set of AIM Connections of an AIW.
Use Case A particular instance of the Application domain target of an Application Standard.
User A user of an Implementation.
User Agent The Component interfacing the user with an AIF through the Controller.
Version A revision or extension of a Standard or of one of its elements.
Zero Trust A model of cybersecurity primarily focused on data and service protection that assumes no implicit trust.

 
  • Annex 2 Notices and Disclaimers Concerning MPAI Standards (Informative)

The notices and legal disclaimers given below shall be borne in mind when downloading and using approved MPAI Standards.

 

In the following, “Standard” means the collection of four MPAI-approved and published documents: “Technical Specification”, “Reference Software”, “Conformance Testing” and, where applicable, “Performance Testing”.

 

Life cycle of MPAI Standards

MPAI Standards are developed in accordance with the MPAI Statutes. An MPAI Standard may only be developed when a Framework Licence has been adopted. MPAI Standards are developed by especially established MPAI Development Committees who operate on the basis of consensus, as specified in Annex 1 of the MPAI Statutes. While the MPAI General Assembly and the Board of Directors administer the process of the said Annex 1, MPAI does not independently evaluate, test, or verify the accuracy of any of the information or the suitability of any of the technology choices made in its Standards.

 

MPAI Standards may be modified at any time by corrigenda or new editions. A new edition, however, may not necessarily replace an existing MPAI standard. Visit the web page to determine the status of any given published MPAI Standard.

 

Comments on MPAI Standards are welcome from any interested parties, whether MPAI members or not. Comments shall mandatorily include the name and the version of the MPAI Standard and, if applicable, the specific page or line the comment applies to. Comments should be sent to the MPAI Secretariat. Comments will be reviewed by the appropriate committee for their technical relevance. However, MPAI does not provide interpretation, consulting information, or advice on MPAI Standards. Interested parties are invited to join MPAI so that they can attend the relevant Development Committees.

 

Coverage and Applicability of MPAI Standards

MPAI makes no warranties or representations of any kind concerning its Standards, and expressly disclaims all warranties, expressed or implied, concerning any of its Standards, including but not limited to the warranties of merchantability, fitness for a particular purpose, non-infringement etc. MPAI Standards are supplied “AS IS”.

 

The existence of an MPAI Standard does not imply that there are no other ways to produce and distribute products and services in the scope of the Standard. Technical progress may render the technologies included in the MPAI Standard obsolete by the time the Standard is used, especially in a field as dynamic as AI. Therefore, those looking for standards in the Data Compression by Artificial Intelligence area should carefully assess the suitability of MPAI Standards for their needs.

 

IN NO EVENT SHALL MPAI BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO: THE NEED TO PROCURE SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE PUBLICATION, USE OF, OR RELIANCE UPON ANY STANDARD, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE AND REGARDLESS OF WHETHER SUCH DAMAGE WAS FORESEEABLE.

 

MPAI alerts users that practicing its Standards may infringe patents and other rights of third parties. Submitters of technologies to this standard have agreed to licence their Intellectual Property according to their respective Framework Licences.

 

Users of MPAI Standards should consider all applicable laws and regulations when using an MPAI Standard. The validity of Conformance Testing is strictly technical and refers to the correct implementation of the MPAI Standard. Moreover, positive Performance Assessment of an implementation applies exclusively in the context of the MPAI Governance and does not imply compliance with any regulatory requirements in the context of any jurisdiction. Therefore, it is the responsibility of the MPAI Standard implementer to observe or refer to the applicable regulatory requirements. By publishing an MPAI Standard, MPAI does not intend to promote actions that are not in compliance with applicable laws, and the Standard shall not be construed as doing so. In particular, users should evaluate MPAI Standards from the viewpoint of data privacy and data ownership in the context of their jurisdictions.

 

Implementers and users of MPAI Standards documents are responsible for determining and complying with all appropriate safety, security, environmental and health and all applicable laws and regulations.

 

Copyright

MPAI draft and approved standards, whether they are in the form of documents or as web pages or otherwise, are copyrighted by MPAI under Swiss and international copyright laws. MPAI Standards are made available and may be used for a wide variety of public and private uses, e.g., implementation, use and reference, in laws and regulations and standardisation. By making these documents available for these and other uses, however, MPAI does not waive any rights in copyright to its Standards. For inquiries regarding the copyright of MPAI standards, please contact the MPAI Secretariat.

 

The Reference Software of an MPAI Standard is released with the MPAI Modified Berkeley Software Distribution licence. However, implementers should be aware that the Reference Software of an MPAI Standard may reference some third-party software that may have a different licence.

 

 

  • Annex 3 The Governance of the MPAI Ecosystem (Informative)

Level 1 Interoperability

With reference to Figure 1, MPAI issues and maintains a Technical Specification – called MPAI-AIF – whose components are:

  1. An environment called AI Framework (AIF) running AI Workflows (AIW) composed of interconnected AI Modules (AIM) exposing standard interfaces.
  2. A distribution system of AIW and AIM Implementations called MPAI Store from which an AIF Implementation can download AIWs and AIMs.

A Level 1 Implementation shall be an Implementation of the MPAI-AIF Technical Specification executing AIWs composed of AIMs able to call the MPAI-AIF APIs.

 

Implementers’ benefits: Upload to the MPAI Store and have globally distributed Implementations of:

–          AIFs conforming to MPAI-AIF.

–          AIWs and AIMs performing proprietary functions executable in AIF.

Users’ benefits: Rely on Implementations that have been tested for security.

MPAI Store’s role:

–          Tests the Conformance of Implementations to MPAI-AIF.

–          Verifies Implementations’ security, e.g., absence of malware.

–          Indicates unambiguously that Implementations are Level 1.

Level 2 Interoperability

In a Level 2 Implementation, the AIW shall be an Implementation of an MPAI Use Case and the AIMs shall conform with an MPAI Application Standard.

 

Implementers’ benefits: Upload to the MPAI Store and have globally distributed Implementations of:

–          AIFs conforming to MPAI-AIF.

–          AIWs and AIMs conforming to MPAI Application Standards.

Users’ benefits:

–          Rely on Implementations of AIWs and AIMs whose Functions have been reviewed during standardisation.

–          Have a degree of Explainability of the AIW operation because the AIM Functions and the data Formats are known.

Market’s benefits:

–          Open AIW and AIM markets foster competition leading to better products.

–          Competition of AIW and AIM Implementations fosters AI innovation.

MPAI Store’s role:

–          Tests Conformance of Implementations with the relevant MPAI Standard.

–          Verifies Implementations’ security.

–          Indicates unambiguously that Implementations are Level 2.

 

Level 3 Interoperability

MPAI does not generally set standards on how and with what data an AIM should be trained. This is an important differentiator that promotes competition leading to better solutions. However, the performance of an AIM is typically higher if the data used for training are in greater quantity and more in tune with the scope. Training data that have large variety and cover the spectrum of all cases of interest in breadth and depth typically lead to Implementations of higher “quality”.

For Level 3, MPAI normatively specifies the process, the tools and the data or the characteristics of the data to be used to Assess the Grade of Performance of an AIM or an AIW.

 

Implementers’ benefits: May claim their Implementations have passed Performance Assessment.

Users’ benefits: Get assurance that the Implementation being used performs correctly, e.g., it has been properly trained.

Market’s benefits: Implementations’ Performance Grades stimulate the development of more performing AIM and AIW Implementations.

MPAI Store’s role:

–          Verifies the Implementations’ security.

–          Indicates unambiguously that Implementations are Level 3.

 

The MPAI ecosystem

The following Figure 4 is a high-level description of the MPAI ecosystem operation applicable to fully conforming MPAI implementations as specified in the Governance of the MPAI Ecosystem Specification [26]:

  1. MPAI establishes and controls the not-for-profit MPAI Store.
  2. MPAI appoints Performance Assessors.
  3. MPAI publishes Standards.
  4. Implementers submit Implementations to Performance Assessors.
  5. If the Implementation Performance is acceptable, Performance Assessors inform Implementers and MPAI Store.
  6. Implementers submit Implementations to the MPAI Store.
  7. MPAI Store verifies security and Tests Conformance of Implementation.
  8. Users download Implementations and report their experience to MPAI.

 

Figure 4 – The MPAI ecosystem operation


  • Annex 4 Applications (Informative)

 

When different Controllers running on separate computing platforms (Swarm Elements) interact with one another, they cooperate by requesting one or more Controllers in range to open Remote Ports. The Controllers on which the Remote Ports are opened can then react to information sent by other Controllers in range through the Remote Ports and implement a collective behaviour of choice. For instance: there is a main Controller and the other Controllers in the swarm react to the information it sends; or there is no main Controller and all Controllers in the swarm behave according to a collective logic specified in the programming of all Controllers.
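The two collective-behaviour patterns above can be sketched as follows; `Controller`, `open_remote_port` and the port name are illustrative only, not MPAI-AIF API names.

```python
# Minimal sketch of Swarm Elements cooperating via Remote Ports.
# All names here are hypothetical, not part of the MPAI-AIF APIs.

class Controller:
    def __init__(self, name):
        self.name = name
        self.remote_ports = {}          # port name -> messages received

    def open_remote_port(self, port):
        # A peer Controller in range asks this Controller to open a Remote Port.
        self.remote_ports.setdefault(port, [])

    def send(self, peer, port, message):
        # React-able information sent to a Remote Port opened on the peer.
        peer.remote_ports[port].append((self.name, message))


# Main-Controller pattern: one Controller drives the swarm's behaviour.
main = Controller("main")
swarm = [Controller(f"element-{i}") for i in range(3)]
for element in swarm:
    element.open_remote_port("swarm-ctl")     # opened at main's request
    main.send(element, "swarm-ctl", "move-to-waypoint-1")
```

In the no-main-Controller variant, every Controller would both open a Remote Port and send to the others, with the collective logic programmed identically into all of them.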

 

 

  • Annex 5 Patent declarations

The MPAI Artificial Intelligence Framework (MPAI-AIF) Technical Specification has been developed according to the process outlined in the MPAI Statutes [9] and the MPAI Patent Policy [10].

The following entities have agreed to licence their standard essential patents reading on the MPAI Artificial Intelligence Framework (MPAI-AIF) Technical Specification according to the MPAI-AIF Framework Licence [11]:

 

Entity        Name        Email address

 

 

  • Annex 6 Threat Models

 

D.1.1 System definition
D.1.2 Assets and stakeholders
D.1.3 Security goals
D.2 Threat Model
D.2.1 Adversarial models
D.2.2 Threats and attacks

D.2.3 Risk assessment

D.3 Mitigations

D.3.1 Objectives

D.3.2 Requirements

D.4 Remediation & residual risk

D.4.1 Implementation remediations

D.4.2 Residual risk

 

 

  • Annex 7 Use Cases

1          Secure communication via Network Security (TLS)

Examples of secure communication:

  1. User Agent communicates with AIF.
  2. AIF communicates with MPAI Store.
  3. AIF communicates with external entity (not MPAI Store)

1.1        Secure Storage

An AIM needs to store data securely in AIW/AIM Storage and Global Storage. The latter is used when there is more than one AIM. The AIF needs to store a downloaded AIM securely.

1.2        Network Credentials (authentication)

The AIF authenticates itself with the MPAI Store, User, etc.

1.3        Attestation

Every secure interaction, e.g., the Controller talking with the User Agent or two AIFs talking together securely, requires attestation.
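The attestation idea can be sketched as follows: the prover reports a measurement (hash) of its software, signed with a key the verifier trusts. This is a minimal sketch only; a real TPM uses certified asymmetric keys, and the HMAC with a shared key here is just a stand-in.

```python
import hashlib
import hmac

# Hypothetical key; in a real system this would be an asymmetric key
# certified by the Certification Authority, held in a TPM.
TRUSTED_KEY = b"device-key-certified-by-CA"

def measure(software: bytes) -> bytes:
    """Measurement of the software: a cryptographic hash."""
    return hashlib.sha256(software).digest()

def attest(software: bytes):
    """Prover side: return (measurement, signature over the measurement)."""
    m = measure(software)
    return m, hmac.new(TRUSTED_KEY, m, hashlib.sha256).digest()

def verify(measurement: bytes, signature: bytes, expected: bytes) -> bool:
    """Verifier side: check the signature and the expected measurement."""
    good_sig = hmac.compare_digest(
        hmac.new(TRUSTED_KEY, measurement, hashlib.sha256).digest(), signature)
    return good_sig and hmac.compare_digest(measurement, expected)

genuine = b"controller-binary-v1"
m, sig = attest(genuine)
ok = verify(m, sig, measure(genuine))           # intact software passes
tampered = verify(m, sig, measure(b"tampered")) # mismatch is rejected
```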

1.4        MPAI Store Provisioning

At the initial setup of AIW and AIMs.

It is an open question whether there are ways of provisioning other than the MPAI Store; more may be identified once the security work is complete.

2          Workflow

 

Workflow to obtain an AIW from the MPAI Store. All operations are secure.

 

Caller calls Responder

  1. Caller:
    1. Calls its internal secure communication library to open a TCP/IP connection on HTTPS with Responder listening on port 443.
    2. Sends a hello message to Responder.
  2. Responder responds with a certificate.
  3. Caller:
    1. Validates certificate.
    2. Responds with a certificate.
  4. Responder:
    1. Validates UA certificate.
    2. Requests list of cyphers with priorities from Trusted Service Encryption Engine.
  5. Trusted Service Encryption Engine:
    1. Picks the cypher with the highest priority.
    2. Sends the selected cypher.
  6. Responder sends with https:
    1. Closes hello with ack.
    2. If Use Attestation is true, calls the Attestation Service in the Trusted Service API.
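Step 5 above, where the Trusted Service Encryption Engine picks the highest-priority cypher common to both parties, could look like this sketch; the cipher names and priorities are assumptions for illustration.

```python
# Sketch of step 5: the Trusted Service Encryption Engine holds a
# priority-ordered cypher list and picks the highest-priority cypher
# that the peer also offered. Priorities here are illustrative.

SUPPORTED = {                        # cypher -> priority (lower = preferred)
    "TLS_AES_256_GCM_SHA384": 1,
    "TLS_CHACHA20_POLY1305_SHA256": 2,
    "TLS_AES_128_GCM_SHA256": 3,
}

def select_cypher(offered):
    """Return the supported cypher with the highest priority among those offered."""
    candidates = [c for c in offered if c in SUPPORTED]
    if not candidates:
        raise ValueError("no common cypher")
    return min(candidates, key=lambda c: SUPPORTED[c])

chosen = select_cypher(["TLS_AES_128_GCM_SHA256", "TLS_AES_256_GCM_SHA384"])
```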

 

  1. S-Controller is listening on port 443.
  2. User Agent (Caller) calls S-Controller (Responder).
  3. User Agent requests, via RESTful API, that the S-Controller retrieve the AIW from the MPAI Store.
  4. S-Controller (Caller) calls MPAI Store (Responder).
  5. S-Controller requests, via RESTful API, that the MPAI Store return the AIW.
  6. If attestation is accepted, S-Controller:
  7. Do
    1. Receives chunk of data
    2. Secure part of controller calls Encryption Service
    3. Decrypts the data received from MPAI Store
    4. Stores decrypted data in the Secure Storage included in the AIW/AIM Storage
    5. Secure part of controller checks end of data

enddo

  1. S-Controller:
    1. Calls Encryption Service.
    2. Signals to UA that AIW transfer is complete.
    3. Unzips AIW.
    4. Parses JSON metadata.
    5. Instantiates AIMs.
    6. Initialises AIW.
    7. Starts the AIW per JSON file.
  2. UA sends suspend (secure).
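The Do…enddo download loop above can be sketched as follows; the XOR routine is only a placeholder for the real Encryption Service, and all names are illustrative.

```python
# Sketch of the download loop: receive encrypted chunks from the MPAI
# Store, decrypt each via the Encryption Service, and append the result
# to Secure Storage until end of data. The XOR "cipher" is a placeholder.

KEY = 0x5A

def decrypt(chunk: bytes) -> bytes:
    """Placeholder for the Encryption Service's decrypt call."""
    return bytes(b ^ KEY for b in chunk)

def receive_aiw(chunks):
    secure_storage = bytearray()        # stands in for AIW/AIM Secure Storage
    for chunk in chunks:                # the "Do ... enddo" loop
        secure_storage.extend(decrypt(chunk))
    return bytes(secure_storage)        # complete, decrypted AIW archive

# Simulated transfer: the Store sends the archive in encrypted chunks.
plaintext = b"aiw-archive-bytes"
encrypted_chunks = [bytes(b ^ KEY for b in plaintext[i:i + 6])
                    for i in range(0, len(plaintext), 6)]
aiw = receive_aiw(encrypted_chunks)
```

Once the full archive is in Secure Storage, the S-Controller can proceed to unzip it, parse the JSON metadata, and instantiate the AIMs as listed in the final steps.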

 

 

[1] At the time of publication of this standard, the MPAI Store was assigned as the IIDRA.


The MPAI 2022 Calls for Technologies – Part 1 (AI Framework)

A foundational element of the MPAI architecture is the observation that monolithic AI applications have characteristics that make them undesirable. They are single-use, i.e., it is hard to reuse the technologies of one application in another, and they are obscure, i.e., it is hard to understand why a machine has produced a certain output from a certain input. The first characteristic makes complex applications hard to build, because an implementer must possess know-how of all their features; the second makes them often “unexplainable”.

MPAI launched AI Framework (AIF), its first official standardisation activity in December 2020, less than 3 months after its establishment. AIF is a standard environment where it is possible to execute AI Workflows (AIW) composed of AI Modules (AIM). Both AIWs and AIMs are defined by their function and their interfaces. AIF is unconcerned by the technology used by an AIM but needs to know the topology of an AIW.

Ten months later (October 2021) the MPAI-AIF standard was approved. Its structure is represented in Figure 1.

Figure 1 – The MPAI-AIF Reference Model

MPAI’s AI Framework (MPAI-AIF) specifies the architecture, interfaces, protocols, and Application Programming Interfaces (API) of the AI Framework (AIF), an environment specially designed for execution of AI-based implementations, but also suitable for mixed AI and traditional data processing workflows.

The AIF, the AIW and the AIMs are represented by JSON Metadata. The User Agent and the AIMs call the Controller through a set of standard APIs. Likewise, the Controller calls standard APIs to interact with Communication (a service for inter-AIM communication), Global Storage (a service for AIMs to store data for access by other AIMs) and the MPAI Store (a service for downloading AIMs required by an AIW). Access represents access to application-specific data.

Through the JSON Metadata, an AIF with appropriate resources (specified in the AIF JSON Metadata) can execute an AIW requiring AIMs (specified in the AIW JSON Metadata) that can be downloaded from the MPAI Store.
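As a rough sketch of that metadata-driven check, assuming hypothetical field names (`resources`, `required_resources` and `aims` are invented here, not taken from the standard):

```python
import json

# Sketch: the AIF's JSON Metadata declares its resources; the AIW's JSON
# Metadata declares what it needs and which AIMs to fetch from the MPAI
# Store. All field names and values are illustrative, not normative.

aif_metadata = json.loads('{"resources": {"ram_mb": 512, "accelerator": true}}')
aiw_metadata = json.loads("""
{
  "required_resources": {"ram_mb": 256, "accelerator": true},
  "aims": ["NoiseSuppressor", "SpeechRecognizer"]
}
""")

def can_execute(aif, aiw):
    """Does the AIF have the resources the AIW's metadata requires?"""
    need, have = aiw["required_resources"], aif["resources"]
    return (have["ram_mb"] >= need["ram_mb"]
            and (have["accelerator"] or not need["accelerator"]))

aims_to_download = (aiw_metadata["aims"]
                    if can_execute(aif_metadata, aiw_metadata) else [])
```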

The MPAI-AIF standard has the following main features:

  1. Independence of the Operating System.
  2. Modular component-based architecture with specified interfaces.
  3. Encapsulation of component interfaces to abstract them from the development environment.
  4. Interface with the MPAI Store enabling access to validated components.
  5. Components can be implemented as software, hardware or mixed hardware-software.
  6. Components execute in local and distributed Zero-Trust architectures, can interact with other implementations operating in proximity, and support Machine Learning functionalities.

The MPAI-AIF standard achieves much of the original MPAI vision because AI applications:

  1. Need not be monolithic but can be composed of independently developed modules with standard interfaces
  2. Are more explainable
  3. Can be found in an open market.

Feature #6 above is a requirement, but the standard does not provide practical means for an application developer to ensure that the execution of the AIW takes place in a secure environment. Version 2 of MPAI-AIF intends to provide exactly that. As MPAI-AIF V1 does not specify any trusted service that an implementer can rely on, MPAI-AIF V2 identifies specific trusted services supporting the implementation of a Trusted Zone. These meet a set of functional requirements that enable AIF Components to access trusted services via APIs, such as:

  1. AIM Security Engine.
  2. Trusted AIM Model Services
  3. Attestation Service.
  4. Trusted Communication Service.
  5. Trusted AIM Storage Service
  6. Encryption Service.

Figure 2 represents the Reference Models of MPAI-AIF V2.

Figure 2 – Reference Models of MPAI-AIF V2

The AIF Components shall be able to call Trusted Services APIs after establishing the developer-specified security regime based on the following requirements:

  1. The AIF Components shall access high-level implementation-independent Trusted Services API to handle:
    1. Encryption Service.
    2. Attestation Service.
    3. Trusted Communication Service.
    4. Trusted AIM Storage Service including the following functionalities:
      1. AIM Storage Initialisation (secure and non-secure flash and RAM)
      2. AIM Storage Read/Write.
      3. AIM Storage release.
    5. Trusted AIM Model Services including the following functionalities:
      1. Secure and non-secure Machine Learning Model Storage.
      2. Machine Learning Model Update (i.e., full, or partial update of the weights of the Model).
      3. Machine Learning Model Validation (i.e., verification that the model is the one that is expected to be used and that the appropriate rights have been acquired).
    6. AIM Security Engine including the following functionalities:
      1. Machine Learning Model Encryption.
      2. Machine Learning Model Signature.
      3. Machine Learning Model Watermarking.
  2. The AIF Components shall be easily integrated with the above Services.
  3. The AIF Trusted Services shall be able to use hardware and OS security features already existing in the hardware and software of the environment in which the AIF is implemented.
  4. Application developers shall be able to select the application’s security either or both by:
    1. Level of security that includes a defined set of security features for each level, i.e., APIs are available to either select individual security services or to select one of the standard security levels available in the implementation.
    2. Developer-defined security, i.e., a combination of a developer-defined set of security features.
  5. The specification of the AIF V2 Metadata shall be an extension of the AIF V1 Metadata supporting security with either or both standardised levels and a developer-defined combination of security features.
  6. MPAI welcomes the submission of use cases and their respective threat models.
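Requirement 4 above, selecting security by a standard level and/or a developer-defined feature combination, might be sketched as follows; the levels and feature names are invented for illustration, not taken from the standard.

```python
# Sketch of requirement 4: an application selects its security regime
# either by a standard level or by a developer-defined feature set (or
# both). Level contents and feature names are hypothetical.

STANDARD_LEVELS = {
    1: {"encryption"},
    2: {"encryption", "attestation"},
    3: {"encryption", "attestation", "trusted_storage", "trusted_comm"},
}

def security_features(level=None, custom=None):
    """Resolve the feature set from a standard level and/or custom additions."""
    features = set(STANDARD_LEVELS.get(level, set()))
    features |= set(custom or ())
    if not features:
        raise ValueError("no security regime selected")
    return features

# A developer picks level 2 and adds one extra service on top.
regime = security_features(level=2, custom={"trusted_storage"})
```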

MPAI has rigorously followed its standard development process in producing the Use Cases and Functional Requirements summarised in this post. MPAI has additionally produced the Commercial Requirements (Framework Licence) and the text of the Call for Technologies.

Below are a few useful links for those wishing to know more about the MPAI-AIF V2 Call for Technologies and how to respond to it:

  1. The “About MPAI-AIF” web page provides some general information about MPAI-AIF.
  2. The MPAI-AIF V1 standard can be downloaded from here.
  3. The 1 min 20 sec video (YouTube and non-YouTube) concisely illustrates the MPAI-AIF V2 Call for Technologies.
  4. The slides and the video recording of the online presentation (YouTube, non-YouTube) made at the 11 July online presentation give a complete overview of MPAI-AIF V2.

The MPAI secretariat shall receive the responses to the MPAI-AIF V2 Call for Technologies by 10 October 2022 at 23:59 UTC. For any need, please contact the MPAI secretariat.

 


MPAI-AIF V2 Use Cases and Functional Requirements

The MPAI-AIF V2 Use Cases and Functional Requirements is also available as a Word document

1          Introduction

2          Use Cases

3          Terms specific to MPAI-AIF V2

4          Functional Requirements

5          References

5.1             Normative References

5.2             Informative References

Annex 1 – MPAI-wide terms and definitions

Annex 2 – Notices and Disclaimers Concerning MPAI Standards (Informative)

Annex 3 – The Governance of the MPAI Ecosystem (Informative)

1        Introduction

In recent years, Artificial Intelligence (AI) and related technologies have been applied to a broad range of applications, have started affecting the life of millions of people and are expected to do so even more in the future. As digital media standards have positively influenced industry and billions of people, so AI-based data coding standards are expected to have a similar positive impact. Indeed, research has shown that data coding with AI-based technologies is generally more efficient than with existing technologies for, e.g., compression and feature-based description.

However, some AI technologies may carry inherent risks, e.g., in terms of bias toward some classes of users. Therefore, the need for standardisation is more important and urgent than ever.

The international, unaffiliated, not-for-profit MPAI – Moving Picture, Audio and Data Coding by Artificial Intelligence Standards Developing Organisation has the mission to develop AI-enabled data coding standards. MPAI Application Standards enable the development of AI-based products, applications, and services.

As a part of its mission, MPAI has developed standard operating procedures to enable users of MPAI implementations to make informed decisions about their applicability. Central to this is the notion of Performance, defined as a set of attributes characterising a reliable and trustworthy implementation.

For the aforementioned reasons, to fully achieve the MPAI mission, Technical Specifications must be complemented by an ecosystem designed, created and managed to underpin the life cycle of MPAI standards through the steps of specification, technical testing, assessment of product safety and security, and distribution.

In the following, Terms beginning with a capital letter are defined in Table 1 if they are specific to this Standard and in Table 2 if they are common to all MPAI Standards.

The MPAI Ecosystem is fully specified in [1]. It is composed of:

  • MPAI as provider of Technical, Conformance and Performance Specifications.
  • Implementers of MPAI standards.
  • MPAI-appointed Performance Assessors.
  • The MPAI Store which takes care of secure distribution of validated Implementations.
  • Users of MPAI Standard Implementations.

Figure 1 depicts Version 1 of the MPAI-AIF Reference Model under which Implementations of MPAI Application Standards and user-defined MPAI-AIF conforming applications operate.

An AIF Implementation allows execution of AI Workflows (AIW), composed of basic processing elements called AI Modules (AIM).

Figure 1 – The AI Framework (AIF) Reference Model and its Components

MPAI Application Standards normatively specify Syntax and Semantics of the input and output data and the Function of the AIW and the AIMs, and the Connections between and among the AIMs of an AIW.

In particular, an AIM is defined by its Function and data, but not by its internal architecture, which may be based on AI or data processing, and implemented in software, hardware or hybrid software and hardware technologies.

MPAI defines Interoperability as the ability to replace an AIW or an AIM Implementation with a functionally equivalent Implementation. MPAI also defines 3 Interoperability Levels of an AIW executed in an AIF:

Level 1 – Implementer-specific and satisfying the MPAI-AIF Standard.

Level 2 – Specified by an MPAI Application Standard.

Level 3 – Specified by an MPAI Application Standard and certified by a Performance Assessor.

MPAI offers Users access to the promised benefits of AI with a guarantee of increased transparency, trust and reliability as the Interoperability Level of an Implementation moves from 1 to 3. Additional information on Interoperability Levels is provided in Annex 3.
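A minimal sketch of how the three Interoperability Levels above could be derived from an Implementation's test record; the function and its parameters are illustrative, not part of any MPAI specification.

```python
# Sketch: map an Implementation's test record to its Interoperability Level.
# Field names are hypothetical; the level semantics follow the text above.

def interoperability_level(runs_in_aif, conformance_passed, performance_passed):
    if not runs_in_aif:
        return 0          # not executable in an AIF Implementation at all
    if performance_passed:
        return 3          # certified by a Performance Assessor
    if conformance_passed:
        return 2          # specified by an MPAI Application Standard
    return 1              # implementer-specific, satisfying MPAI-AIF

# A proprietary AIW that passed Conformance Testing but was not assessed:
level = interoperability_level(True, True, False)
```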

2        Scope of this document

This document specifies the functional requirements of the planned MPAI-AIF V2 standard, an extension of the MPAI-AIF V1.1 standard designed to add a security infrastructure to the AI Framework standard of [2] so that AIF V2 Components can access security services provided by a security infrastructure.

MPAI-AIF V2 will be developed by the MPAI AI Framework Development Committee (AIF-DC).

3        Terms specific to MPAI-AIF V2

Table 1 defines the Terms used in this document that begin with a capital letter. The Terms of MPAI-wide applicability are defined in Table 2.

Table 1 – Terms and definitions

Term Definition
Attestation Service A mechanism for software to prove its identity. The goal of attestation is for a party to prove to a remote party that its operating system and application software are intact and trustworthy. The remote party trusts that attestation data is accurate because it is signed by a Trusted Platform Module (TPM) whose key is certified by the Certification Authority (CA).
Certification Authority A trusted entity that manages and issues security certificates and public keys that are used for secure communication in a public network.
Crypto Engine A self-contained, redundant cryptographic module designed to be integrated into devices.
Root of Trust (RoT) A source that can always be trusted within a cryptographic system. Because cryptographic security is dependent on keys to encrypt and decrypt data and perform functions such as generating digital signatures and verifying signatures, RoT schemes generally include a hardened hardware module.
Threat Model The result of a procedure for optimising application, system or business process security by identifying objectives and vulnerabilities, and then defining countermeasures to prevent or mitigate the effects of threats to the system.
Trusted Service Any of Attestation Service, Trusted Communication Service, and Trusted Storage Service.
Trusted Communication Service Any service provided for the purpose of secure transmission of data without regard to the transmission protocol employed, whether or not the transmission medium is public or private.
Trusted Platform Module A specialised hardware and embedded software that is designed to secure hardware with integrated cryptographic keys.
Trusted Storage Service A service providing protected (encrypted) persistent storage.

4        Functional Requirements

AIF V1 is based on the assumption that the whole AI Framework runs in a Trusted Zone, without specifying any trusted service that an implementer can rely on.

AIF V2 intends to identify specific trusted services to support the implementation of a Trusted Zone meeting a set of functional requirements by enabling AIF Components to access trusted services via APIs as defined in Table 1, such as:

  1. Encryption Service.
  2. Attestation Service.
  3. Trusted Communication Service.
  4. Trusted AIM Model Services
  5. Trusted AIM Storage Service
  6. AIM Security Engine.

Figure 2 represents the Reference Model of MPAI-AIF V2.

Figure 2 – Reference Model of MPAI-AIF V2

The MPAI-AIF V2 standard shall extend the functionalities specified in the MPAI-AIF V1 standard. Specifically, AIF Components shall be able to call Trusted Services APIs after establishing the developer-specified security regime based on the following requirements:

  1. The AIF Components shall access a high-level, implementation-independent Trusted Services API to handle:
    1. Encryption Service.
    2. Attestation Service.
    3. Trusted Communication Service.
    4. Trusted AIM Storage Service, including the following functionalities:
      1. AIM Storage Initialisation (secure and non-secure flash and RAM).
      2. AIM Storage Read/Write.
      3. AIM Storage Release.
    5. Trusted AIM Model Services, including the following functionalities:
      1. Secure and non-secure Machine Learning Model Storage.
      2. Machine Learning Model Update (i.e., full or partial update of the weights of the Model).
      3. Machine Learning Model Validation (i.e., verification that the model is the one that is expected to be used and that the appropriate rights have been acquired).
    6. AIM Security Engine, including the following functionalities:
      1. Machine Learning Model Encryption.
      2. Machine Learning Model Signature.
      3. Machine Learning Model Watermarking.
  2. The AIF Components shall be easily integrated with the above Services.
  3. The AIF Trusted Services shall be able to use hardware and OS security features already existing in the hardware and software of the environment in which the AIF is implemented.
  4. Application developers shall be able to select the application’s security by either or both of:
    1. A level of security comprising a defined set of security features for each level, i.e., APIs are available either to select individual security services or to select one of the standard security levels available in the implementation.
    2. Developer-defined security, i.e., a developer-defined combination of security features.
  5. The specification of the AIF V2 Metadata shall be an extension of the AIF V1 Metadata, supporting security with standardised levels and/or a developer-defined combination of security features.
  6. MPAI welcomes the submission of use cases and their respective threat models.
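The developer-side selection of the security regime, whether a standard level, a developer-defined set of features, or both, can be sketched as below. The feature names and the level-to-feature bundles are invented for illustration and are not the normative MPAI-AIF V2 levels.

```python
# Hypothetical feature names; the normative set is defined by MPAI-AIF V2.
FEATURES = {"encryption", "attestation", "trusted_communication",
            "trusted_storage", "model_validation", "model_watermarking"}

# Illustrative standard levels: each level bundles a defined set of features.
STANDARD_LEVELS = {
    1: {"encryption"},
    2: {"encryption", "attestation", "trusted_communication"},
    3: set(FEATURES),  # all features
}

def select_security(level=None, custom_features=None):
    """Return the effective feature set: a standard level, a
    developer-defined combination, or the union of both."""
    selected = set()
    if level is not None:
        selected |= STANDARD_LEVELS[level]
    if custom_features:
        unknown = set(custom_features) - FEATURES
        if unknown:
            raise ValueError(f"unknown security features: {unknown}")
        selected |= set(custom_features)
    return selected
```

In this sketch, `select_security(level=2)` returns the level-2 bundle, while `select_security(level=2, custom_features={"model_watermarking"})` adds one developer-chosen feature on top of it.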

5        References

  1. MPAI Standards Resources; https://mpai.community/standards/resources/.
  2. MPAI Patent Policy; https://mpai.community/about/the-mpai-patent-policy/.
  3. Governance of the MPAI Ecosystem (MPAI-GME); https://mpai.community/standards/resources/#GME.
  4. AI Framework (MPAI-AIF) V1.1; https://mpai.community/standards/resources/#AIF
  5. MPAI-AIF V2 Call for Technologies;https://mpai.community/standards/mpai-aif/call-for-technologies/mpai-aif-v2-call-for-technologies/.
  6. MPAI-AIF V2 Framework Licence;https://mpai.community/standards/mpai-aif/framework-licence/mpai-aif-v2-framework-licence/.
  7. Presentation of MPAI-AIF V2 Use Cases and Functional Requirements;https://platform.wim.tv/#/webtv/convenor/vod/0b55db63-3ef9-4e69-ab02-b08b5a6dec7c.
  8. https://courses.cs.washington.edu/courses/csep590/06wi/finalprojects/bare.pdf

Annex 1 – MPAI-wide terms and definitions

The Terms used in this standard whose first letter is capitalised and that are not already included in Table 1 are defined in Table 2.

Table 2 – MPAI-wide Terms

Term Definition
Access Static or slowly changing data that are required by an application such as domain knowledge data, data models, etc.
AI Framework (AIF) The environment where AIWs are executed.
AI Module (AIM) A processing element receiving AIM-specific Inputs and producing AIM-specific Outputs according to its Function. An AIM may be an aggregation of AIMs.
AI Workflow (AIW) A structured aggregation of AIMs implementing a Use Case, receiving AIW-specific inputs and producing AIW-specific outputs according to its Function.
AIF Metadata The data set describing the capabilities of an AIF set by the AIF Implementer.
AIM Metadata The data set describing the capabilities of an AIM set by the AIM Implementer.
Application Programming Interface (API) A software interface that allows two applications to talk to each other
Application Standard An MPAI Standard specifying AIWs, AIMs, Topologies and Formats suitable for a particular application domain.
Channel A physical or logical connection between an output Port of an AIM and an input Port of an AIM. The term “connection” is also used as a synonym.
Communication The infrastructure that implements message passing between AIMs.
Component One of the 9 AIF elements: Access, AI Module, AI Workflow, Communication, Controller, Internal Storage, Global Storage, MPAI Store, and User Agent.
Conformance The attribute of an Implementation of being a correct technical Implementation of a Technical Specification.
Conformance Tester An entity authorised by MPAI to Test the Conformance of an Implementation.
Conformance Testing The normative document specifying the Means to Test the Conformance of an Implementation.
Conformance Testing Means Procedures, tools, data sets and/or data set characteristics to Test the Conformance of an Implementation.
Connection A channel connecting an output port of an AIM and an input port of an AIM.
Controller A Component that manages and controls the AIMs in the AIF, so that they execute in the correct order and at the time when they are needed.
Data Information in digital form.
Data Format The standard digital representation of Data.
Data Semantics The meaning of Data.
Device A hardware and/or software entity running at least one instance of an AIF.
Ecosystem The ensemble of the following actors: MPAI, MPAI Store, Implementers, Conformance Testers, Performance Testers and Users of MPAI-AIF Implementations as needed to enable an Interoperability Level.
Event An occurrence acted on by an Implementation.
Explainability The ability to trace the output of an Implementation back to the inputs that have produced it.
Fairness The attribute of an Implementation whose extent of applicability can be assessed by making the training set and/or network open to testing for bias and unanticipated results.
Function The operations effected by an AIW or an AIM on input data.
Global Storage A Component to store data shared by AIMs.
Identifier A name that uniquely identifies an Implementation.
Implementation 1.     An embodiment of the MPAI-AIF Technical Specification, or

2.     An AIW or AIM of a particular Level (1-2-3).

Internal Storage A Component to store data of the individual AIMs.
Interoperability The ability to functionally replace an AIM/AIW with another AIM/AIW having the same Interoperability Level
Interoperability Level The attribute of an AIW and its AIMs to be executable in an AIF Implementation and to be:

1.     Implementer-specific and satisfying the MPAI-AIF Standard (Level 1).

2.     Specified by an MPAI Application Standard (Level 2).

3.     Specified by an MPAI Application Standard and certified by a Performance Assessor (Level 3).

Knowledge Base Structured and/or unstructured information made accessible to AIMs via MPAI-specified interfaces
Message A sequence of Records.
Normativity The set of attributes of a technology or a set of technologies specified by the applicable parts of an MPAI standard.
Performance The attribute of an Implementation of being Reliable, Robust, Fair and Replicable.
Performance Assessment The normative document specifying the procedures, the tools, the data sets and/or the data set characteristics to Assess the Grade of Performance of an Implementation.
Performance Assessment Means Procedures, tools, data sets and/or data set characteristics to Assess the Performance of an Implementation.
Performance Assessor An entity authorised by MPAI to Assess the Performance of an Implementation in a given Application domain
Port A physical or logical communication interface of an AIM.
Profile A particular subset of the technologies used in MPAI-AIF or an AIW of an Application Standard and, where applicable, the classes, other subsets, options and parameters relevant to that subset.
Record Data with a specified structure.
Reference Model The AIMs and their Connections in an AIW.
Reference Software A technically correct software implementation of a Technical Specification containing source code, or source and compiled code.
Reliability The attribute of an Implementation that performs as specified by the Application Standard, profile and version the Implementation refers to, e.g., within the application scope, stated limitations, and for the period of time specified by the Implementer.
Replicability The attribute of an Implementation whose Performance, as Assessed by a Performance Assessor, can be replicated, within an agreed level, by another Performance Assessor.
Robustness The attribute of an Implementation that copes with data outside of the stated application scope with an estimated degree of confidence.
Scope The domain of applicability of an MPAI Application Standard.
Service Provider An entrepreneur who offers an Implementation as a service (e.g., a recommendation service) to Users.
Specification A collection of normative clauses.
Standard The ensemble of Technical Specification, Reference Software, Conformance Testing and Performance Assessment of an MPAI Application Standard.
Technical Specification (Framework) The normative specification of the AIF.

(Application) The normative specification of the set of AIWs belonging to an application domain along with the AIMs required to Implement the AIWs that includes:

1.     The formats of the Input/Output data of the AIWs.

2.     The Connections of the AIMs of the AIW.

3.     The formats of the Input/Output data of the AIMs belonging to the AIW.

Testing Laboratory A laboratory accredited by MPAI to Assess the Grade of Performance of Implementations.
Time Base The protocol specifying how AIF Components can access timing information.
Topology The set of AIM Connections of an AIW.
Use Case A particular instance of the Application domain target of an Application Standard.
User A user of an Implementation.
User Agent The Component interfacing the user with an AIF through the Controller
Version A revision or extension of a Standard or of one of its elements.
Zero Trust A cybersecurity model primarily focused on data and service protection that assumes no implicit trust.

Annex 2 – Notices and Disclaimers Concerning MPAI Standards (Informative)

The notices and legal disclaimers given below shall be borne in mind when downloading and using approved MPAI Standards.

In the following, “Standard” means the collection of four MPAI-approved and published documents: “Technical Specification”, “Reference Software”, “Conformance Testing” and, where applicable, “Performance Testing”.

Life cycle of MPAI Standards

MPAI Standards are developed in accordance with the MPAI Statutes. An MPAI Standard may only be developed when a Framework Licence has been adopted. MPAI Standards are developed by especially established MPAI Development Committees who operate on the basis of consensus, as specified in Annex 1 of the MPAI Statutes. While the MPAI General Assembly and the Board of Directors administer the process of the said Annex 1, MPAI does not independently evaluate, test, or verify the accuracy of any of the information or the suitability of any of the technology choices made in its Standards.

MPAI Standards may be modified at any time by corrigenda or new editions. A new edition, however, may not necessarily replace an existing MPAI standard. Visit the web page to determine the status of any given published MPAI Standard.

Comments on MPAI Standards are welcome from any interested parties, whether MPAI members or not. Comments shall mandatorily include the name and the version of the MPAI Standard and, if applicable, the specific page or line the comment applies to. Comments should be sent to the MPAI Secretariat. Comments will be reviewed by the appropriate committee for their technical relevance. However, MPAI does not provide interpretation, consulting information, or advice on MPAI Standards. Interested parties are invited to join MPAI so that they can attend the relevant Development Committees.

Coverage and Applicability of MPAI Standards

MPAI makes no warranties or representations of any kind concerning its Standards, and expressly disclaims all warranties, expressed or implied, concerning any of its Standards, including but not limited to the warranties of merchantability, fitness for a particular purpose, non-infringement etc. MPAI Standards are supplied “AS IS”.

The existence of an MPAI Standard does not imply that there are no other ways to produce and distribute products and services in the scope of the Standard. Technical progress may render the technologies included in the MPAI Standard obsolete by the time the Standard is used, especially in a field as dynamic as AI. Therefore, those looking for standards in the Data Compression by Artificial Intelligence area should carefully assess the suitability of MPAI Standards for their needs.

IN NO EVENT SHALL MPAI BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO: THE NEED TO PROCURE SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE PUBLICATION, USE OF, OR RELIANCE UPON ANY STANDARD, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE AND REGARDLESS OF WHETHER SUCH DAMAGE WAS FORESEEABLE.

MPAI alerts users that practicing its Standards may infringe patents and other rights of third parties. Submitters of technologies to this standard have agreed to licence their Intellectual Property according to their respective Framework Licences.

Users of MPAI Standards should consider all applicable laws and regulations when using an MPAI Standard. The validity of Conformance Testing is strictly technical and refers to the correct implementation of the MPAI Standard. Moreover, positive Performance Assessment of an implementation applies exclusively in the context of the MPAI Governance and does not imply compliance with any regulatory requirements in the context of any jurisdiction. Therefore, it is the responsibility of the MPAI Standard implementer to observe or refer to the applicable regulatory requirements. By publishing an MPAI Standard, MPAI does not intend to promote actions that are not in compliance with applicable laws, and the Standard shall not be construed as doing so. In particular, users should evaluate MPAI Standards from the viewpoint of data privacy and data ownership in the context of their jurisdictions.

Implementers and users of MPAI Standards documents are responsible for determining and complying with all appropriate safety, security, environmental and health and all applicable laws and regulations.

Copyright

MPAI draft and approved standards, whether they are in the form of documents or as web pages or otherwise, are copyrighted by MPAI under Swiss and international copyright laws. MPAI Standards are made available and may be used for a wide variety of public and private uses, e.g., implementation, use and reference, in laws and regulations, and standardisation. By making these documents available for these and other uses, however, MPAI does not waive any rights in copyright to its Standards. For inquiries regarding the copyright of MPAI standards, please contact the MPAI Secretariat.

The Reference Software of an MPAI Standard is released with the MPAI Modified Berkeley Software Distribution licence. However, implementers should be aware that the Reference Software of an MPAI Standard may reference some third party software that may have a different licence.

Annex 3 – The Governance of the MPAI Ecosystem (Informative)


Level 1 Interoperability

With reference to Figure 1, MPAI issues and maintains a standard – called MPAI-AIF – whose components are:

  1. An environment called AI Framework (AIF) running AI Workflows (AIW) composed of interconnected AI Modules (AIM) exposing standard interfaces.
  2. A distribution system of AIW and AIM Implementations called MPAI Store from which an AIF Implementation can download AIWs and AIMs.

A Level 1 Implementation shall be an Implementation of the MPAI-AIF Technical Specification executing AIWs composed of AIMs able to call the MPAI-AIF APIs.
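As a minimal sketch of the sentence above, an AIW can be seen as a chain of AIMs exposing standard interfaces. The class names and the linear Topology below are hypothetical illustrations, not the MPAI-AIF API.

```python
class AIM:
    """A processing element with AIM-specific inputs and outputs (illustrative)."""
    def __init__(self, name, function):
        self.name = name
        self.function = function  # callable implementing the AIM's Function

    def process(self, data):
        return self.function(data)

class AIW:
    """A structured aggregation of AIMs; here a simple linear Topology."""
    def __init__(self, aims):
        self.aims = aims

    def run(self, data):
        # A Controller would normally schedule the AIMs; a plain chain
        # suffices to illustrate standard interfaces between proprietary AIMs.
        for aim in self.aims:
            data = aim.process(data)
        return data
```

Because each AIM only exposes its inputs and outputs, any AIM in the chain could be replaced by another Implementation with the same interface.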

Implementers’ benefits Upload to the MPAI Store and have globally distributed Implementations of

–       AIFs conforming to MPAI-AIF.

–       AIWs and AIMs performing proprietary functions executable in AIF.

Users’ benefits Rely on Implementations that have been tested for security.
MPAI Store’s role –       Tests the Conformance of Implementations to MPAI-AIF.

–       Verifies Implementations’ security, e.g., absence of malware.

–       Indicates unambiguously that Implementations are Level 1.

Level 2 Interoperability

In a Level 2 Implementation, the AIW must be an Implementation of an MPAI Use Case and the AIMs must conform with an MPAI Application Standard.

Implementers’ benefits Upload to the MPAI Store and have globally distributed Implementations of

–       AIFs conforming to MPAI-AIF.

–       AIWs and AIMs conforming to MPAI Application Standards.

Users’ benefits –       Rely on Implementations of AIWs and AIMs whose Functions have been reviewed during standardisation.

–       Have a degree of Explainability of the AIW operation because the AIM Functions and the data Formats are known.

Market’s benefits –       Open AIW and AIM markets foster competition leading to better products.

–       Competition of AIW and AIM Implementations fosters AI innovation.

MPAI Store’s role –       Tests Conformance of Implementations with the relevant MPAI Standard.

–       Verifies Implementations’ security.

–       Indicates unambiguously that Implementations are Level 2.

Level 3 Interoperability

MPAI does not generally set standards on how and with what data an AIM should be trained. This is an important differentiator that promotes competition leading to better solutions. However, the performance of an AIM is typically higher if the data used for training are in greater quantity and more in tune with the scope. Training data that have large variety and cover the spectrum of all cases of interest in breadth and depth typically lead to Implementations of higher “quality”.

For Level 3, MPAI normatively specifies the process, the tools and the data or the characteristics of the data to be used to Assess the Grade of Performance of an AIM or an AIW.

Implementers’ benefits May claim their Implementations have passed Performance Assessment.
Users’ benefits Get assurance that the Implementation being used performs correctly, e.g., it has been properly trained.
Market’s benefits Implementations’ Performance Grades stimulate the development of more Performing AIM and AIW Implementations.
MPAI Store’s role –       Verifies the Implementations’ security

–       Indicates unambiguously that Implementations are Level 3.

The MPAI ecosystem

The following is a high-level description of the MPAI ecosystem operation applicable to fully conforming MPAI implementations:

  1. MPAI establishes and controls the not-for-profit MPAI Store.
  2. MPAI appoints Performance Assessors.
  3. MPAI publishes Standards.
  4. Implementers submit Implementations to Performance Assessors.
  5. If the Implementation Performance is acceptable, Performance Assessors inform Implementers and the MPAI Store.
  6. Implementers submit Implementations to the MPAI Store tested for Conformance and security.
  7. Users download and use Implementations, and submit experience scores.

Figure 3 – The MPAI ecosystem operation



MPAI-AIF V1 Use Cases and Functional Requirements

1       Introduction

2       Use Cases

2.1       Context-based Audio Enhancement (MPAI-CAE)

2.2       Integrative Genomic/Sensor Analysis (MPAI-GSA)

2.3       AI-Enhanced Video Coding (MPAI-EVC)

2.4       Server-based Predictive Multiplayer Gaming (MPAI-SPG)

2.5       Multi-Modal Conversation (MPAI-MMC)

2.6       Compression and Understanding of Industrial data (MPAI-CUI)

3       Architecture

4       Requirements

4.1       Component requirements

4.2       Systems requirements

4.3       General requirements

5       Conclusions

6       References

1        Introduction

Moving Picture, Audio and Data Coding by Artificial Intelligence (MPAI) is an international association with the mission to develop AI-enabled data coding standards. Artificial Intelligence (AI) technologies have shown they can offer more efficient data coding than existing technologies.

MPAI has analysed six use cases covering application areas benefiting from AI technologies. Even though the use cases are disparate, each of them can be implemented with a combination of processing modules performing functions that contribute to achieving the intended result.

MPAI has assessed that leaving it to the market to develop individual implementations would multiply costs and delay the adoption of AI technologies, while modules with standard interfaces, combined and executed within the MPAI-specified AI Framework, will favour the emergence of horizontal markets where proprietary, competing module implementations exposing standard interfaces reduce cost, promote adoption and spur progress of AI technologies. MPAI calls these modules AI Modules (AIM).

MPAI calls the planned AI Framework standard MPAI-AIF. As AI is a fast-moving field, MPAI expects that MPAI-AIF will be extended as new use cases bring new requirements and new technologies reach maturity.

To avoid the deadlock experienced in other high-technology fields, before engaging in the development of the MPAI-AIF standard, MPAI will develop a Framework Licence (FWL) associated with the MPAI-AIF Architecture and Functional Requirements defined in this document. The FWL, essentially the business model that standard essential patent (SEP) holders will apply to monetise their Intellectual Properties (IP), but without values such as the amount or percentage of royalties or dates due, will act as Commercial Requirements for the standard and provide a clear IPR licensing framework.

This document contains a summary description of the six use cases (Section 2) followed by a section describing the architecture expected to become normative (Section 3). Section 4 lists the normative requirements identified so far.

2        Use Cases

The six use cases considered cover a broad area of application. Therefore, it is expected that the MPAI-AIF architecture can support a wide range of use cases of practical interest.

Each case is identified by its name and the acronym identifying the future MPAI standard. More information about MPAI-AIF can be found in [1].

2.1       Context-based Audio Enhancement (MPAI-CAE)

The overall user experience quality is highly dependent on the context in which audio is used, e.g.

  1. Entertainment audio can be consumed in the home, in the car, on public transport, on-the-go (e.g. while doing sports, running, biking) etc.
  2. Voice communications can take place in the office, in the car, at home, on-the-go etc.
  3. Audio and video conferencing can be done in the office, in the car, at home, on-the-go etc.
  4. (Serious) gaming can be done in the office, at home, on-the-go etc.
  5. Audio (post-)production is typically done in the studio
  6. Audio restoration is typically done in the studio

By using context information to act on the content using AI, it is possible to substantially improve the user experience.

The following examples describe how MPAI-CAE can make the difference.

  1. Enhanced audio experience in a conference call

Often, the user experience of a video/audio conference can be marginal. Too much background noise or undesired sounds can lead to participants not understanding what other participants are saying. By using AI-based adaptive noise-cancellation and sound enhancement, MPAI-CAE can virtually eliminate those kinds of noise without using complex microphone systems to capture environment characteristics.
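As a rough illustration of what adaptive noise cancellation involves, the classic least-mean-squares (LMS) canceller below adapts a short filter so that a noise reference is subtracted from the primary signal. This is a textbook sketch, not the MPAI-CAE algorithm.

```python
def lms_cancel(primary, reference, taps=4, mu=0.01):
    """Minimal LMS adaptive noise canceller (illustrative sketch):
    'primary' carries speech plus noise, 'reference' a correlated noise pickup."""
    w = [0.0] * taps            # adaptive filter weights
    buf = [0.0] * taps          # sliding window of reference samples
    out = []
    for d, x in zip(primary, reference):
        buf = [x] + buf[:-1]    # shift the new reference sample in
        y = sum(wi * xi for wi, xi in zip(w, buf))   # filter's noise estimate
        e = d - y               # error = cleaned output sample
        w = [wi + 2 * mu * e * xi for wi, xi in zip(w, buf)]  # LMS weight update
        out.append(e)
    return out
```

With a reference that is well correlated with the noise in the primary channel, the output error shrinks as the filter converges, which is the effect the use case describes without needing a complex microphone array.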

  2. Pleasant and safe music listening while biking

While biking in the middle of city traffic, AI can process the signals captured from the environment by the microphones available in many earphones and earbuds (for active noise cancellation), adapt the sound rendition to the acoustic environment, provide an enhanced audio experience (e.g. performing dynamic signal equalization), improve battery life and selectively recognize and allow relevant environment sounds (e.g. the horn of a car). The user enjoys a satisfactory listening experience without losing contact with the acoustic surroundings.

  3. Emotion enhanced synthesized voice

Speech synthesis is constantly improving and finding several applications that are part of our daily life (e.g. intelligent assistants). In addition to improving the ‘natural sounding’ of the voice, MPAI-CAE can implement expressive models of primary emotions such as fear, happiness, sadness, and anger.

  4. Efficient 3D sound

MPAI-CAE can reduce the number of channels (i.e. MPEG-H 3D Audio can support up to 64 loudspeaker channels and 128 codec core channels) in an automatic (unsupervised) way, e.g. by mapping a 9.1 to a 5.1 or stereo (radio broadcasting or DVD), maintaining the musical touch of the composer.
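For comparison, a conventional fixed-matrix 5.1-to-stereo fold-down uses ITU-R BS.775-style coefficients; an MPAI-CAE implementation would instead derive a content-aware mapping, so the following is only a baseline sketch.

```python
import math

# ITU-R BS.775-style downmix coefficient (illustrative fixed matrix;
# an AI-based mapping would adapt to the content instead).
A = 1 / math.sqrt(2)  # ~0.707

def downmix_51_to_stereo(l, r, c, lfe, ls, rs):
    """Fold one 5.1 sample frame down to stereo with a fixed matrix.
    The LFE channel is commonly dropped in a two-channel fold-down."""
    lo = l + A * c + A * ls
    ro = r + A * c + A * rs
    return lo, ro
```

The fixed matrix preserves overall balance but cannot preserve “the musical touch of the composer” the way a learned, content-aware downmix aims to.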

  5. Speech/audio restoration

Audio restoration is often a time-consuming process that requires skilled audio engineers with specific experience in music and recording techniques to manually go over old audio tapes. MPAI-CAE can automatically remove anomalies from recordings through broadband denoising, declicking and decrackling, as well as removing buzzes and hums and performing spectrographic ‘retouching’ for removal of discrete unwanted sounds.

  6. Normalization of volume across channels/streams

Eighty-five years after TV was first introduced as a public service, TV viewers are still struggling to adapt to their needs the different average audio levels of different broadcasters and, within a program, the different audio levels of the different scenes.

MPAI-CAE can learn from user’s reactions via remote control, e.g. to a loud spot, and control the sound level accordingly.
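A minimal version of such level control is a single gain that brings a programme to a target level. Real loudness normalisation would follow a measure such as ITU-R BS.1770; this sketch uses plain RMS instead, for illustration only.

```python
import math

def rms_db(samples):
    """RMS level in dB (illustrative; not a loudness measure such as
    ITU-R BS.1770)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-12))

def normalise(samples, target_db=-23.0):
    """Apply one gain so the whole programme reaches the target level."""
    gain = 10 ** ((target_db - rms_db(samples)) / 20)
    return [s * gain for s in samples]
```

What the use case adds on top of this baseline is learning *which* target the user actually wants, e.g. from remote-control reactions to a loud spot.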

  7. Automotive

Audio systems in cars have steadily improved in quality over the years and continue to be integrated into more critical applications. Today, a buyer takes it for granted that a car has a good automotive sound system. In addition, in a car there is usually at least one and sometimes two microphones to handle the voice-response system and the hands-free cell-phone capability. If the vehicle uses any noise cancellation, several other microphones are involved. MPAI-CAE can be used to improve the user experience and enable the full quality of current audio systems by reducing the effects of the noisy automotive environment on the signals.

  8. Audio mastering

Audio mastering is still considered an ‘art’ and the prerogative of pro audio engineers. Normal users can upload an example track of their liking (possibly obtained from similar musical content); MPAI-CAE analyzes it, extracts key features and generates a master track that ‘sounds like’ the example track, starting from the non-mastered track. It is also possible to specify the desired style without an example, and the original track will be adjusted accordingly.

More details on MPAI-CAE are found in [2,7].

2.2       Integrative Genomic/Sensor Analysis (MPAI-GSA)

Most experiments in quantitative genomics consist of a setup whereby a small amount of metadata – observable clinical scores or outcomes, desirable traits, observed behaviour – is correlated with, or modelled from, a set of data-rich sources. Such sources can be:

  1. Biological experiments – typically sequencing or proteomics/metabolomics data
  2. Sensor data – coming from images, movement trackers, etc.

All these data-rich sources share the following properties:

  1. They produce very large amounts of “primary” data as output
  2. They need “primary”, experiment-dependent analysis in order to project the primary data (1) onto a single point in a “secondary”, processed space with a high dimensionality – typically a vector of thousands of values
  3. The resulting vectors, one for each experiment, are then fed to some machine or statistical learning framework, which correlates such high-dimensional data with the low-dimensional metadata available for the experiment. The typical purpose is to either model the high-dimensional data in order to produce a mechanistic explanation for the metadata, or to produce a predictor for the metadata out of the high-dimensional data.
  4. Although that is not typically necessary, in some circumstances it might be useful for the statistical or machine learning algorithm to be able to go back to the primary data (1), in order to extract more detailed information than what is available as a summary in the processed high-dimensional vectors produced in (2).
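The correlation step (3) above can be sketched end-to-end with a deliberately simple stand-in for the “machine or statistical learning framework”: a nearest-centroid predictor that relates secondary-space vectors to low-dimensional metadata labels. All names and data are illustrative.

```python
def centroid(vectors):
    """Mean of a list of equal-length secondary-space vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(secondary, metadata):
    """Nearest-centroid predictor: one centroid per metadata label.
    A deliberately simple stand-in for the learning framework of step 3."""
    by_label = {}
    for vec, label in zip(secondary, metadata):
        by_label.setdefault(label, []).append(vec)
    return {label: centroid(vs) for label, vs in by_label.items()}

def predict(model, vec):
    """Assign a new experiment's secondary vector to the closest label."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist2(model[label], vec))
```

In the framework envisaged by MPAI-GSA, such a block would be one of several pre-defined, replaceable algorithms, with the option of reaching back to the primary data when the summary vectors are not informative enough.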

It would be extremely beneficial to provide a uniform framework to:

  1. Represent the results of such complex, data-rich, experiments, and
  2. Specify the way the input data is processed by the statistical or machine learning stage

Although the structure above is common to a number of experimental setups, it is conceptual and never made explicit. Each “primary” data source can consist of heterogeneous information represented in a variety of formats, especially when genomics experiments are considered, and the same source of information is usually represented in different ways depending on the analysis stage – primary or secondary. That results in data processing workflows that are ad hoc – two experiments combining different sets of sources will require two different workflows, each able to process a specific combination of input/output formats. Typically, such workflows will also be laid out as a sequence of completely separated stages of analysis, which makes it very difficult for the machine or statistical learning stage to go back to the primary data when that would be necessary.

MPAI-GSA aims to create an explicit, general and reusable framework to express as many different types of complex integrative experiments as possible. That would provide (I) a compressed, optimized and space-efficient way of storing large integrative experiments, but also (II) the possibility of specifying the AI-based analysis of such data (and, possibly, primary analysis too) in terms of a sequence of pre-defined standardized algorithms. Such computational blocks might be partly general and prior-art (such as standard statistical algorithms to perform dimensional reduction) and partly novel and problem-oriented, possibly provided by commercial partners. That would create a healthy arena whereby free and commercial methods could be combined in a number of application-specific “processing apps”, thus generating a market and fostering innovation. A large number of actors would ultimately benefit from the MPAI-GSA standard – researchers performing complex experiments, companies providing medical and commercial services based on data-rich quantitative technologies, and the final users who would use instances of the computational framework as deployed “apps”.

The following examples describe typical uses of the MPAI-GSA framework.

  1. Medical genomics – sequencing and variant-calling workflows

In this use case, one would like to correlate a list of genomic variants present in humans and having a known effect on health (metadata) with the variants present in a specific individual (secondary data). Such variants are derived from sequencing data for the individual (primary data) on which some variant calling workflow has been applied. Notably, there is an increasing number of companies doing just that as their core business. Their products differ by: the choice of the primary processing workflow (how to call variants from the sequencing data for the individual); the choice of the machine learning analysis (how to establish the clinical importance of the variants found); and the choice of metadata (which databases of variants with known clinical effect to use). It would be easy to re-deploy their workflows as MPAI-GSA applications.
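The secondary-data step of this use case reduces to cross-referencing an individual's called variants against a database of variants with known clinical effect. A minimal sketch, with invented variant identifiers and effect labels, could look like:

```python
def annotate_variants(individual_variants, known_effects):
    """Cross-reference an individual's called variants (secondary data)
    against a database of variants with known clinical effect (metadata).
    Identifiers and labels here are purely illustrative."""
    return {v: known_effects[v] for v in individual_variants if v in known_effects}
```

A commercial MPAI-GSA application would differ exactly where the text says products differ today: in the variant-calling workflow producing `individual_variants`, in the learning stage ranking the matches, and in the choice of `known_effects` database.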

  2. Integrative analysis of ‘omics datasets

In this use case, one would like to correlate some macroscopic variable observed during a biological process (e.g. the reaction to a drug or a vaccine – metadata) with changes in tens of thousands of cell markers (gene expression estimated from RNA; amount of proteins present in the cell – secondary data) measured through a combination of different high-throughput quantitative biological experiments (primary data – for instance, RNA-sequencing, ChIP-sequencing, mass spectrometry). This is a typical application in research environments (medical, veterinary and agricultural). Both primary and secondary analysis are performed with a variety of methods depending on the institution and the provider of bioinformatics services. Reformulating such methods in terms of MPAI-GSA would help reproducibility and standardisation immensely. It would also provide researchers with a compact way to store their heterogeneous data.

  3. Single-cell RNA-sequencing

Similar to the previous one, but in this case at least one of the primary data sources is RNA-sequencing performed at the same time on a number of different cells (typically hundreds of thousands) – while bulk RNA-sequencing mixes together RNAs coming from several thousands of different cells, in single-cell RNA-sequencing the RNAs coming from each different cell are separately barcoded, and hence distinguishable. The DNA barcodes for each cell would be the metadata here. Cells can then be clustered together according to the expression patterns present in the secondary data (vectors of expression values for all the species of RNA present in the cell) and, if sufficient metadata is present, clusters of expression patterns can be associated with different types/lineages of cells – the technique is typically used to study tissue differentiation. A number of complex algorithms exist to perform primary analysis (statistical uncertainty in single-cell RNA-sequencing is much greater than in bulk RNA-sequencing) and, in particular, secondary AI-based clustering/analysis. Again, expressing those algorithms in terms of MPAI-GSA would make them much easier to describe and much more comparable. External commercial providers might provide researchers with clever modules to do all or part of the machine learning analysis.
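
The clustering step described above can be sketched with a toy nearest-centroid assignment of per-cell expression vectors (the secondary data), keyed by cell barcode (the metadata). Real pipelines use far richer statistical and AI models; everything below is invented for illustration.

```python
def assign_clusters(cells, centroids):
    """Assign each barcoded cell's expression vector to the nearest
    centroid (a stand-in cell type/lineage)."""
    def dist2(a, b):
        # squared Euclidean distance between two expression vectors
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return {
        barcode: min(centroids, key=lambda c: dist2(vec, centroids[c]))
        for barcode, vec in cells.items()
    }

# Toy data: barcodes mapped to 2-gene expression vectors; two cell types.
cells = {"AAAC": (9.0, 0.5), "TTTG": (8.5, 1.0), "CCGA": (0.2, 7.9)}
centroids = {"type-1": (9.0, 1.0), "type-2": (0.0, 8.0)}

print(assign_clusters(cells, centroids))
# {'AAAC': 'type-1', 'TTTG': 'type-1', 'CCGA': 'type-2'}
```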

  4. Experiments correlating genomics with animal behaviour

In this use case, one wants to correlate animal behaviour (typically of lab mice) with their genetic profile (case of knock-down mice) or the previous administration of drugs (typically encountered in neurobiology). Hence primary data would be video data from cameras tracking the animal; secondary data would be processed video data in the form of primitives describing the animal’s movement, well-being, activity, weight, etc.; and metadata would be a description of the genetic background of the animal (for instance, the name of the gene which has been deactivated) or a timeline with the list and amount of drugs which have been administered to the animal. Again, there are several companies providing software tools to perform some or all of such analysis tasks – they might be easily reformulated in terms of MPAI-GSA applications.

  5. Spatial metabolomics

One of the most data-intensive biological protocols nowadays is spatial proteomics, whereby in-situ mass-spec/metabolomics techniques are applied to “pixels”/“voxels” of a 2D/3D biological sample in order to obtain proteomics data at different locations in the sample, typically with sub-cellular resolution. This information can also be correlated with pictures/tomograms of the sample, to obtain phenotypical information about the nature of the pixel/voxel. The combined results are typically analysed with AI-based techniques. So primary data would be unprocessed metabolomics data and images, secondary data would be processed metabolomics data and cellular features extracted from the images, and metadata would be information about the sample (source, original placement within the body, etc.). Currently, the processing of spatial metabolomics data is done through complex pipelines, typically in the cloud – having these as MPAI-GSA applications would be beneficial to both the researchers and potential providers of computing services.

  6. Smart farming

During the past few years, there has been an increasing interest in data-rich techniques to optimise livestock and crop production (so-called “smart farming”). The range of techniques is constantly expanding, but the main ideas are to combine molecular techniques (mainly high-throughput sequencing and derived protocols, such as RNA-sequencing, ChIP-sequencing, HiC, etc.; and mass-spectrometry – as per the ‘omics case at point 2) and monitoring by images (growth rate under different conditions, sensor data, satellite-based imaging) for both livestock species and crops. So this use case can be seen as a combination of cases 2 and 4. Primary sources would be genomic data and images; secondary data would be vectors of values for a number of genomic tags and features (growth rate, weight, height) extracted from images; metadata would be information about environmental conditions, spatial position, etc. A growing number of companies are offering services in this area – again, having the possibility of deploying them as MPAI-GSA applications would open up a large arena where academic or commercial providers would be able to meet the needs of a number of customers in a well-defined way.

More details on MPAI-GSA are found in [3,8].

2.3       AI-Enhanced Video Coding (MPAI-EVC)

MPAI has carried out an investigation of the performance improvement of AI-enhanced HEVC, AI-enhanced VVC and end-to-end AI-based video coding. Preliminary evidence from the investigation suggests that, by replacing and/or enhancing selected existing HEVC and VVC coding tools with AI-based tools, the objectively measured compression performance may be improved by up to around 30%. These results were obtained by combining somewhat heterogeneous data from experiments reported in the literature.

The reported initial results, however, do indicate that AI can bring significant improvements to existing video coding technologies. Therefore, MPAI is investigating the feasibility of improving the coding efficiency by about 25% to 50% over an existing standard with an acceptable increase in complexity using technologies reported in the literature. If the investigation is successful, MPAI will develop a standard called MPAI AI-Enhanced Video Coding (MPAI-EVC).

The investigation showed that encouraging results can be obtained from new types of AI-based coding schemes – called end-to-end. These schemes, while promising, still need substantially more research.

MPAI is also aware of ongoing research targeted at hybrid schemes where AI-based technologies are added to the existing codecs as an enhancement layer without making any change to the base-layer codec itself, thus providing backward-compatible solutions.

At this stage, MPAI conducts two parallel activities:

  1. A collaborative activity targeting a scientifically sound assessment of the improvements achieved by state-of-the-art research. To the extent possible, this should be done with the participation of the authors of major improvements.
  2. A thorough development of the requirements that MPAI-EVC should satisfy.

The choice of the starting point – the existing codec from which an AI-enhanced video codec would be developed – is an issue, because high-performance video codecs typically have many standard-essential patent (SEP) holders. They should all be convinced to allow MPAI to extend the selected starting point with AI-based tools that satisfy the – still to be defined – MPAI-EVC framework licence. As the result of such an endeavour is not guaranteed, MPAI is planning to pick Essential Video Coding (MPEG-5 EVC) as the starting point. The EVC Baseline profile is reported not to be encumbered by IPR. Additionally, the EVC Main profile is reported to have a limited number of SEP holders. As an EVC patent holder has announced the release of a full implementation of EVC as Open Source Software (OSS), the choice of EVC as the starting point would also make available a working code base. The choice between the EVC Baseline and Main profiles is TBD.

The following figures represent the block diagrams of 3 potential configurations to be adopted by the MPAI-EVC standard.

The green circles of Figure 1 indicate traditional video coding tools that could be enhanced or replaced by AI-enabled tools. This will be taken as the basis of the collaborative activity mentioned above.

Figure 1 – A reference diagram for the Horizontal Hybrid approach

In Figure 2 a traditional video codec is enhanced by an AI Enhancement codec.

Figure 2 – A reference diagram for the Vertical Hybrid approach

More details on MPAI-EVC are found in [4,10,11,12].

2.4       Server-based Predictive Multiplayer Gaming (MPAI-SPG)

There are two basic approaches to online gaming:

  1. Traditional online gaming: the server receives a sequence of data from the client(s) and sends an input-dependent sequence of data to the client(s) which use the data to create appropriate video frames.
  2. Cloud gaming: the server receives a sequence of data from the client(s) and sends an input-dependent sequence of video frames to the client(s). In a cloud gaming scenario, all clients run in the cloud on virtual machines.

If the connection suffers from temporarily high latency or packet loss (in the following called network disruption), two strategies may be used to make up for the missing information:

  1. Client-side prediction when information from the client does not reach the server or from the server does not reach the client
  2. Server-side prediction when information from the client does not reach the server

In a game, a finite-state Game machine calculates the logic of the game from the inputs received from the game controller. The client reacts to such user input before the server has acknowledged the input and updated its Game state. If an updated Game state from the server is missing, the client predicts the Game state locally and produces a video frame that is potentially wrong. When the right information from the server reaches the client, the client Game state is reconciled with the server Game state.
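
The client-side prediction and reconciliation mechanism described above can be sketched as follows, with a toy one-dimensional Game state. All names are hypothetical and illustrative only, not part of MPAI-SPG.

```python
class Client:
    def __init__(self):
        self.state = 0          # toy Game state: a 1-D position
        self.pending = []       # inputs not yet acknowledged by the server

    def apply_input(self, move):
        """Client-side prediction: react before the server acknowledges."""
        self.pending.append(move)
        self.state += move      # possibly wrong until reconciled

    def reconcile(self, server_state, acked):
        """Rewind to the authoritative server state, then replay the
        inputs the server has not yet acknowledged."""
        self.pending = self.pending[acked:]
        self.state = server_state + sum(self.pending)

c = Client()
c.apply_input(+1)
c.apply_input(+1)                      # predicted state is now 2
c.reconcile(server_state=1, acked=1)   # server has applied only the first move
print(c.state)  # 2 -> the prediction is confirmed after replaying pending input
```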

For example, in a first-person shooter game where fast-paced shooting takes place, player A aims their weapon at the position that player B held some milliseconds in the past. Therefore, when player A fires, player B is long gone. To remedy this, when the server gets the information regarding the shot (which is precise because it carries timestamps), it knows exactly where player A's weapon was aiming and the past position of player B. The server thus processes the shot at that past time, reconciles all the states and updates the clients. In spite of the belated reconciliation, however, the situation is not satisfactory because player B was shot in the past and, in the few milliseconds of difference, player B may have taken cover.
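
The timestamp-based server-side rewind described above can be sketched as follows; the position history and time scale are invented for illustration.

```python
def position_at(history, t):
    """Return the latest recorded position at or before timestamp t."""
    return max((ts, pos) for ts, pos in history if ts <= t)[1]

def process_shot(shooter_aim, target_history, shot_time):
    """Judge the shot against the target's position at the (past) time
    of firing, as carried by the shot's timestamp."""
    return shooter_aim == position_at(target_history, shot_time)

# Player B's position history: (timestamp in ms, position).
history = [(0, 10), (50, 12), (100, 15)]

# Player A fired at t=60 ms, aiming where B was then (position 12),
# even though B has since moved to position 15.
print(process_shot(shooter_aim=12, target_history=history, shot_time=60))  # True
```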

AI and ML can provide more efficient solutions than those available today to compensate for network disruption and improve the user experience in both traditional online gaming and cloud gaming.

An AI machine can collect data from multiple clients, perform a much better prediction of each move of each participant and perform sophisticated reconciliations. Information from the game engine – inputs from clients and reconciliation info – can be used in the video encoding process to improve the encoding (e.g., motion estimation), thus making encoding at higher frame rates possible for a high-quality gaming experience even on low-performing hardware.

Here are two examples of known games that illustrate how MPAI standards can improve feasibility and user experience.

Example 1: Racing games

During an online racing game, players can see lagging reactions to their moves and an overall low-quality presentation because of network disruption. Usually, the information predicted on the screen by a client is wrong if the client information cannot reach the server and the other clients involved in the online game in time. In a car racing game, the player may see at time t0 a vehicle going straight into the wall when it is reaching a curve. At time t1, some seconds later, the same vehicle is “teleported” to its actual position, yielding an awful player experience.

AI can mitigate this issue, offering a better game experience to players. Data from the different online games are collected and used to predict a meaningful path or the correct behaviour when information does not reach the clients in time.

Example 2: Zombie games

In some traditional online video games, specific information is displayed differently on different clients because it is too onerous to compute all the outcomes of players’ actions in a physically consistent way. An example is provided by zombie games: the result of killing hordes of zombies in each client is visually different from client to client.

A server-based predictive input can support the online architecture and enable it to provide an equal outcome on all clients. In a massive multiplayer hack&slash game, the result of the different combats among players yields the same live visual online experience to each player.

More details on MPAI-SPG are found in [5].

2.5       Multi-Modal Conversation (MPAI-MMC)

A useful application of AI is the conversational partner, which provides the user with information, entertains, chats and answers questions through a speech interface. However, an application should include more than just a speech interface to provide a better service to the user. For example, an emotion recognizer and a gesture interpreter are needed for better multi-modal interfaces.

Multi-modal conversation (MPAI-MMC) aims to enable human-machine conversation that emulates human-human conversation in completeness and intensity by using AI.

An example of MMC is the conversation between a human user and a computer/robot, as in the following list. The input from the user can be voice, text or image, or a combination of them. Considering the emotion of the human user, MMC will output responses in text, speech or music, depending on the user's needs.

  • Chats: “I am bored. What should I do now?” – “You look tired. Why don’t you take a walk?”
  • Question Answering: “Who is the famous artist in Barcelona?” – “Do you mean Gaudi?”
  • Information Request: “What’s the weather today?” – “It is a little cloudy and cold.”
  • Action Request: “Play some classical music, please” – “OK. Do you like Brahms?”
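
As an illustration only, the selection of a response type conditioned on the user's utterance and estimated emotion could look like the toy dispatcher below. The rules and names are invented for this sketch and are in no way part of MPAI-MMC; voice or image inputs would first pass through recognition modules.

```python
def respond(text, emotion):
    """Pick a (modality, content) reply from a text utterance and an
    emotion label, mimicking the example dialogues above."""
    if text.endswith("?"):
        return ("speech", "Do you mean Gaudi?")          # question answering
    if emotion == "bored":
        return ("speech", "Why don't you take a walk?")  # chat
    return ("music", "classical")                        # action request

print(respond("Who is the famous artist in Barcelona?", emotion="neutral"))
# ('speech', 'Do you mean Gaudi?')
```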

More details on MPAI-MMC are found in [6].

2.6       Compression and Understanding of Industrial data (MPAI-CUI)

Most economic organizations, e.g., companies, produce large quantities of data, often because these are required by regulation. Users of these data may be the company itself, or Fintech and Insurtech services that need to access the flow of company data to assess and monitor financial and organizational performance, as well as the impact of vertical risks (e.g., cyber, seismic, etc.).

The sheer amount of data that needs to be exchanged is an issue. Analysis of those data by humans is typically onerous and may miss vitally important information. Artificial intelligence (AI) may help reduce the amount of data with a controlled loss of information and extract the most relevant information from the data. AI is considered the most promising means to achieve this goal.

Unfortunately, the syntax and semantics of the data flow are highly dependent on who has produced the data. The format of the data is typically a text file with a structure not designed for indexing, search and extraction. Therefore, in order to apply AI technologies to meaningfully reduce the data flow, it is necessary to standardize the formats of the components of the data flow and make the data “AI friendly”.

Recent regulations impose constant monitoring (ideally monthly). Thus, there is the possibility of having similar blocks of data in temporally consecutive sequences of data.
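
As an illustration of how such temporal redundancy between consecutive reports could be exploited, the toy delta encoding below keeps only the blocks that changed since the previous report. The scheme and data are invented for this sketch and are not part of MPAI-CUI.

```python
def delta(prev, curr):
    """Keep only the blocks that changed or appeared since the last report."""
    return {k: v for k, v in curr.items() if prev.get(k) != v}

# Two consecutive monthly reports sharing most of their blocks.
march = {"revenue": 100, "employees": 40, "address": "Via Roma 1"}
april = {"revenue": 110, "employees": 40, "address": "Via Roma 1"}

print(delta(march, april))  # {'revenue': 110} -- the unchanged blocks are dropped
```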

The company generating the data flow may need to perform compression for its own needs (e.g., identifying core and non-core data). Subsequent entities may perform further data compression.

In general, compressed data should allow for easy data search and extraction.

MPAI-CUI may be used in a variety of contexts

  1. To support the company's board in deploying efficient strategies. A company can analyse its financial performance, identifying possible clues to a crisis or risk of bankruptcy years in advance. It may help the board of directors and decision-makers make the proper decisions to avoid these situations, conduct what-if analyses, and devise efficient strategies.
  2. To assess the financial health of companies that apply for funds/financial help. A financial institution that receives a request for financial help from a troubled company can access its financial and organizational data and make an AI-based assessment of that company, as well as a prediction of its future performance. This helps the financial institution take the right decision in funding or not that company, having a broad vision of its situation.
  3. To assess risk in different fields considering non-core data (e.g., non-financial data), through accurate and targeted sharing of core and non-core data ranging from financial and organizational information to other types of risks that affect business continuity (e.g., environmental, seismic, infrastructure and cyber).
  4. To analyse the effects of disruptions on the national economy, e.g., performance evaluation by pre/post- pandemic analysis.

3        Architecture

The normative MPAI-AIF architecture enables the creation and automation of mixed ML-AI-DP processing and inference workflows at scale for the use cases considered above. It includes six basic normative elements of the Architecture, called Components, addressing different modalities of operation – AI, Machine Learning (ML) and Data Processing (DP) – data pipeline jungles and computing resource allocations, including the constrained hardware scenarios of edge AI devices.

The normative reference diagram of MPAI-AIF is given in the following figure, where the APIs between different Components at different levels are shown.

Figure 3 – Proposed normative MPAI-AIF Architecture

  1. Management and Control

Management concerns the activation/deactivation/suspension of AIMs, while Control supports complex application scenarios.

Management and Control handles both simple orchestration tasks (e.g., the execution of a script) and much more complex tasks with a topology of networked AIMs that can be synchronised according to a given time base, including full ML life cycles.

  2. Execution

The environment where AIMs operate. It is interfaced with Management and Control and with Communication and Storage. It receives external inputs and produces the requested outputs both of which are application specific.

  3. AI Modules (AIM)

AIMs are units comprising at least the following three functions:

  1. The processing element (ML or traditional DP)
  2. Interface to Communication and Storage
  3. Input and output interfaces (function specific)

AIMs can implement auto-configuration or reconfiguration of their ML-based computational models.
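
A minimal sketch of an AIM exposing the three functions listed above is shown below. The class shape and names are hypothetical and non-normative; they merely illustrate how a processing element, a storage interface and function-specific I/O interfaces fit together.

```python
class AIM:
    def __init__(self, name, fn, storage):
        self.name = name
        self.fn = fn              # 1. processing element (ML or traditional DP)
        self.storage = storage    # 2. interface to Communication and Storage

    def run(self, *inputs):       # 3. function-specific input/output interface
        out = self.fn(*inputs)
        self.storage[self.name] = out   # e.g., persist intermediary results
        return out

# Toy usage: a DP-style AIM that drops non-positive samples.
storage = {}
denoise = AIM("denoise", lambda x: [v for v in x if v > 0], storage)
print(denoise.run([3, -1, 2]))  # [3, 2]
```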

  4. Communication

Communication is required in several cases and can be implemented accordingly, e.g. by means of a service bus. Components can communicate among themselves and with outputs and Storage.

The Management and Control API implements one- and two-way signalling for computational workflow initialisation and control.

  5. Storage

Storage encompasses traditional storage and refers to a variety of data types, e.g.:

  1. Inputs and outputs of the individual AIMs
  2. Data from the AIM’s state, e.g. with respect to traditional and continuous learning
  3. Data from the AIM’s intermediary results
  4. Shared data among AIMs
  5. Information used by Management and Control.
  6. Access

Access represents the access to static or slowly changing data that are required by the application such as domain knowledge data, data models, etc.

4        Requirements

4.1       Component requirements

  1. The MPAI-AIF standard shall include specifications of the interfaces of 6 Components
    1. Management and Control
    2. Execution
    3. AI Modules (AIM)
    4. Communication
    5. Storage
    6. Access
  2. MPAI-AIF shall support configurations where Components are distributed in the cloud and at the edge
  3. Management and Control shall enable operations on the general Machine Learning and/or traditional Data Processing life cycle of
    1. Single AIMs, e.g. instantiation-configuration-removal, internal state dumping/retrieval, start-suspend-stop, train-retrain-update, enforcement of resource limits
    2. Combinations of AIMs, e.g. initialisation of the overall computational model, instan­tiation-removal-configuration of AIMs, manual, automatic, dynamic and adaptive configuration of interfaces with Components.
  4. Management and Control shall support
    1. Architectures that allow application-scenario dependent hierarchical execution of workflows, i.e. a combination of AIMs into computational graphs
    2. Supervised, unsupervised and reinforcement-based learning paradigms
    3. Computational graphs, such as a Directed Acyclic Graph (DAG) as a minimum
    4. Initialisation of signalling patterns, communication and security policies between AIMs
  5. Storage shall support protocols to specify application-dependent requirements such as access time, retention, read/write throughput
  6. Access shall provide
    1. Static or slowly changing data with standard formats
    2. Data with proprietary formats
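
As an illustration of the computational-graph requirement above, a workflow of AIMs arranged as a Directed Acyclic Graph can be executed in topological order, each AIM running once all of its predecessors have produced their outputs. This is a non-normative sketch using Python's standard graphlib; all AIM names are invented.

```python
from graphlib import TopologicalSorter

def run_workflow(graph, aims, inputs):
    """Run each AIM after all of its predecessors, passing results along.
    `graph` maps each node to the set of nodes it depends on."""
    results = dict(inputs)
    for node in TopologicalSorter(graph).static_order():
        if node not in results:                 # not an external input
            preds = graph.get(node, ())
            results[node] = aims[node](*(results[p] for p in preds))
    return results

# A three-node DAG: 'mix' depends on 'left' and 'right', which share 'src'.
graph = {"left": {"src"}, "right": {"src"}, "mix": {"left", "right"}}
aims = {"left": lambda x: x + 1, "right": lambda x: x * 2,
        "mix": lambda a, b: a + b}

print(run_workflow(graph, aims, {"src": 3})["mix"])  # 10
```

Management and Control would additionally handle instantiation, configuration and resource limits around such an execution loop.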

4.2       Systems requirements

The following requirements are not intended to apply to the MPAI-AIF standard, but should be used for assessing technologies:

  1. Management and Control shall support asynchronous and time-based synchronous operation depending on application
  2. The Architecture shall support dynamic update of the ML models with seamless or minimal impact on its operation
  3. ML-based AIMs shall support time sharing operation enabling use of the same ML-based AIM in multiple concurrent applications
  4. AIMs may be aggregations of AIMs exposing new interfaces
  5. Complexity and performance shall be scalable to cope with different scenarios, e.g. from small MCUs to complex distributed systems
  6. The Architecture shall support workflows of a mixture of AI/ML-based and DP technology-based AIMs.

4.3       General requirements

The MPAI-AIF standard may include profiles for specific (sets of) requirements

5        Conclusions

When the definition of the MPAI-AIF Framework Licence is completed, MPAI will issue a Call for Technologies that support the AI Framework with the requirements given in this document.

Respondents will be requested to state in their submissions their intention to adhere to the Framework Licence developed for MPAI-AIF when licensing their technologies, if these have been included in the MPAI-AIF standard.

The MPAI-AIF Framework Licence will be developed, as for all other MPAI Framework Licences, in compliance with the generally accepted principles of competition law.

6        References

[1] MPAI Application Note#4 – MPAI-AIF Artificial Intelligence Framework

[2] MPAI Application Note#1 R1 – MPAI-CAE Context-based Audio Enhancement

[3] MPAI Application Note#2 R1 – MPAI-GSA Integrative Genomic/Sensor Analysis

[4] MPAI Application Note#3 R1 – MPAI-EVC AI-Enhanced Video Coding

[5] MPAI Application Note#5 R1 – MPAI-SPG Server-based Predictive Multiplayer Gaming

[6] MPAI Application Note#6 R1 – MPAI-MMC Multi-Modal Conversation

[7] MPAI-CAE Functional Requirements work programme

[8] MPAI-GSA Functional Requirements work programme

[9] MPAI-MMC Functional Requirements work programme

[10] MPAI-EVC Use Cases and Requirements

[11] Collaborative Evidence Conditions for MPAI-EVC Evidence Project R1

[12] Operational Guidelines for MPAI-EVC Evidence Project


MPAI-AIF V2 Call for Technologies

The MPAI-AIF V2 Call for Technologies is also available as a Word document

1       Introduction

2       How to submit a response

3       Evaluation Criteria and Procedure

4       Expected development timeline

5       References

Annex A: Information Form

Annex B: Evaluation Sheet

Annex C: Requirements check list

Annex D: APIs that may require specific testing

Annex E: Mandatory text in responses

1        Introduction

Moving Picture, Audio and Data Coding by Artificial Intelligence (MPAI) is an international non-profit organisation with the mission of developing standards for Artificial Intelligence (AI)-enabled digital data coding and technologies that facilitate the integration of data coding components into ICT systems [1]. With the mechanism of Framework Licences, MPAI intends to facilitate the creation of a patent pool that relies on the clear IPR licensing frameworks established by the Framework Licences [2].

AI Framework Version 1 (MPAI-AIF V1) is an MPAI System-oriented standard specifying an environment able to execute AI applications called AI Workflows (AIW) composed of AI Modules (AIM).

MPAI Application Standards normatively specify the Function of the AIW and its AIMs, the Syntax and Semantics of the input and output Data of the AIW and its AIMs, and the Connections between and among the AIMs of an AIW. MPAI Application Standards do not specify the internal architecture of the AIMs, which may be based on AI or data processing technologies, and be implemented in software, hardware or mixed software and hardware technologies.

MPAI-AIF V1 assumes that the execution environment is in a Trusted Zone. However, it does not specify how an environment with the required attributes can be realised. Therefore, MPAI has developed a set of Functional Requirements whose satisfaction can support developers with specific security requirements.

This MPAI-AIF V2 Call for Technologies (CfT) invites any party owning technologies that satisfy the MPAI-AIF V2 Use Cases and Functional Requirements [5], and who is willing to release those technologies to any third party according to the MPAI-AIF V2 Framework Licence [6], to propose them for inclusion, with possible modifications, in the MPAI-AIF V2 standard if selected by MPAI. Any respondent who is not an MPAI member and has their technologies accepted shall join MPAI or lose the opportunity to have their technologies included.

The MPAI-AIF V2 Technical Specification will be developed using technologies that:

  1. Are part of an already published MPAI standard, or
  2. Satisfy the following mandatory requirements:
    1. To be part of responses to this Call containing acceptance of the MPAI-AIF V2 Framework Licence [6].
    2. To satisfy the MPAI-AIF V2 Use Cases and Functional Requirements [5]. In the future, MPAI may decide to further extend MPAI-AIF to support new functionalities as a part of this MPAI-AIF V2 or as a future extension of it.
    3. To use, where feasible and desirable, the technologies in the MPAI-AIF V1.1 standard [4] that satisfy MPAI-AIF V2 Use Cases and Functional Requirements [5] or the technologies specified in other relevant MPAI standards published in [1].

Therefore, the scope of this Call for Technologies includes technologies satisfying the requirements identified in [5].

However, respondents are welcome to additionally do one or more of the following:

  1. Make comments on any technology or architectural component identified in [5].
  2. Propose to:
    1. Add or remove input/output signals to the identified AIMs:
      1. Justifying the changes.
      2. Identifying the data formats of the new input/output data.
    2. Partition the AIMs in an AIW implementing the cases, providing:
      1. Arguments in support of the proposed partitioning.
      2. Detailed specifications of the input and output data of the proposed new AIMs.
    3. Describe new, fully specified Use Cases as done in [5].
  3. In general, submit motivated proposals of technologies not included in [5] for inclusion in the MPAI-AIF V2 standard, if they satisfy the Framework Licence [6].

All parties who believe they have relevant technologies satisfying all or most of the requirements of [5] are invited to submit proposals for consideration by MPAI. MPAI membership is not a prerequisite for responding to this CfT. However, proponents should be aware that, if their proposal or part thereof is accepted for inclusion in the MPAI-AIF V2 standard, they will be requested to immediately join MPAI, or lose the opportunity to have their accepted technologies included in the standard.

MPAI will select the most suitable technologies based on their technical merits for inclusion in the MPAI-AIF V2 standard, after possible adaptation/modification. However, MPAI is not obligated, by virtue of this CfT, to select a particular technology or to select any of the proposed technologies if those submitted are found inadequate.

Submissions shall be sent to the MPAI secretariat (secretariat@mpai.community) by 2022/10/24 23:59 UTC. The secretariat will acknowledge receipt of the submission via email. Submissions will be reviewed according to the schedule that the 25th MPAI General Assembly (MPAI-25) will define at its online meeting on 2022/10/26. Non-MPAI members who have made a submission and wish to attend the said review sessions should contact the MPAI secretariat (secretariat@mpai.community).

2        How to submit a response

Those planning to respond to this CfT are:

  1. Advised that an online event was held on 2022/07/11 at 15:00 UTC where the MPAI-AIF V2 Functional Requirements were presented. The recording of the presentation is available [7].
  2. Requested to communicate their intention to respond to this MPAI-AIF V2 CfT, with an initial version of the form in Annex A, to the MPAI secretariat (secretariat@mpai.community) by 2022/09/13. Submission of Annex A helps MPAI properly plan the review of submissions. However, those who have submitted an Annex A are not required to make a submission, and those who have not submitted Annex A are not precluded from making a submission.
  3. Encouraged to visit regularly the Call for Technologies web page where additional relevant information will be posted.

Responses to this MPAI-AIF V2 CfT shall include the mandatory, and may include the optional, elements of Table 1:

Table 1 – Optional and mandatory elements of a response

Item Status
Detailed documentation describing the proposed technologies mandatory
The final version of Annex A mandatory
The text of Annex B duly filled out with the table indicating which Functional Requirements identified in MPAI N768 [5] are satisfied. If some of the Functional Requirements of a Use Case are not satisfied, this should be explained. mandatory
Comments on the completeness and appropriateness of the MPAI-AIF V2 Functional Requirements and any motivated suggestion to amend and/or extend those Requirements. optional
A preliminary demonstration, with a detailed document describing it. optional
Any other additional relevant information that may help evaluate the submission. optional
The text of Annex E. mandatory

Respondents are invited to take advantage of the check list of Annex C before submitting their response and filling out Annex A.

Respondents are required to present their submission at a teleconference meeting that will be duly announced to submitters by the MPAI Secretariat. If no presenter of a submission attends that meeting, the submission will be discarded.

Respondents are advised that, upon acceptance by MPAI of their submission in whole or in part for further evaluation, submitters shall:

  • Make available a working implementation, including source code – for use in the development of the MPAI-AIF V2 Reference Software and successive publication as an MPAI-AIF V2 Reference Software Implementation – before the technology is accepted for inclusion in the MPAI-AIF V2 standard. Software may be written in programming languages that can be compiled or interpreted. Hardware Description Language implementations are also accepted.
  • Immediately join MPAI if they are not MPAI members. If a non-MPAI member elects not to join, their submission will be discarded. Directions on how to join MPAI can be found online.

Further information on MPAI can be obtained from the MPAI website.

3        Evaluation Criteria and Procedure

Proposals will be assessed using the following process:

  1. An Evaluation Panel is created from:
    1. AIF-DC members in attendance.
    2. Non-MPAI members who are respondents.
    3. Non-respondent, non-MPAI-member experts invited in a consulting capacity.
  2. No one from 1.1-1.2 is denied membership in the Evaluation Panel.
  3. Respondents present their proposals.
  4. Evaluation Panel members ask questions.
  5. If required, subjective and/or objective tests are carried out as follows:
    1. The required tests are defined.
    2. The required tests are carried out.
    3. A report is produced.
  6. If required, at least 2 reviewers are appointed to review and report about specific points in a proposal.
  7. Evaluation panel members fill out Annex B for each proposal.
  8. Respondents respond to evaluations.
  9. Proposal evaluation report is produced.

4        Expected development timeline

Timeline of the CfT, deadlines and response evaluation:

Table 2 – Dates and deadlines

Step Date Time
Online presentation of MPAI-AIF V2 2022/07/11 15:00 UTC
Call for Technologies 2022/07/19 17:00 UTC
Notification of intention to submit proposal 2022/09/13 23:59 UTC
Submission deadline 2022/10/24 23:59 UTC
Start of response evaluation 2022/10/26 (MPAI-25) 14:00 UTC
MPAI-AIF V2 publication Spring 2023

Evaluation will be carried out during 2-hour sessions according to the calendar agreed at the time of MPAI-25.

5        References

  1. MPAI Standards Resources; https://mpai.community/standards/resources/.
  2. MPAI Patent Policy; https://mpai.community/about/the-mpai-patent-policy/.
  3. Governance of the MPAI Ecosystem (MPAI-GME); https://mpai.community/standards/resources/#GME.
  4. AI Framework (MPAI-AIF) V1.1; https://mpai.community/standards/resources/#AIF
  5. MPAI-AIF V2 Use Cases and Functional Requirements; https://mpai.community/standards/mpai-aif/use-cases-and-functional-requirements/mpai-aif-v2-use-cases-and-functional-requirements/.
  6. MPAI-AIF V2 Framework Licence; https://mpai.community/standards/mpai-aif/framework-licence/.
  7. Presentation of MPAI-AIF V2 Use Cases and Functional Requirements; https://platform.wim.tv/#/webtv/convenor/vod/0b55db63-3ef9-4e69-ab02-b08b5a6dec7c.

Annex A: Information Form

This information form is to be filled in by a Respondent to this MPAI-AIF V2 Call for Technologies.

  1. Title of the proposal
  2. Organisation: company name, position, e-mail of contact person
  3. What are the main functionalities of your proposal?
  4. Does your proposal provide or describe a formal specification and APIs?
  5. Will you provide a demonstration to show how your proposal meets the evaluation criteria?

Annex B: Evaluation Sheet

NB: This evaluation sheet will be filled out by members of the Evaluation Team.

Proposal title:

Main functionalities:

 Response summary: (a few lines)

Comments on relevance to the CfT (Requirements):

Comments on possible MPAI-AIF profiles[1]

Evaluation table:

Table 3 – Assessment of submission features

Note 1: The semantics of the submission features is provided by Table 4.
Note 2: Evaluation elements indicate the elements used by the evaluator in assessing the submission.
Note 3: Final Assessment indicates the ultimate assessment based on the Evaluation Elements.

 

Submission features Evaluation elements Final Assessment
Completeness of description

Understandability

Extensibility

Use of Standard Technology

Efficiency

Test cases

Maturity of reference implementation

Relative complexity

Support of MPAI use cases

Support of non-MPAI use cases

Content of the criteria table cells:

Evaluation facts should mention:

  • Not supported / partially supported / fully supported.
  • What supports these facts: submission/presentation/demo.
  • The summary of the facts themselves, e.g., very good in one way, but weak in another.

Final assessment should mention:

  • Possibilities to improve or add to the proposal, e.g., any missing or weak features.
  • How sure the evaluators are, i.e., evidence shown, very likely, very hard to tell, etc.
  • Global evaluation (Not Applicable / –– / – / + / ++)

 New Use Cases/Requirements Identified:

(Please describe)

  • Evaluation summary:
  • Main strong points, qualitatively:
  • Main weak points, qualitatively:
  • Overall evaluation: (0/1/2/3/4/5)

0: could not be evaluated

1: proposal is not relevant

2: proposal is relevant, but requires significantly more work

3: proposal is relevant, but with a few changes

4: proposal has some very good points, so it is a good candidate for standard

5: proposal is superior in its category, very strongly recommended for inclusion in standard

Additional remarks: (points of importance not covered above.)

The submission features in Table 3 are explained in the following Table 4.

Table 4 – Explanation of submission features

Submission features Criteria
Completeness of description Evaluators should

1.     Compare the list of requirements (Annex C of the CfT) with the submission.

2.     Check if respondents have described in sufficient detail which requirements their proposal addresses.

NB1: Completeness of a proposal for a Use Case is a merit because reviewers can assess how the components are integrated.

NB2: Submissions will be judged on the merit of what is proposed. An excellent submission on a single technology may be considered over a submission that is complete but has a less performing technology.

Understandability Evaluators should identify items that are demonstrably unclear (inconsistencies, sentences with dubious meaning, etc.)
Extensibility Evaluators should check if respondent has proposed extensions.

NB: Extensibility is the capability of the proposed solution to support functionalities that are not supported by current requirements.

Use of Standard Technology Evaluators should check whether new technologies are proposed where widely adopted technologies exist. If this is the case, the merit of the new technology shall be proved.
Efficiency Evaluators should assess power consumption, computational speed, computational complexity.
Test cases Evaluators should report whether a proposal contains suggestions for testing the technologies proposed.
Maturity of reference implementation Evaluators should assess the maturity of the proposal.

Note 1: Maturity is measured by the completeness, i.e., having all the necessary information and appropriate parts of the HW/SW implementation of the submission disclosed.

Note 2: If there are parts of the implementation that are not disclosed but demonstrated, they will be considered if and only if such components are replicable.

Relative complexity Evaluators should identify issues that would make it difficult to implement the proposal compared to the state of the art.
Support of MPAI-AIF use cases Evaluators should check how many use cases are supported by the submission.
Support of non-MPAI-AIF use cases Evaluators should check whether the technologies proposed can demonstrably be used in other, significantly different use cases.

Annex C: Requirements check list

Table 5 – List of technologies in MPAI-AIF Use Cases and Functional Requirements [5]

Requirements Response
  1. The AIF Components shall access a high-level, implementation-independent Trusted Services API to handle:
    a. Encryption Service. Y/N
    b. Attestation Service. Y/N
    c. Trusted Communication Service. Y/N
    d. Trusted AIM Storage Service including the following functionalities: Y/N
      i. Initialisation (secure and non-secure flash and RAM). Y/N
      ii. Read/Write. Y/N
      iii. De-initialisation. Y/N
    e. Trusted AIM Model Services including the following functionalities: Y/N
      i. Secure and non-secure Model Storage. Y/N
      ii. Model Update. Y/N
      iii. Model Validation. Y/N
    f. AIM Security Engine including the following functionalities: Y/N
      i. Model Encryption. Y/N
      ii. Model Signature. Y/N
      iii. Model Watermarking. Y/N
  2. The AIF Components shall be easily integrated with the above Services. Y/N
  3. The AIF Trusted Services shall be able to use hardware and OS security features already existing in the hardware and software of the environment in which the AIF is implemented. Y/N
  4. Application developers shall be able to select the application’s security either or both by:
    a. Level of security that includes a defined set of security features for each level. Y/N
    b. Developer-defined security, i.e., a level that includes a developer-defined set of security features. Y/N
  5. The specification of the AIF V2 Metadata shall be an extension of the AIF V1 Metadata, supporting security with either or both standardised and developer-defined levels. Y/N
  6. Submission of use cases and their respective threat models. Y/N
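As an illustration only, the Trusted Services of requirement 1 could be surfaced to AIF Components through an API along the following lines. All names (`TrustedServicesAPI`, `TrustedAIMStorage`) and signatures are hypothetical assumptions made for this sketch, not part of the MPAI-AIF specification, and the XOR "encryption" is a stand-in for a real cryptographic service backed by a hardware root of trust.

```python
# Hypothetical sketch of a Trusted Services API surface mirroring
# requirements 1.a-1.f above. Names and signatures are illustrative,
# NOT taken from the MPAI-AIF specification.
from dataclasses import dataclass, field


@dataclass
class TrustedAIMStorage:
    """Requirement 1.d: initialisation, read/write, de-initialisation
    over secure and non-secure storage areas."""
    secure: dict = field(default_factory=dict)
    non_secure: dict = field(default_factory=dict)
    initialised: bool = False

    def initialise(self) -> None:
        self.initialised = True

    def write(self, key: str, data: bytes, secure: bool = True) -> None:
        assert self.initialised, "storage must be initialised first"
        (self.secure if secure else self.non_secure)[key] = data

    def read(self, key: str, secure: bool = True) -> bytes:
        return (self.secure if secure else self.non_secure)[key]

    def deinitialise(self) -> None:
        self.secure.clear()
        self.non_secure.clear()
        self.initialised = False


class TrustedServicesAPI:
    """Facade exposing the services of requirement 1 to AIF Components."""

    def __init__(self) -> None:
        self.storage = TrustedAIMStorage()

    def encrypt(self, model: bytes, key: bytes) -> bytes:
        # Requirements 1.a / 1.f.i: placeholder XOR cipher, for illustration only.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(model))

    def attest(self, component_id: str) -> bool:
        # Requirement 1.b: a real implementation would query a hardware root of trust.
        return bool(component_id)


api = TrustedServicesAPI()
api.storage.initialise()
blob = api.encrypt(b"model-weights", key=b"k")
api.storage.write("model", blob, secure=True)
# XOR applied twice with the same key recovers the plaintext.
assert api.encrypt(api.storage.read("model"), key=b"k") == b"model-weights"
```

A conforming implementation would of course replace the placeholder cipher and attestation stub with the platform's security features, as requirement 3 demands.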

Annex D: APIs that may require specific testing

Table 6 will be compiled based on the responses received.

Table 6 – APIs that may require specific testing

Section APIs Nature of Test

Annex E: Mandatory text in responses

A response to this MPAI-AIF CfT shall mandatorily include the following text

<Company/Member> submits this technical document in response to the MPAI Call for Technologies for AI Framework Version 2 (MPAI-AIF V2) (N769).

<Company/Member> explicitly agrees to the steps of the MPAI standards development process defined in Annex 1 to the MPAI Statutes (N421). In particular, <Company/Member> declares that <Company/Member> or its successors will make available the terms of the Licence related to its Essential Patents according to the Framework Licence of MPAI-AIF V2 (N799), alone or jointly with other IPR holders, after the approval of the MPAI-AIF Technical Specification Version 2 by the General Assembly and in no event after commercial implementations of the MPAI-AIF V2 Technical Specification become available on the market.

In case the respondent is a non-MPAI member, the submission shall mandatorily include the following text

If (a part of) this submission is identified for inclusion in a specification, <Company> understands that <Company> will be requested to immediately join MPAI and that, if <Company> elects not to join MPAI, this submission will be discarded.

Subsequent technical contributions shall mandatorily include this text

<Member> submits this document to MPAI-AIF Development Committee (AIF-DC) as a con­tribution to the development of the MPAI-AIF Technical Specification.

<Member> explicitly agrees to the steps of the MPAI standards development process defined in Annex 1 to the MPAI Statutes (N421). In particular, <Member> declares that <Member> or its successors will make available the terms of the Licence related to its Essential Patents according to the MPAI-AIF V2 Framework Licence (N799), alone or jointly with other IPR holders, after the approval of the MPAI-AIF Technical Specification by the General Assembly and in no event after commercial implementations of the MPAI-AIF Technical Specification become available on the market.

 

[1] A profile of a standard is a particular subset of the technologies used in the standard and, where applicable, the classes, subsets, options and parameters relevant to that subset.

 


MPAI-AIF V1 Call for Technologies

1 Introduction

Moving Picture, Audio and Data Coding by Artificial Intelligence (MPAI) is an international non-profit organisation with the mission to develop Artificial Intelligence (AI)-enabled digital data coding standards and technologies that facilitate the integration of data coding components into ICT systems. With the mechanism of Framework Licences, MPAI seeks to attach clear IPR licensing frameworks to its standards.

As a result of the analysis of several use cases, MPAI has identified the need for a common AI Framework that can support the implementation of Use Cases. MPAI expects that most future use cases will benefit from the use of the MPAI AI Framework or extensions thereof. For this reason, MPAI has decided that a standard satisfying the requirements contained in MPAI document N74 available online would benefit use case implementors.

This document is a Call for Technologies (CfT) that 1) satisfy the requirements of N74 and 2) are released according to the Framework Licence of N101, if selected by MPAI for inclusion in the MPAI AI Framework standard called MPAI-AIF. MPAI will select the most suitable technologies on the basis of their technical merits for inclusion in MPAI-AIF.

All parties who believe they have relevant technologies satisfying all or most of the requirements mentioned in MPAI N74 are invited to submit proposals for consideration by MPAI. The parties do not necessarily have to be MPAI members.

MPAI is not obligated, by virtue of this CfT, to select a particular technology or to select any technology if those submitted are found inadequate.

Submissions are due by 2021/02/15T23:59 UTC and will be reviewed according to the schedule that the 5th MPAI General Assembly (MPAI-5) will define at its online meeting on 2021/02/17. Non-MPAI members should contact the MPAI secretariat (secretariat@mpai.community) for further details on how they can attend the said review.

2 How to submit a response

Those planning to respond to this CfT

  1. Are advised that online events will be held on 2020/12/21 and 2021/01/07 to present the MPAI-AIF CfT and respond to questions. Logistic information about these events will be posted on the MPAI web site.
  2. Are requested to communicate their intention to respond to this CfT with an initial version of the form of Annex A to the MPAI secretariat (secretariat@mpai.community) by 2021/01/15. A potential submitter making a communication using the said form is not required to actually make a submission. Submission will be accepted even if the submitter did not communicate their intention to submit a response.

Responses to this MPAI-AIF CfT shall/may include:

Item Status
Detailed documentation describing the proposed technologies mandatory
The final version of Annex A mandatory
The text of Annex B duly filled out with the table indicating which requirements identified in MPAI N74 are satisfied. If a requirement is not satisfied, the submission shall indicate the reason mandatory
Comments on the completeness and appropriateness of the MPAI-AIF requirem­ents and any motivated suggestion to extend those requirements optional
A preliminary demonstration, with a detailed document describing it optional
Any other additional relevant information that may help evaluate the submission, such as additional use cases optional
The text of Annex D mandatory

Respondents are invited to review the check list of Annex C before submitting their response and filling out Annex B.

Responses to this MPAI-AIF CfT shall be submitted to secretariat@mpai.community (MPAI secretariat) by 2021/02/15T23:59 UTC. The secretariat will acknowledge receipt of the submission via email.

Respondents to this CfT shall present their submission at the evaluation meeting. If no presenter attends the meeting, the proposal will be discarded.

Respondents are advised that, upon acceptance by MPAI for further evaluation of their submission in whole or in part, MPAI will require that

  • A working implementation, including source code, – for use in the development of the MPAI-AIF Reference Software – be made available before the technology is accepted for the MPAI-AIF standard. Software may be written in programming languages that can be compiled or interpreted and in hardware description languages.
  • A non-MPAI member immediately join MPAI. If the non-MPAI member elects not to do so, their submission will be discarded. Direction on how to join MPAI can be found online.

Further information on MPAI can be obtained from the MPAI website.

3 Evaluation Criteria and Procedure

Submissions will be evaluated on the basis of the criteria identified in Annex B and with the following steps:

1) Presentation (mandatory) / Demonstration (optional)

Goal To assess the submission based on a presentation and possible demonstration that 

1.     Demonstrate the appropriateness and disclose the appropriate range of use.

2.     Provide evidence of the functionalities claimed, and of how the submission satisfies the evaluation criteria.

NB1: A respondent may opt to select a particular use case to demonstrate their functionalities. MPAI encourages respondents to select one of the existing Use Cases. A respondent may demonstrate a new use case; however, they should provide a complete description of the use case, of the inputs and outputs of the implemented AIMs, and of the interaction between AIMs and Management and Control.

NB2: Both demo and presentation will each have a time limit (to be determined).

Output Complete proposal evaluation sheet in Annex B.

2) Produce a conclusion

Goal To summarise the results. This should enable MPAI to identify 

·       The strong points of the proposal.

·       How the proposal might be adapted or combined with other proposals to enter the Working Draft, and/or be further tested.

Output  Proposed evaluation results.

4 Expected development timeline

Timeline of the call, deadlines and evaluation of the answers:

Call for Technologies 2020/12/16
Conference Calls 2020/12/21 and 2021/01/07
Notification of intention to submit a proposal 2021/01/15
Submission deadline 2021/02/15T23.59 UTC
Evaluation of responses Calendar determined at MPAI-5 2021/02/17

Evaluation will be carried out during 2-hour sessions according to the calendar agreed at MPAI-5.

5 References

[1] Use Cases & Functional Requirements of MPAI-AIF, MPAI N74; https://mpai.community/standards/mpai-aif/

[2] Use Case-Requirements-candidate technologies for MPAI-CAE CfT, MPAI N96

[3] Use Case-Requirements-candidate technologies for MPAI-MMC CfT, MPAI N97

[4] MPAI-CUI Use Cases and Functional Requirements, MPAI N95

Annex A: Information Form

This information form is to be filled in by a respondent to the MPAI-AIF CfT

  1. Title of the proposal
  2. Organisation: company name, position, e-mail of contact person
  3. What are the main functionalities of your proposal?
  4. Does your proposal provide or describe a formal specification and APIs?
  5. Will you provide a demonstration to show how your proposal meets the evaluation criteria?

Annex B: Evaluation Sheet

This evaluation sheet is to be used for self-evaluation in the submission and to be filled out during evaluation phase.

Title of the Proposal:

Main Functionalities:

 Summary of Response: (a few lines)

Comments on Relevance to the CfT (Requirements):

Evaluation table:

Submission features Evaluation elements Final Assessment
Completeness of description

Understandability

Adaptability

Extensibility

Use of Standard Technology

Efficiency

Test cases

Maturity of reference implementation

Relative complexity

Support of MPAI use cases

Support of non-MPAI use cases

Content of the criteria table cells:

Evaluation facts should mention:

  • Not supported / partially supported / fully supported.
  • What supports these facts: submission/presentation/demo.
  • The summary of the facts themselves, e.g., very good in one way, but weak in another.

Final assessment should mention:

  • Possibilities of improving or adding to the proposal, e.g., any missing or weak features.
  • How sure the experts are, i.e., evidence shown, very likely, very hard to tell, etc.
  • Global evaluation (Not Applicable / –– / – / + / ++)

 New Use Cases/Requirements Identified:

Summary of the evaluation:

  • Main strong points, qualitatively: 
  • Main weak points, qualitatively:
  • Overall evaluation: (0/1/2/3/4/5)

0: could not be evaluated

1: proposal is not relevant

2: proposal is relevant, but requires much more work

3: proposal is relevant, but with a few changes

4: proposal has some very good points, so it is a good candidate for standard

5: proposal is superior in its category, very strongly recommended for inclusion in standard

Additional remarks: (points of importance not covered above.)

Annex C: Requirements check list

This list has been derived from the Requirements of N74. It is not intended to be a replacement of those Requirements.

The submission shall support the following requirements

  1. General Machine Learning and/or Data Processing life cycles of single AIMs, with the possibility to:
    1. instantiate-configure-remove
    2. dump/retrieve internal state
    3. start-suspend-stop
    4. train-retrain-update
    5. enforce resource limits
    6. implement auto-configuration/reconfiguration of ML-based computational models
  2. General Machine Learning and/or Data Processing life cycles of combinations of AIMs, with the possibility to:
    1. initialise the overall computational model
    2. instantiate-remove-configure AIMs
    3. manually, automatically, dynamically and adaptively configure interfaces with Components
    4. signal one- and two-way for computational workflow initialisation and control
  3. Application-scenario-dependent hierarchical execution of workflows
  4. Topology of networked AIMs that can be synchronised according to a given time base and full ML life cycles
  5. Supervised, unsupervised and reinforcement-based learning paradigms
  6. Computational graphs, such as Directed Acyclic Graphs (DAG) as a minimum
  7. Initialisation of signalling patterns, communication and security policies between AIMs
  8. Protocols to specify storage access time, retention, read/write throughput, etc.
  9. Storage of Components’ data
  10. Access to:
    1. Static or slowly changing data with standard formats
    2. Data with proprietary formats

The submission shall support the implementation of AI Frameworks featuring

  1. Asynchronous and time-based synchronous operation depending on application
  2. Dynamic update of the ML models with seamless or minimal impact on its operation
  3. Time-sharing operation of ML-based AIMs to enable use of the same ML-based AIM in multiple concurrent applications
  4. AIMs which are aggregations of AIMs exposing new interfaces
  5. Workflows that are a mixture of AI/ML-based and DP technology-based AIMs.
  6. Scalability of complexity and performance to cope with different scenarios, e.g. from small MCUs to complex distributed systems

The submission shall not inhibit the creation of MPAI-AIF profiles.
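The workflow requirements above (AIM life cycles, computational graphs with a DAG as a minimum) can be sketched in a few lines. The `AIM` class, its life-cycle states, and the topological execution order are illustrative assumptions made for this sketch, not normative MPAI-AIF behaviour.

```python
# Illustrative sketch of a DAG of AIMs with a minimal life cycle
# (instantiate -> start -> stop). All names are hypothetical,
# NOT taken from the MPAI-AIF specification.
from graphlib import TopologicalSorter  # Python 3.9+


class AIM:
    """A hypothetical AI Module: instantiable, start/stoppable, callable."""

    def __init__(self, name, fn):
        self.name, self.fn, self.state = name, fn, "instantiated"

    def start(self):
        self.state = "started"

    def stop(self):
        self.state = "stopped"

    def run(self, *inputs):
        assert self.state == "started", f"{self.name} must be started"
        return self.fn(*inputs)


def execute_workflow(graph, aims, source_value):
    """Run AIMs in topological order; each AIM consumes its predecessors' outputs."""
    outputs = {}
    for name in TopologicalSorter(graph).static_order():
        preds = graph.get(name, ())
        args = [outputs[p] for p in preds] or [source_value]
        aims[name].start()
        outputs[name] = aims[name].run(*args)
        aims[name].stop()
    return outputs


# Diamond-shaped DAG: "src" feeds two AIMs whose outputs are merged.
aims = {
    "src": AIM("src", lambda x: x),
    "a": AIM("a", lambda x: x + 1),
    "b": AIM("b", lambda x: x * 2),
    "merge": AIM("merge", lambda x, y: x + y),
}
graph = {"a": {"src"}, "b": {"src"}, "merge": {"a", "b"}}
result = execute_workflow(graph, aims, 10)
```

A real AI Framework would add the remaining life-cycle steps (suspend, train-retrain-update, resource limits) and asynchronous or time-based synchronous scheduling; the topological ordering shown here only illustrates the minimum DAG requirement.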

Annex D: Mandatory text in responses

A response to this MPAI-AIF CfT shall mandatorily include the following text

<Company/Member> submits this technical document in response to MPAI Call for Technologies for MPAI project MPAI-AIF (MPAI document N100).

<Company/Member> explicitly agrees to the steps of the MPAI standards development process defined in Annex 1 to the MPAI Statutes. In particular, <Company/Member> declares that <Company/Member> or its successors will make available the terms of the Licence related to its Essential Patents according to the Framework Licence of MPAI-AIF (MPAI document N101), alone or jointly with other IPR holders, after the approval of the MPAI-AIF Technical Specification by the General Assembly and in no event after commercial implementations of the MPAI-AIF Technical Specification become available on the market.

In case the respondent is a non-MPAI member, the submission shall mandatorily include the following text

If (a part of) this submission is identified for inclusion in a specification, <Company> understands that <Company> will be requested to immediately join MPAI and that, if <Company> elects not to join MPAI, this submission will be discarded.

Subsequent technical contributions shall mandatorily include this text

<Member> submits this document to MPAI Development Committee AIF as a contribution to the development of the MPAI-AIF Technical Specification.

<Member> explicitly agrees to the steps of the MPAI standards development process defined in Annex 1 to the MPAI Statutes. In particular, <Member> declares that <Member> or its successors will make available the terms of the Licence related to its Essential Patents according to the Framework Licence of MPAI-AIF (MPAI document N101), alone or jointly with other IPR holders, after the approval of the MPAI-AIF Technical Specification by the General Assembly and in no event after commercial implementations of the MPAI-AIF Technical Specification become available on the market.


MPAI-AIF AI Framework Announcement

By the 15th of January 2021, those intending to submit a response to the MPAI-AIF Call for Technologies (CfT) should send secretariat@mpai.community an email containing the following data (Annex A to the CfT)

  1. Title of the proposal
  2. Organisation: company name, position, e-mail of contact person
  3. What are the main functionalities of your proposal?
  4. Does your proposal provide or describe a formal specification and APIs?
  5. Will you provide a demonstration to show how your proposal meets the evaluation criteria?

Your response, but not your identity, will be posted to this web page. While this is a competitive CfT, we wish to give as much information as possible about how well the CfT Functional Requirements are covered by responses.


Framework Licence

Artificial Intelligence Framework (MPAI-AIF)

Principal Members have developed the MPAI-AIF V2 Framework Licence providing a set of commercial requirements not containing critical data such as values in dollars, percentages, rates, dates etc. Relevant to MPAI-AIF V2 are:

  1. Call for Technologies. The call is open until 2022/10/10.
  2. Use Cases and Functional Requirements. Describes the use cases, the technologies, and their functional requirements.

Principal Members had developed the MPAI-AIF V1 Framework Licence providing a set of commercial requirements not containing critical data such as values in dollars, percentages, rates, dates etc. Relevant to MPAI-AIF V1 were:

  1. Call for Technologies. The call is closed.
  2. Use Cases and Functional Requirements. Described the use cases, the technologies, and their functional requirements.

The MPAI-AIF V1 Framework Licence is being used by the Patent Pool to develop the licence.