
1 Introduction
2 Technical Specifications
3 Reference Software Specifications
4 Conformance Testing Specifications
5 Performance Assessment Specifications
6 Technical Report

1    Introduction

The following document types are developed, approved, published, and maintained by MPAI for each of its Standards enabling the MPAI Ecosystem:

  1. Technical Specification.
  2. Reference Software Specification.
  3. Conformance Testing Specification.
  4. Performance Assessment Specification.

An MPAI Standard is the collection of these four document types; it includes at least the Technical Specification and may additionally include Technical Reports.

2       Technical Specifications

2.1 Types of Technical Specifications

Technical Specifications are of three types:

  1. Management-oriented: address issues regarding the management of Technical Documents, such as this Technical Specification.
  2. System-oriented: address the context in which Application-oriented Technical Specifications are handled and executed, such as:
    1. AI Framework
    2. Profiles
    3. Qualifiers
  3. Application-oriented: container standards specifying the components of AI Systems whose Implementations are part of the MPAI Ecosystem. An important case is that of Technical Specifications based on the AI Framework.

Application-oriented Technical Specifications include the following chapters:

  1. Normative chapters:
    1. Scope
    2. Definitions
    3. References
    4. AI Workflows
    5. AI Modules
    6. Data Types
  2. Informative chapters:
    1. Foreword
    2. Introduction

Some Application-oriented Technical Specifications may not include the specification of AI Workflows or AI Modules.

2.2 Development Process

The development of a Technical Specification shall include at least the following stages:

  1. Development of Use Cases and Functional Requirements.
  2. Development of the Framework Licence related to the Technical Specification and, possibly, to its Version.
  3. Publication of Call for Technologies.
  4. Development of Technical Specification.
  5. Approval and Publication.

The MPAI Patent Policy details the complete process.

A new Project that is intended to generate a Technical Specification or a Technical Report is assigned a three-character acronym. The name of the Project should consist of a small number of words.

2.3 Meaning of the word “Normative”

The word normative implies that an Implementation claiming Conformance to (illustrated by the sketch after this list):

  1. An AIW shall:
    1. Perform the function specified in the relevant section of the AI Workflows chapter.
    2. Use AIMs connected according to the topology and connections conforming with the relevant AIM specifications.
    3. Receive input and produce output data having the formats specified by the relevant Data Type specifications.
  2. An AIM:
    1. Shall perform the function specified in the relevant section of the AI Modules chapter of the relevant Technical Specification.
    2. May be composed of Sub-AIMs in accordance with the relevant AIM specification when it is a Composite AIM.
    3. Shall receive and produce data in accordance with the relevant Data Type specifications.
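
As a purely illustrative reading of the constraints listed above, the following Python sketch compares a declared AIW against a normative topology and set of Data Types. All names (AIMSpec, AIWSpec, deviations) and the data structures themselves are assumptions made for this example; they are not the MPAI-AIF metadata format nor any part of an MPAI specification.

```python
# Illustrative only: hypothetical data structures used to check that an AIW
# Implementation declares the AIMs, topology, and data formats required by a
# Technical Specification. This is NOT the normative MPAI-AIF metadata format.
from dataclasses import dataclass


@dataclass
class AIMSpec:
    name: str                    # e.g. a hypothetical "VisualChangeDetection"
    inputs: dict[str, str]       # input port name  -> Data Type name
    outputs: dict[str, str]      # output port name -> Data Type name


@dataclass
class AIWSpec:
    name: str
    aims: dict[str, AIMSpec]     # AIM instance name -> AIM specification
    connections: set[tuple[str, str, str, str]]  # (src AIM, src port, dst AIM, dst port)


def deviations(declared: AIWSpec, normative: AIWSpec) -> list[str]:
    """List the ways a declared AIW deviates from the normative specification."""
    errors: list[str] = []
    # Item 1.2 of the list above: use the specified AIMs, topology, and connections.
    for aim_name in normative.aims:
        if aim_name not in declared.aims:
            errors.append(f"missing AIM: {aim_name}")
    for connection in normative.connections - declared.connections:
        errors.append(f"missing connection: {connection}")
    # Item 1.3 of the list above: input and output data shall have the specified formats.
    for aim_name, spec in normative.aims.items():
        impl = declared.aims.get(aim_name)
        if impl and (impl.inputs != spec.inputs or impl.outputs != spec.outputs):
            errors.append(f"AIM {aim_name}: input/output Data Types deviate from the spec")
    return errors
```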

2.4 Naming

The name of a Technical Specification may be (see the example after this list):

  1. MPAI-ABC where ABC is the three-character acronym given to the Project.
  2. ABC-XYZ where XYZ is the three-character acronym indicating the name of an area of the MPAI-ABC Project.
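
Purely as an illustration of the two naming forms above, and assuming (this is not a normative rule) that an acronym is three upper-case letters, the convention can be checked with a simple pattern:

```python
import re

# Hypothetical check of the two naming forms: "MPAI-ABC" (Project acronym)
# or "ABC-XYZ" (three-character area acronym of the MPAI-ABC Project).
NAME_PATTERN = re.compile(r"^(MPAI-[A-Z]{3}|[A-Z]{3}-[A-Z]{3})$")

assert NAME_PATTERN.match("MPAI-ABC")       # Project-level Technical Specification
assert NAME_PATTERN.match("ABC-XYZ")        # area XYZ of the MPAI-ABC Project
assert not NAME_PATTERN.match("MPAI-ABCD")  # acronyms are assumed to be three characters
```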

2.5 Publication

It is recommended that, before final publication, a sufficiently mature draft Technical Specification be published for Community Comments on the MPAI website. The Secretary collects comments received before the deadline and forwards them to the relevant Development Committee or Group of the Requirements Standing Committee.

The Technical Specification shall be published on the relevant project page as a PDF file and as a set of connected web pages, each containing a chapter of the standard.

The cover page or website shall make reference to Notices and Disclaimers.

The Reference Software, Conformance Testing, and Performance Assessment Specifications may each be published either as an independent document or as part of the relevant Technical Specification.

Per the Procedures of Work of the MPAI Statutes, the Secretariat shall collect the Patent Declarations submitted by Members.

3       Reference Software Specifications

The Reference Software Specification of a Technical Specification is a Document that provides basic information about, and references, the Reference Software Implementation.

The Reference Software Implementation of an AIW or AIM (in the following “Software”) shall display the functionality specified in the relevant Technical Specification and produce Conforming output data when fed with input data that Conform with the relevant Technical Specification. Data is defined as Conforming when it Conforms with the relevant specification in the relevant Conformance Testing Specification.

Software is made available in one or more of the following forms:

  1. As source or compiled code providing a user experience and/or functionality sufficient to assess the value of the standard.
  2. As source or compiled code wrapping access to a third-party service, enabling a conforming AIM Implementation (Wrapper AIM).

General notes about Software release:

  1. The Software shall be published with the following disclaimers:
    1. The MPAI-ABC or ABC-XYZ Reference Software Implementation, if in source code, is released with the BSD-3-Clause licence.
    2. The purpose of this Reference Software is to provide a working Implementation of MPAI-ABC or ABC-XYZ, not to provide a ready-to-use product.
    3. MPAI disclaims the suitability of the Software for any other purposes and does not guarantee that it is secure.
    4. Use of this Reference Software may require acceptance of licences from the respective copyright holders. Users shall verify that they have the right to use any third-party software required by this Reference Software.
  2. If the Software is compiled code, it may not be used in commercial products or services unless the rights holder agrees otherwise.
  3. Sample input data, or a data-generating environment or endpoint for trialling the Reference Software, shall also be provided if required to operate the Software.
  4. If the Software requires use of a knowledge base, access to a knowledge base conforming with the standard shall be provided.
  5. The Software need not pass any clause of the Performance Assessment Specification because its purpose is only to provide an example of a technically correct Implementation.

A Wrapper AIM is governed by the following clauses (a sketch follows the list):

  1. A Wrapper AIM should be accompanied by detailed descriptions and references to any Third-Party Service documentation.
  2. The Submitter commits to maintain the Wrapper AIM for 12 months after submission.
  3. If the Third-Party Service is discontinued, the Submitter shall make its best effort to find a similar Third-Party Service and develop a new Wrapper AIM.
  4. After 12 months, the Submitter is not required to maintain the Wrapper AIM.
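
The sketch below illustrates the Wrapper AIM concept under purely hypothetical assumptions: the third-party client, the AIM interface, and the data mapping are invented for this example and are not taken from any MPAI specification or existing service.

```python
# Illustrative Wrapper AIM sketch: a conforming AIM interface implemented by
# delegating the actual processing to a third-party service. All names
# (ThirdPartyClient, FaceIdentityWrapperAIM, their methods) are hypothetical.
class ThirdPartyClient:
    """Stand-in for the client library of an external service."""

    def __init__(self, endpoint: str, api_key: str):
        self.endpoint = endpoint
        self.api_key = api_key

    def identify(self, image_bytes: bytes) -> dict:
        # A real Wrapper AIM would call the documented third-party API here.
        raise NotImplementedError("replace with the actual service call")


class FaceIdentityWrapperAIM:
    """Receives Conforming input data, calls the service, returns Conforming output."""

    def __init__(self, client: ThirdPartyClient):
        self.client = client

    def process(self, face_object: bytes) -> dict:
        raw = self.client.identify(face_object)
        # Map the service-specific response to the Data Type required by the
        # relevant Technical Specification (the mapping shown here is invented).
        return {"FaceID": raw.get("id"), "Confidence": raw.get("score")}
```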

The Reference Software Specification is usually integrated with the Technical Specification. When it is published as a stand-alone document it shall include:

  1. Normative chapters:
    1. Scope.
    2. Definitions.
    3. References.
    4. Architecture and Operation of the Reference Software Implementation.
  2. Informative chapters
    1. Foreword.
    2. Introduction.

4       Conformance Testing Specifications

A Conformance Testing Specification allows a user to ascertain whether an Implementation is a correct embodiment of a Technical Specification (see the sketch after the list below), i.e.:

  1. If it is an AIW, it can technically replace an equivalent Conforming AIW, i.e. the AIW offers the functionality specified for the AIW and provides Conforming output data when fed with Conforming input data.
  2. If it is an AIM, it can technically replace an equivalent Conforming AIM, i.e.:
    1. It offers the functionality specified for the AIM.
    2. Its outputs each conform to the Data Type specified in the relevant standard.
  3. If it is Data, it conforms to the Data Type specified in the relevant standard.
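
A minimal sketch of the principle "Conforming input in, Conforming output out" is given below. The callable conforms_to stands in for the concrete checks that a real Conformance Testing Specification defines, and all other names are hypothetical.

```python
# Illustrative conformance harness: feed an AIM (or AIW) implementation with
# Conforming input data and verify that every output Conforms with the Data
# Type required by the relevant Technical Specification.
from typing import Any, Callable, Iterable


def passes_conformance(
    process: Callable[[dict[str, Any]], dict[str, Any]],  # the implementation under test
    conforming_inputs: Iterable[dict[str, Any]],          # samples known to Conform
    output_types: dict[str, str],                         # output port -> Data Type name
    conforms_to: Callable[[str, Any], bool],              # Data Type conformance check
) -> bool:
    """Return True if every output produced from Conforming inputs also Conforms."""
    for sample in conforming_inputs:
        outputs = process(sample)
        for port, data_type in output_types.items():
            if port not in outputs or not conforms_to(data_type, outputs[port]):
                return False
    return True
```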

MPAI defines three Interoperability Levels of an AIW:

Level 1 – The AIW Conforms with the MPAI-AIF Standard.

Level 2 – The AIW Conforms with the MPAI-AIF Standard and an Application-oriented Technical Specification.

Level 3 – The AIW Conforms with the MPAI-AIF Standard, an Application-oriented Technical Specification, and has been assessed for Performance by a Performance Assessor.

The MPAI Store Tests the Conformance of a submitted AIW Implementation in order to label it as a Level 1, Level 2, or Level 3 Implementation and make it available for Distribution.
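
The relation between the three Levels can be pictured with the conceptual sketch below; it does not describe the actual procedure applied by the MPAI Store, and the three boolean checks are assumptions used only to show that each Level builds on the previous one.

```python
from enum import IntEnum
from typing import Optional


class InteroperabilityLevel(IntEnum):
    LEVEL_1 = 1  # Conforms with the MPAI-AIF Standard
    LEVEL_2 = 2  # ... and with an Application-oriented Technical Specification
    LEVEL_3 = 3  # ... and its Performance has been assessed by a Performance Assessor


def label_aiw(conforms_with_aif: bool,
              conforms_with_app_spec: bool,
              performance_assessed: bool) -> Optional[InteroperabilityLevel]:
    """Conceptual sketch of how the three Interoperability Levels build on one another."""
    if not conforms_with_aif:
        return None  # not eligible for any Interoperability Level
    if conforms_with_app_spec and performance_assessed:
        return InteroperabilityLevel.LEVEL_3
    if conforms_with_app_spec:
        return InteroperabilityLevel.LEVEL_2
    return InteroperabilityLevel.LEVEL_1
```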

The Conformance Testing Specification is usually integrated with the Technical Specification. When it is published as a stand-alone document it shall include:

  1. Normative chapters:
    1. Scope.
    2. Definitions.
    3. References.
    4. The Means to Test the Conformance of the relevant AIMs and/or AIW.
  2. Informative chapters
    1. Foreword.
    2. Introduction.

5       Performance Assessment Specifications

As specified in Ecosystem, Performance Assessors perform the Performance Assessment of Implementations based on the Performance Assessment Specification that the Implementer makes reference to.

Performance is an umbrella term used to describe a variety of attributes, some of which are specific to the application domain that the Implementation intends to address. Therefore, Performance Assessment Specifications provide methods and procedures to measure how well an AIW or an AIM performs its function. The Performance of an Implementation may be assessed with respect to all or a subset of the following characteristics:

  1. Quality – for instance, how well a Face Identity Recognition AIM recognises faces, how precise or error-free are the changes in a Visual Scene detected by a Visual Change Detection AIM, or how satisfactory are the responses provided by an Answer to Multimodal Question AIW.
  2. Robustness – for instance, how robust is the operation of an implementation with respect to duration of operation, load scaling, etc.
  3. Extensibility – for instance, the degree of confidence a user can have in an Implementation when it deals with data outside of its stated application scope.
  4. Bias – for instance, how dependent the inference is on specific features of the training data, as in Company Performance Prediction, where the accuracy of the prediction may change widely based on the size or the geographic position of a Company, or in face recognition in Television Media Analysis.
  5. Legality – for instance, in which jurisdictions the use of an AIM or an AIW complies with a regulation, e.g., the European AI Act.
  6. Ethics – for instance, the conformity of an AIM or AIW to a target ethical standard.

A Performance Assessment Specification shall:

  1. Define which of the six characteristics mentioned above, or any other application-specific characteristics not listed above, are addressed by the Specification.
  2. Include the Means, i.e., procedures, tools, data sets, and/or the specification of suitable data sets, for use in Assessing the Performance of an Implementation (an illustrative example follows this list).
  3. Specify the minimum amount of information that an Implementer shall provide to the Performance Assessor regarding their Implementation.
  4. Specify the nature and minimum amount of feedback that a Performance Assessor shall disclose to an Implementer.
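
Purely as an illustration of items 1 and 2 of the list above, a Performance Assessment outcome could be recorded along the following lines; every name and value in this sketch is invented for the example.

```python
from dataclasses import dataclass, field
from enum import Enum


class Characteristic(Enum):
    QUALITY = "Quality"
    ROBUSTNESS = "Robustness"
    EXTENSIBILITY = "Extensibility"
    BIAS = "Bias"
    LEGALITY = "Legality"
    ETHICS = "Ethics"


@dataclass
class PerformanceAssessmentRecord:
    """Hypothetical record of an assessment against a subset of the characteristics."""
    implementation: str
    characteristics_addressed: set[Characteristic]
    results: dict[Characteristic, str] = field(default_factory=dict)
    feedback_to_implementer: str = ""


example = PerformanceAssessmentRecord(
    implementation="hypothetical ABC-XYZ AIW, version 1.0",
    characteristics_addressed={Characteristic.QUALITY, Characteristic.ROBUSTNESS},
    results={Characteristic.QUALITY: "assessed on the data set specified by the Means"},
    feedback_to_implementer="summary of findings returned to the Implementer",
)
```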

The Performance Assessment Specification is usually integrated with the relevant Technical Specification. When it is published as a stand-alone document it shall include:

  1. Normative chapters:
    1. Scope.
    2. Definitions.
    3. References.
    4. Any chapter providing the content of the Performance Assessment.
  2. Informative chapters
    1. Foreword.
    2. Introduction.

6       Technical Report

Technical Reports are descriptions of contexts, technical issues and possible solutions regarding an application area, such as “implementation guidelines” of a Technical Specification.

A Technical Report is published as a stand-alone document or a set of web pages. It shall include:

  1. Normative chapters:
    1. Scope.
    2. Definitions.
    3. References.
    4. Any chapter providing the content of the Technical Report.
  2. Informative chapters
    1. Foreword.
    2. Introduction.
