
  1. Technical Specification
  2. Reference Software
  3. Conformance Testing
  4. Performance Assessment

1. Technical Specification

Technical Specification: Context-based Audio Enhancement (MPAI-CAE) – Use Cases (CAE-USC) V2.4 assumes that Workflow implementations will be based on Technical Specification: AI Framework (MPAI-AIF) V2.2, which specifies an AI Framework (AIF) in which AI Workflows (AIW), composed of interconnected AI Modules (AIM), are executed.

Table 1 provides the full list of AIWs specified by CAE-USC V2.4, with links to the pages dedicated to the AI Workflows. Each of these pages includes Function; Reference Model; Input/Output Data; Functions of AIMs; Input/Output Data of AIMs; AIW, AIMs, and JSON metadata; Reference Software; Conformance Testing; and Performance Assessment.
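As an illustration, the JSON metadata of an AIW identifies the Workflow and lists its AIMs and their interconnections. The fragment below is a hypothetical sketch: the field and AIM names are assumptions for illustration only, not the normative MPAI-AIF metadata schema.

```json
{
  "Identifier": {
    "Standard": "CAE-USC",
    "AIW": "CAE-ARP",
    "Version": "2.4"
  },
  "AIMs": [
    { "Name": "AudioAnalyser" },
    { "Name": "AudioRestorer" }
  ],
  "Connections": [
    { "From": "AudioAnalyser", "To": "AudioRestorer" }
  ]
}
```

The normative metadata schemas are given on the pages dedicated to the individual AI Workflows.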

All AI Workflows specified by CAE-USC V2.3 (i.e., the preceding version) are superseded by those specified by CAE-USC V2.4. AI Workflows specified by CAE-USC V2.3 may still be used if their version is explicitly indicated.

Table 1 – AIWs of CAE-USC V2.4

Acronym    Name                                   JSON
CAE-ARP    Audio Recording Preservation           X
CAE-EAE    Enhanced Audioconference Experience    X
CAE-EES    Emotion-Enhanced Speech                X
CAE-SRS    Speech Restoration System

2. Reference Software

As a rule, MPAI provides Reference Software implementing the AIWs. It is released with the following disclaimers:

  1. The CAE-USC V2.4 Reference Software Implementation, if in source code, is released with the BSD-3-Clause licence.
  2. The purpose of this Reference Software is to provide a working Implementation of CAE-USC V2.4, not to provide a ready-to-use product.
  3. MPAI disclaims the suitability of the Software for any other purposes and does not guarantee that it is secure.
  4. Use of this Reference Software may require acceptance of licences from the respective copyright holders. Users shall verify that they have the right to use any third-party software required by this Reference Software.

Note that at this stage CAE-USC V2.4 specifies Reference Software only for some AIWs.

3. Conformance Testing

An implementation of an AIW conforms with CAE-USC V2.4 if it accepts as input, and produces as output, Data and/or Data Objects (a Data Object being the combination of Data of a Data Type and its Qualifier) conforming with those specified by CAE-USC V2.4.

Conformance is expressed by one of two statements:

  1. “Data conforms with the relevant (Non-MPAI) standard” – for Data.
  2. “Data validates against the Data Type Schema” – for Data Object.

The latter statement implies that:

  1. Any Sub-Type of the Data conforms with the relevant Sub-Type specification of the applicable Qualifier.
  2. Any Content and Transport Format of the Data conform with the relevant Format specification of the applicable Qualifier.
  3. Any Attribute of the Data
    1. Conforms with the relevant (Non-MPAI) standard – for Data, or
    2. Validates against the Data Type Schema – for Data Object.

The method to Test the Conformance of an instance of Data or Data Object is specified in the Data Types chapter.
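The "validates against the Data Type Schema" statement can be illustrated with a small sketch. The Data Object below and its field names are hypothetical stand-ins, not the normative CAE-USC Data Type Schemas; a real Conformance test would validate the object against the published JSON Schema with a standard schema validator.

```python
import json

# Hypothetical Data Object: Data of a Data Type plus its Qualifier.
# Field names are illustrative, not the normative CAE-USC schema.
data_object = json.loads("""
{
  "DataType": "Audio",
  "Qualifier": {
    "SubType": "Speech",
    "ContentFormat": "LPCM",
    "TransportFormat": "WAV"
  },
  "Data": "base64-encoded-payload"
}
""")

def validates(obj: dict) -> bool:
    """Minimal stand-in for schema validation: checks that the
    Qualifier carries the Sub-Type and the Content/Transport
    Formats that the conformance statement requires."""
    qualifier = obj.get("Qualifier", {})
    required = ("SubType", "ContentFormat", "TransportFormat")
    return (
        isinstance(obj.get("DataType"), str)
        and all(isinstance(qualifier.get(k), str) for k in required)
        and "Data" in obj
    )

print(validates(data_object))  # a well-formed object passes
```

A Data Object missing its Qualifier, or carrying a Format not admitted by the applicable Qualifier, would fail the corresponding check.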

Note that at this stage CAE-USC V2.4 specifies Conformance Testing only for some AIWs.

4. Performance Assessment

Performance is an umbrella term covering a variety of attributes, some specific to the application domain that the Implementation intends to address. Therefore, Performance Assessment Specifications provide methods and procedures to measure how well an AIW performs its function. Performance Assessment of an Implementation includes methods and procedures for all or a subset of the following characteristics:

  1. Quality – for example, how satisfactory the responses provided by an Answer to Multimodal Question implementation are.
  2. Robustness – for example, how robust the operation of an Implementation is with respect to duration of operation, load scaling, etc.
  3. Extensibility – for example, the degree of confidence a user can have in an Implementation when it deals with data outside of its stated application scope.
  4. Bias – for example, how dependent the inference is on specific features of the training data, as in Company Performance Prediction, where the accuracy of the prediction may change widely based on the size or geographic position of a Company.
  5. Security – for example, whether the machine driven by an AI System creates risks, e.g., physical or cyber, for its users.
  6. Legality – for example, whether an AIW instance complies with a regulation, e.g., the European AI Act.

Note that at this stage CAE-USC V2.4 specifies Performance Assessment only for some AIWs.
