MPAI publishes the MPAI Metaverse Model (MPAI-MMM) for Community Comments

MPAI is pleased to announce that, after a full year of work, it has published the MPAI Metaverse Model, the master plan of a project designed to facilitate the establishment of standards promoting Metaverse Interoperability.


Please look at the online presentation material of MPAI-MMM:

YouTube video    Non-YouTube video   The MPAI Metaverse Model WD0.5


The industry is showing growing interest in the Metaverse, which is expected to create new jobs, opportunities, and experiences with transformational impacts on virtually all sectors of human interaction.

Standards and Artificial Intelligence are widely recognised as two of the main drivers for the development of the Metaverse. MPAI – Moving Picture, Audio, and Data Coding by Artificial Intelligence – plays a role in both thanks to its status as an international, unaffiliated, non-profit organisation developing standards for AI-based data coding with clear Intellectual Property Rights licensing frameworks.
The MMM is a full-bodied document divided into 9 chapters.

  1. Introduction gives a high-level overview of the MMM and explains the Community Comments process: MPAI posts the MMM, anybody can send comments and contributions to the MPAI Secretariat, and MPAI considers them before publishing the MMM in final form on 25 January.
  2. Definitions gives a comprehensive set of Metaverse-related terms and definitions.
  3. Assumptions introduces 16 assumptions that the proposed Metaverse standardisation process will adopt. Some of them are: the steps of the standardisation process, the availability of Common Metaverse Specifications (CMS), the eventual development of Metaverse Profiles, a definition of Metaverse Instance and Interoperability, the layered structure of a Metaverse Instance, the fact that Metaverse Instances already exist, and the definition of Metaverse User.
  4. Use Cases collects a large number of application domains, such as Automotive, Education, Finance, Healthcare, and Retail, that will benefit from the use of the Metaverse. They are analysed to derive Metaverse Functionalities.
  5. External Services collects some of the services that a Metaverse Instance may require, either as platform-native or as externally provided services. Examples are: content creation, marketplace, and crypto wallets. They are analysed to derive Metaverse Functionalities.
  6. Functionalities is a major element of the MMM in its current form. It collects a large number of Functionalities that a Metaverse Instance may support depending on the Profile it adopts. It is organised into 9 areas, i.e., Instance, Environment, Content Representation, Perception of the Universe by the Metaverse, Perception of the Metaverse by the Universe, User, Interaction, Information search, and Economy support. Each area is organised into subareas: e.g., Instance is subdivided into Management, Organisation, Features, Storage, Process Management, and Security. Each subarea provides the Functionalities relevant to it, e.g., Process Management includes the Smart Contract, Smart Contract Monitoring, and Smart Contract Interoperability Functionalities (a minimal sketch of this hierarchy is given after this list).
  7. Technologies has the challenging task of verifying how well technologies match the requirements of the Functionalities. Currently, the following Technologies are analysed: Sensory information – namely, Audio, Visual, Touch, Olfaction, Gustation, and Brain signals; Data processing – how to cope with the end of Moore’s Law and with the challenging requirements for distributed processing; User Devices – how Devices can cope with challenging motion-to-photon latency requirements; Network – the prospects of networks providing services satisfying high-level requirements, e.g., latency and bit error rate; and Energy – the prospects of energy storage for portable devices and of the energy consumption caused by thousands of Metaverse Instances and potentially billions of Devices.
  8. Governance identifies and analyses two areas: technical governance of the Metaverse System if the industry decides that this level of governance is in the common interest, and governance by public authorities operating at a national or regional level.
  9. Profiles provides an initial roadmap from the publication of the MMM to the development of Profiles: development of the Metaverse Architecture, of the Functional Requirements of Data Types, and of the Common Metaverse Specification Table of Contents; mapping of MPAI standard Technologies into the CMS; inclusion of all required Technologies; and drafting of the mission of the Governance of the Metaverse System.
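
To make the area/subarea/Functionality organisation of chapter 6 concrete, here is a minimal Python sketch. It is purely illustrative and not part of the MMM: the class names and the reading of a Profile as the set of Functionalities an Instance supports are assumptions made only for this example, using the names quoted above.

```python
# Illustrative sketch only: one possible way to model the MMM's
# area -> subarea -> Functionality hierarchy described above.
# Class and field names are assumptions, not part of the MMM.
from dataclasses import dataclass, field


@dataclass
class Subarea:
    name: str
    functionalities: list[str] = field(default_factory=list)


@dataclass
class Area:
    name: str
    subareas: list[Subarea] = field(default_factory=list)


# Example drawn from the text: the "Instance" area, with its
# "Process Management" subarea and that subarea's three Functionalities.
instance_area = Area(
    name="Instance",
    subareas=[
        Subarea("Management"),
        Subarea("Organisation"),
        Subarea("Features"),
        Subarea("Storage"),
        Subarea(
            "Process Management",
            functionalities=[
                "Smart Contract",
                "Smart Contract Monitoring",
                "Smart Contract Interoperability",
            ],
        ),
        Subarea("Security"),
    ],
)

# A Profile could then be expressed as the subset of Functionalities an
# Instance supports (again, an assumption made only for illustration).
profile = {f for sub in instance_area.subareas for f in sub.functionalities}
print(sorted(profile))
```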

The MMM is a large integrated document. Comment on the MMM and join MPAI to make it happen!

Join MPAI – Share the fun – Build the future

MPAI publishes Neural Network Watermarking (MPAI-NNW) for Community Comments

MPAI-NNW specifies methodologies to evaluate the following aspects of a neural network watermarking technology:

  • The impact of the technology on the performance of a watermarked neural network and its inference.
  • The ability of a neural network watermarking detector/decoder to detect/decode a payload when the watermarked neural network has been modified.
  • The computational cost of injecting, detecting or decoding a payload in the watermarked neural network.

The standard assumes that:

  • The neural network watermarking technology to be evaluated according to this standard is publicly available.
  • The watermarking key is unknown during evaluation.
  • The performance of the neural network watermarking technology does not depend on a specific key.
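
Read together, the evaluation aspects and assumptions above amount to a simple measurement loop. The Python sketch below shows one possible harness under those assumptions; the inject, detect, decode, and modify callables are hypothetical placeholders standing in for a watermarking technology under test, not interfaces defined by MPAI-NNW.

```python
# Illustrative sketch only: one possible harness for the three evaluation
# aspects listed above. The watermarking API (inject, detect, decode) and
# the modification step are hypothetical placeholders, not MPAI-NNW APIs.
import time
from typing import Any, Callable


def evaluate_nnw(
    model: Any,
    test_fn: Callable[[Any], float],      # returns task performance, e.g. accuracy
    inject: Callable[[Any, bytes], Any],  # hypothetical: returns a watermarked model
    detect: Callable[[Any], bool],        # hypothetical: is a watermark present?
    decode: Callable[[Any], bytes],       # hypothetical: recovered payload
    modify: Callable[[Any], Any],         # hypothetical: e.g. fine-tuning or pruning
    payload: bytes,
) -> dict:
    results = {}

    # 1. Impact on the performance of the watermarked network and its inference.
    baseline = test_fn(model)
    t0 = time.perf_counter()
    wm_model = inject(model, payload)
    results["injection_seconds"] = time.perf_counter() - t0
    results["performance_drop"] = baseline - test_fn(wm_model)

    # 2. Ability to detect/decode the payload after the network is modified.
    modified = modify(wm_model)
    t0 = time.perf_counter()
    results["detected_after_modification"] = detect(modified)
    results["detection_seconds"] = time.perf_counter() - t0
    t0 = time.perf_counter()
    results["payload_recovered"] = decode(modified) == payload
    results["decoding_seconds"] = time.perf_counter() - t0

    # 3. Computational cost of injection/detection/decoding is captured by the
    #    *_seconds entries above (wall-clock time as a simple proxy).
    return results
```

In practice, modify would be instantiated with whatever network modifications the evaluation targets, and wall-clock time is used here only as a simple stand-in for the cost measures an actual evaluation would define.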

Please look at the online presentation material of MPAI-NNW:

YouTube video    Non-YouTube video    Ppt presentation: MPAI-NNW WD3.0


According to its process, MPAI publishes this draft Technical Specification for Community Comments. Anybody can send comments and contributions to the MPAI Secretariat. MPAI will consider them and will publish the MPAI-NNW Technical Specification in final form on 25 January.

Meetings in the coming January meeting cycle

Non-MPAI members may join the meetings given in italics in the table below. If interested, please contact the MPAI secretariat.

Group name                                 | 22-23 Dec | 26-30 Dec | 2-6 Jan   | 9-13 Jan  | 16-21 Jan | 24-28 Jan | Time (UTC)
AI Framework                               |           |           |           | 9         | 16        | 24        | 16
AI-based End-to-End Video Coding           |           |           | 4         |           | 18        |           | 14
AI-Enhanced Video Coding                   |           |           |           | 11        |           | 26        | 14
Artificial Intelligence for Health Data    |           |           |           | 13        |           |           | 14
Avatar Representation and Animation        |           |           | 5         | 12        | 20        |           | 13:30
Communication                              |           |           | 5         |           | 20        |           | 15
Connected Autonomous Vehicles              |           | 28        | 4         | 11        | 18        | 26        | 15
Context-based Audio enhancement            |           |           | 3         | 10        | 17        | 25        | 17
Governance of MPAI Ecosystem               |           |           | 3         |           | 17        |           | 16
Industry and Standards                     |           |           | 6         |           | 21        |           | 16
MPAI Metaverse Model                       | 23        | 30        | 6         | 13        | 21        |           | 15
Multimodal Conversation                    |           |           | 3         | 10        | 17        | 25        | 14
Neural Network Watermarking                |           |           | 3         | 10        | 17        | 25        | 15
Server-based Predictive Multiplayer Gaming |           |           | 5         | 12        | 20        |           | 14:30
XR Venues                                  |           | 27        | 3         |           | 17        | 25        | 18
General Assembly (MPAI-27)                 |           |           |           |           |           | 26        | 15