Seven good reasons to join MPAI
MPAI – the international, unaffiliated, non-profit organisation developing AI-based data coding standards with clear Intellectual Property Rights licensing frameworks – now offers prospective members the opportunity to start their 2023 membership two months in advance, from 1 November 2022.
Here are six more good reasons why you should join MPAI now.
- In a matter of months after its establishment in September 2020, MPAI has developed 5 standards. Now it is working to extend 3 of them (AI Framework, Context-based Audio Enhancement, and Multimodal Conversation), and to develop 2 new standards (Neural Network Watermarking and Avatar Representation and Animation). More in the latest press release.
- MPAI enforces a rigorous standards development process and offers an open route to convert – without modification – its specifications to IEEE standards. Four MPAI standards – AI Framework (P3301), Context-based Audio Enhancement (P3302), Compression and Understanding of Industrial Data (P3303), and Multimodal Conversation (P3304) – are expected to become IEEE standards in a matter of weeks.
- MPAI has proved that AI-based standards in disparate technology areas – execution of AI applications, audio, speech, natural language processing, and financial data – can be developed in a timely manner. It is currently developing standards for avatar representation and animation, and neural network watermarking. More projects are in the pipeline in health, connected autonomous vehicles, short-, medium- and long-term video coding, online gaming, extended reality venues, and the metaverse.
- MPAI's role extends from providing an environment to develop standards to serving as a stepping stone to making its standards practically and promptly usable. Within months of standard approval, patent holders have already selected a patent pool administrator for some MPAI standards.
- MPAI is the root of trust of an ecosystem specified by its Governance of the MPAI Ecosystem standard, grafted onto its standards development process. The ecosystem includes a Registration Authority, where implementers can obtain identifiers for their implementations, and the MPAI Store, a not-for-profit entity with the mission to test the security and conformance of implementations, make them available for download, and publish their performance as reported by MPAI-appointed Performance Assessors.
- MPAI works on leading-edge technologies, and its members have already been given many opportunities to publish the results of their research and standards development at conferences and in journals.
Join the fun, build the future
The Metaverse, from the standard viewpoint
Long before the world was charmed by the prospects of the magnificent and progressive fates of the Metaverse, MPAI was already engaged in a Metaverse Project, because the project was an excellent catalyst to achieve several objectives:
- To organise dispersed but correlated activities.
- To identify and characterise new issues.
- To develop a Metaverse Reference Model.
- To position MPAI activities in the Metaverse Reference Model.
This activity, started in January 2022, has since gathered momentum, and MPAI is now well advanced in the project to develop the “MPAI Metaverse Model (MMM)”, which it intends to publish as a Technical Report at its 27th General Assembly (MPAI-27) on 21 December 2022. This news can only be a pale reflection of the current 40+ pages of the MMM draft, but it is published here to give advance information and to recruit more participants in the development of a document that will be of crucial importance for the future of MPAI and, arguably, of the industry at large.
The MMM rests on the following 11 assumptions.
- Metaverse Specification. There will eventually be a collection of Metaverse specifications called Common Metaverse Specifications (CMS), probably developed by different Standards Developing Organisations and hopefully based on an agreed master plan. The MMM intends to be the first step toward such a master plan by identifying and describing the functionalities and corresponding standard technologies required to build Interoperable Metaverse Instances. A Metaverse Instance will be an implementation of one of the Metaverse interoperability levels called CMS Profiles, with the degree of Interoperability enabled by the implemented Profile.
- Metaverse Profiling. The industry will be able to implement independent Metaverse Instances based on specific CMS Profiles. Therefore, a Metaverse Instance can implement:
- The full CMS set and be able to interoperate with any other Metaverse Instance.
- A CMS subset in three different configurations:
- With no additional technologies and able to interoperate with other Metaverse Instances for the available functionalities.
- Replacing some CMS functionalities with proprietary technologies and able to interoperate with other Metaverse Instances for the CMS-conforming functionalities.
- Adding new proprietary functionalities and unable to interoperate with other Metaverse Instances for such functionalities.
- No CMS functionality and unable to interoperate with any other Metaverse Instance.
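The profile configurations above can be sketched as set operations on functionality identifiers. This is purely an illustrative sketch: the CMS and its Profiles are not yet specified, and all functionality names below are hypothetical assumptions, not part of any MPAI document.

```python
# Illustrative sketch only: the CMS and its Profiles do not yet exist.
# All functionality names here are hypothetical.

# Assume the full CMS is a set of standard functionality identifiers.
CMS = {"identity", "avatar_rendering", "asset_transfer", "payments"}

class MetaverseInstance:
    def __init__(self, name, cms_functionalities, proprietary=frozenset()):
        self.name = name
        # The CMS-conforming subset this Instance implements.
        self.cms = set(cms_functionalities) & CMS
        # Proprietary functionalities never interoperate with other Instances.
        self.proprietary = set(proprietary)

    def interoperable_functionalities(self, other):
        # Two Instances interoperate only on the CMS-conforming
        # functionalities implemented by both.
        return self.cms & other.cms

# A full-CMS Instance, a partial one with a proprietary addition,
# and one implementing no CMS functionality at all.
full = MetaverseInstance("Full", CMS)
partial = MetaverseInstance("Partial", {"identity", "avatar_rendering"},
                            proprietary={"custom_physics"})
closed = MetaverseInstance("Closed", set())
```

Under this model, `full` and `partial` interoperate on `identity` and `avatar_rendering` only, while `closed` interoperates with no one, matching the four configurations listed above.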
- Metaverse structure. In general, a Metaverse Instance will have:
- A Metaverse Manager who owns, operates and maintains the Metaverse.
- Metaverse Operators running Metaverse Environments under Metaverse Manager supervision.
- Metaverse Partners who act in Metaverse Environments.
- Other Users including End Users.
- Representation and Presentation. There is a clear distinction between the way information is digitally represented and the way it is presented to a User. E.g., a Digital Human is a Digital Object represented as bits in a Metaverse Instance, while an Avatar is a rendered Digital Human perceived as a physical stimulus.
- Regulation and Governance. Depending on applicable law, the operation of a Metaverse Instance will be regulated. Independently of regulation, a Metaverse Instance may also be governed by its own rules.
- Metaverse layers. In general, a Metaverse has a layered structure with at least 3 layers: an infrastructure layer and an experience layer interspersed with one or more Metaverse-specific service layers. The MMM splits this intermediate service layer into two: Native services (Platform Layer) and Subsidiary services (Enabling Service Layer). A Metaverse Instance may also interact with application-specific services, e.g., traffic, weather, etc. These are not part of the Enabling Service Layer but are connected to it as External Services.
- The extent of the Metaverse. A Metaverse Instance is connected to a Universe Environment via Sensors and Actuators that are an integral part of the Metaverse Instance. However, a Universe Environment can have its own Sensors and Actuators to access the Metaverse Instance.
- The Metaverse is not just for humans. MPAI is already dealing with use cases where the notion of Metaverse can be applied even without participation of human Users. This is the case, e.g., of a Connected Autonomous Vehicle (CAV) creating, or contributing data to a Metaverse Instance intended to be an accurate Representation of the Universe Environment where a CAV happens to be.
- The Metaverse is an asymptotic point. The Metaverse is unlikely to be declared born at a future magic point in time when the first Metaverse Instance will be “turned on”. The Metaverse is a continuously evolving notion supported by an evolving CMS that enables the industry to gradually implement it.
- The Metaverse is Digital. Three key adjectives are used consistently:
- “Digitised” is a data structure representing an object in the Universe, e.g., a Digitised Human is a digitised replica of a human that can be rendered as an avatar.
- “Virtual” is a data structure created by a computer, e.g., a Virtual Human is a data structure that can be rendered as an avatar.
- “Digital” refers to both “Digitised” and “Virtual”.
- Scene and Object hierarchy. A Universe Environment is perceived as a scene containing objects. When mapped to a Metaverse Environment, objects and scenes have a digital representation common to all Digital Scenes, whether Digitised or Virtual, composed of four types of data:
- Security Data that guarantees the identity of an object.
- Private Data related to objects and scenes and accessible by specific Users.
- Public Data related to objects and scenes and accessible by all Users.
- Perceivable Presentation Data that is used to render objects and scenes (e.g., a collection of avatar models that a User selects as their current Persona).
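The four-part structure above might be sketched as a simple data type. This is an assumption-laden illustration: the MMM does not yet define concrete formats, and the field names, types, and access model below are hypothetical.

```python
# Illustrative sketch only: the MMM defines no concrete data formats.
# Field names, types, and the authorisation model are assumptions.
from dataclasses import dataclass, field
from typing import Any

@dataclass
class DigitalObject:
    # Security Data: guarantees the identity of the object.
    security_data: bytes
    # Private Data: accessible only by specific (authorised) Users.
    private_data: dict[str, Any] = field(default_factory=dict)
    # Public Data: accessible by all Users.
    public_data: dict[str, Any] = field(default_factory=dict)
    # Perceivable Presentation Data: used to render the object.
    presentation_data: dict[str, Any] = field(default_factory=dict)

    def view(self, authorised: bool) -> dict[str, Any]:
        """Return the data a given User may access."""
        data = dict(self.public_data)
        if authorised:
            data.update(self.private_data)
        return data

# A hypothetical Digital Human whose Presentation Data is a collection
# of avatar models the User can select as their current Persona.
persona = DigitalObject(
    security_data=b"signature-bytes",
    private_data={"owner": "user-123"},
    public_data={"label": "avatar"},
    presentation_data={"models": ["avatar_model_a", "avatar_model_b"]},
)
```

The split keeps identity, access control, and rendering concerns separate, which is the point of listing the four data types independently.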
This is just ~15% of the current MMM document. You are welcome to join MPAI to participate in the MPAI Metaverse Project.
Meetings in the coming November meeting cycle
Non-MPAI members may join the meetings given in italics in the table below. If interested, please contact the MPAI secretariat.
| Group name | 31 Oct–4 Nov | 7–11 Nov | 14–18 Nov | 21–25 Nov | Time (UTC) |
|---|---|---|---|---|---|
| AI-based End-to-End Video Coding | | 9 | | 23 | 14 |
| AI-Enhanced Video Coding | 2 | | 16 | | 14 |
| Artificial Intelligence for Health Data | | | 18 | | 14 |
| Avatar Representation and Animation | 3 | 10 | 17 | | 13:30 |
| Connected Autonomous Vehicles | 2 | 9 | 16 | 23 | 13 |
| Context-based Audio Enhancement | 1 | 8 | 15 | 22 | 16 |
| Governance of MPAI Ecosystem | 31 | 7 | 14 | 21 | 16 |
| Industry and Standards | 4 | | 18 | | 16 |
| MPAI Metaverse Model | 4 | 11 | 18 | | 15 |
| Neural Network Watermarking | 1 | 8 | 15 | 22 | 15 |
| Server-based Predictive Multiplayer Gaming | 3 | 10 | 17 | | 14:30 |
| General Assembly (MPAI-26) | | | | 23 | 15 |