
11.1        Introduction

This Informative Chapter collects diverse Metaverse Use Cases where Users request to perform Actions on different types of Items. The goal of this Chapter is to show that the Metaverse elements of this Technical Specification do indeed support a range of representative Use Cases.

11.2       Use Case Description Language

Metaverse Use Cases involve a plurality of Processes – Users, Devices, Services, and Apps – that perform, or are requested by other Processes to perform, Actions on a variety of Items to produce other Items.

In a Use Case:

  1. Processes (e.g., Users) are identified by a sequential subscript.
  2. Items Acted on by a Process are identified by the subscript of the Process performing the Action on the Item, followed by a sequential number.
  3. The Locations where the Actions take place are similarly identified by the subscript of the Process performing the Action at the Location, followed by a sequential number.
  4. If the Actions are performed in different M-Instances, all Processes, Items, and Locations are prefixed by a sequential capital letter.

 

For instance:

  1. Useri MM-Embeds Personai.j at M-Locationi.k.
  2. Useri MU-Renders Entityi.j at U-Locationi.k.
  3. UserA.i MM-Sends ObjectA.i.j to UserB.k.

 

All Use Cases assume that Actions are performed in an M-Instance. When they are performed in the Universe, this is specifically mentioned.
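To make the notation concrete, the following Python sketch (illustrative only and not part of this Technical Specification; all class and variable names are assumptions) models how Processes and Items are identified, including the capital-letter prefix used when more than one M-Instance is involved:

```python
# Minimal sketch of the identifier convention of the Use Case Description
# Language. Names are illustrative, not normative.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ProcessId:
    index: int                        # sequential Process subscript
    m_instance: Optional[str] = None  # capital-letter prefix, if any

    def __str__(self) -> str:
        prefix = f"{self.m_instance}." if self.m_instance else ""
        return f"User{prefix}{self.index}"

@dataclass(frozen=True)
class ItemId:
    kind: str            # e.g., "Persona", "Object(A)"
    process: ProcessId   # the Process Acting on the Item
    seq: int             # sequential number within that Process

    def __str__(self) -> str:
        prefix = f"{self.process.m_instance}." if self.process.m_instance else ""
        return f"{self.kind}{prefix}{self.process.index}.{self.seq}"

# Reproduces example 3 above: "UserA.1 MM-Sends ObjectA.1.2 to UserB.3"
sender = ProcessId(1, "A")
obj = ItemId("Object", sender, 2)
receiver = ProcessId(3, "B")
print(f"{sender} MM-Sends {obj} to {receiver}")
```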

 

The following abbreviations are used throughout:

MLoc: M-Locations

SA: Spatial Attitude

ULoc: U-Locations

 

Note: Persona(AV) is a Persona that can be audio-visually perceived; Object(AVH) is an Object that can be audio-visually and haptically perceived.

11.3        Virtual Lecture

11.3.1        Description

A student attends a lecture held by a teacher in a classroom created by a school manager:

  1. School Manager
    • Authors and embeds a virtual classroom.
    • Pays the teacher.
  2. Teacher
    • Is at home.
    • Embeds a persona of theirs from home at the classroom’s desk.
    • Embeds and animates a 3D model.
    • Leaves the classroom.
  3. Student
    • Is at home.
    • Pays to attend and to make a copy of their lecture Experience.
    • Embeds a persona of theirs in the classroom.
    • Approaches the teacher’s desk to feel the 3D model with haptic gloves.
    • Stores their lecture Experience.
    • Leaves the classroom and returns home.

11.3.2        Processes, Items, and Locations

 

User1 School Manager
Object(V)1.1 Classroom MLoc1.1 Location of classroom
Value1.1 Lecture consideration
User2 Teacher
Persona(AV)2.1 Teacher’s Persona MLoc2.1 Teacher’s home
MLoc2.2 Desk in classroom
Model2.1 Model for experiment MLoc2.3 Experiment place
Process1 Animates Model2.1
User3 Student
Persona(AV)3.1 Student’s Persona MLoc3.1 Student’s home
MLoc3.2 Place in classroom
MLoc3.3 Close to Experiment place
Value3.1 Lecture fees
Experience3.1 Recorded Experience Address3.1 Storing Experience

11.3.3        Detailed workflow

  1. User1 (Manager):
    • Authors Object(V)1.
    • MM-Embeds Object(V)1 at MLoc1.1.
  2. User2 (Teacher):
    • Tracks Persona(AV)1 at MLoc2.1 with SA.
    • MM-Embeds Persona1 at MLoc2.2.
    • MM-Disables Persona1 at MLoc2.1.
    • MM-Embeds Model(AVH)1 at MLoc2.3 (Experiment place, close to MLoc2.2).
  3. User3 (Student):
    • Tracks Persona(AV)1 at MLoc3.1 with SA.
    • Transacts Value1.
    • MM-Embeds Persona(AV)1 at MLoc3.2 with SA.
    • MM-Disables Persona(AV)1 at MLoc3.1.
  4. User2 (Teacher):
    • MM-Animates Model1 with Process1.
  5. User3 (Student):
    • MM-Adds Persona(AV)1 at MLoc3.3 (Experiment place).
    • MM-Sends Model1 to User3.
    • MU-Sends Experience1 to Address3.1.
  6. User1 (Manager):
    • Transacts Value1 to User2 (Teacher).
  7. User2 (Teacher):
    • MM-Disables Persona1 from MLoc2.2.
    • MM-Embeds Persona1 at MLoc2.1.
  8. User3 (Student):
    • MM-Disables Persona1 from MLoc3.2.
    • MM-Embeds Persona1 at MLoc3.1.
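The workflow above can be read as a sequence of Action invocations. The sketch below is a minimal, non-normative illustration: the MInstance class and its act method are assumptions for this example, not an API defined by this Technical Specification. It replays the Virtual Lecture workflow in the Use Case Description Language of 11.2:

```python
# Illustrative replay of the Virtual Lecture workflow. The MInstance class
# is a stand-in that simply logs each Action; all names are assumptions.
class MInstance:
    def act(self, who: str, action: str, item: str, where: str = "", note: str = ""):
        # Log the Action in the notation of 11.2.
        loc = f" at {where}" if where else ""
        comment = f" ({note})" if note else ""
        print(f"{who} {action} {item}{loc}{comment}")

m = MInstance()
m.act("User1", "Authors",     "Object(V)1.1", note="Classroom")
m.act("User1", "MM-Embeds",   "Object(V)1.1", "MLoc1.1")
m.act("User2", "Tracks",      "Persona(AV)2.1", "MLoc2.1", "with SA")
m.act("User2", "MM-Embeds",   "Persona(AV)2.1", "MLoc2.2", "desk")
m.act("User2", "MM-Disables", "Persona(AV)2.1", "MLoc2.1")
m.act("User2", "MM-Embeds",   "Model(AVH)2.1", "MLoc2.3", "Experiment place")
m.act("User3", "Tracks",      "Persona(AV)3.1", "MLoc3.1", "with SA")
m.act("User3", "Transacts",   "Value3.1", note="pays lecture and Experience")
m.act("User3", "MM-Embeds",   "Persona(AV)3.1", "MLoc3.2", "with SA")
m.act("User3", "MM-Disables", "Persona(AV)3.1", "MLoc3.1")
m.act("User2", "MM-Animates", "Model2.1", note="with Process1")
m.act("User3", "MM-Adds",     "Persona(AV)3.1", "MLoc3.3", "Experiment place")
m.act("User3", "MU-Sends",    "Experience3.1", "Address3.1", "stores Experience")
m.act("User1", "Transacts",   "Value1.1", note="to User2, lecture consideration")
```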

11.3.4        Workflow and Actions

 

Table 5 – Virtual Lecture workflow and Actions.

User1 (Manager) Authors Object(V)1.1 (Classroom).
MM-Embeds Object(V)1.1 MLoc1.1.
User2 (Teacher) Tracks Persona(AV)2.1 MLoc2.1 w/ SA.
MM-Embeds Persona2.1 MLoc2.2 (desk).
MM-Disables Persona2.1 MLoc2.1.
MM-Embeds Model(AVH)2.1 MLoc2.3 (close to desk).
User3 (Student) Tracks Persona(AV)3.1 MLoc3.1 w/ SA.
Transacts Value3.1 (Lecture & Experience).
MM-Embeds Persona(AV)3.1 MLoc3.2 with SA.
MM-Disables Persona3.1 MLoc3.1.
User2 (Teacher) MM-Animates Model2.1.
User3 (Student) MM-Adds Persona(AV)3.1 MLoc3.3 (close to desk).
MM-Sends Model(AVH)2.1.
MU-Sends Experience3.1 Address3.1.
User1 (Manager) Transacts Value1.1 User2 (Lecture consideration).
User2 (Teacher) MM-Disables Persona2.1 MLoc2.2.
MM-Embeds Persona2.1 MLoc2.1.
User3 (Student) MM-Disables Persona3.1 MLoc3.2.
MM-Embeds Persona3.1 MLoc3.1.

11.3.5        Actions, Items, and Data Types

Table 6 gives the list of Actions, Items, and Data Types used by the Virtual Lecture Use Case. The Table also gives the Actions implied by the Track Composite Action (MM-Embed, MM-Animate, MM-Send, MU-Render, UM-Capture, MU-Send, and Identify). The list of these Actions will not be repeated in the next tables.

 

Table 6 – Virtual Lecture Actions, Items, and Data Types.

Actions Items Data Types
Author Object(V) Amount
Identify Experience Coordinates
MM-Animate M-Location Currency
MM-Disable Persona(AV) Spatial Attitude
MM-Embed U-Location Value
MM-Send Value Orientation
MU-Render Position
MU-Send
UM-Capture
UM-Send
Track
Transact

11.4        Virtual Meeting

11.4.1        Description

A meeting manager

  1. Authors a meeting room.
  2. Deploys a Virtual Secretary tasked to produce a summary of the conversations, enriched by information about participants’ Personal Statuses.

A participant

  1. Attends a meeting held in the room.
  2. Gets a translation of sentences uttered in languages other than their own.
  3. Makes a presentation using a 3D model.

11.4.2        Processes, Items, and Locations

 

User1 Meeting Manager
Object(V)1.1 Meeting room MLoc1.1 Location of meeting room
Persona(AV)1.1 Virtual Secretary MLoc1.2 Virtual Secretary’s Location
Summary1.1 Meeting Summary MLoc1.3 Location of Summary display
User2 Meeting participant #1
Persona(AV)2.1 participant #1’s Persona MLoc2.1 Participant’s home
MLoc2.2 In the meeting room
Model2.1 Model for presentation MLoc2.3 Location of presentation display.
Event2.1 Entire meeting
Process1 Animates Model2.1
User3 Meeting participant #2
Persona(AV)3.1 Participant #2’s Persona MLoc3.1 Participant’s home
MLoc3.2 In the meeting room
Object(A)3.1 Speech segment

11.4.3        Detailed workflow

  1. User1 (Meeting Manager)
    • Authors Object(V)1 (meeting room).
    • MM-Embeds Object(V)1 at MLoc1.1.
    • MM-Embeds Persona(AV)1 (a Virtual Secretary) at MLoc1.2.
    • MM-Animates Persona(AV)1.
  2. User2 (1st Meeting participant):
    • Tracks Persona(AV)1 at MLoc2.1 (its home).
    • MM-Embeds Persona(AV)1 at MLoc2.2 (enters meeting room).
    • MM-Disables Persona(AV)1 from MLoc2.1 (disappears from home).
  3. User3 (2nd Meeting participant):
    • Tracks Persona(AV)1 at MLoc3.1 (its home).
    • MM-Embeds Persona(AV)1 at MLoc3.2 (enters meeting room).
    • MM-Sends Object(A)1 (speaks).
  4. User2 (1st Meeting participant):
    • Authenticates Object(A)1.
    • Interprets Object(A)1 (requests translation).
    • MM-Embeds Model1 (a 3D model) at MLoc2.3 (in meeting room).
    • MM-Animates Model1 with Process1 (makes presentation with the 3D model).
  5. Virtual Secretary:
    • Interprets Persona3.1’s Object(A)3.1 (requests speech recognition and extraction of the Personal Status displayed by Persona3.1).
    • Produces Summary1 of Object(A)3.1 (with added graphical signs expressing Persona3.1’s Personal Status).
    • MM-Embeds Summary1 at MLoc1.3 (in meeting room for participants to comment).
  6. User1 (Meeting Manager):
    • MM-Disables Persona(AV)1 (removes the Virtual Secretary).
  7. User2 (1st Meeting participant):
    • MU-Sends Event1 to Address1 (stores the meeting Event).
    • MM-Embeds Persona(AV)1 at MLoc2.1 (returns home).
    • MM-Disables Persona(AV)1 from MLoc2.2.
  8. User3 (2nd Meeting participant):
    • MM-Embeds Persona(AV)1 at MLoc3.1 (returns home).
    • MM-Disables Persona(AV)1 from MLoc3.2.
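Step 5 combines speech recognition, Personal Status extraction, and summarisation. The following sketch shows one way the Virtual Secretary could produce the annotated Summary; it is illustrative only, and recognise_speech and extract_personal_status are placeholders for whatever technologies an implementation would actually use:

```python
# Illustrative Virtual Secretary pipeline (assumptions throughout).
from dataclasses import dataclass

@dataclass
class SpeechObject:          # e.g., Object(A)3.1: a speech segment
    speaker: str             # e.g., "Persona3.1"
    audio: bytes

def recognise_speech(audio: bytes) -> str:
    # Placeholder: a real implementation would call an ASR engine.
    return "transcribed sentence"

def extract_personal_status(audio: bytes) -> str:
    # Placeholder: a real implementation would estimate the speaker's
    # Personal Status (e.g., emotion) from speech and face.
    return "confident"

def summarise(utterances: list[SpeechObject]) -> list[str]:
    summary = []
    for u in utterances:
        text = recognise_speech(u.audio)
        status = extract_personal_status(u.audio)
        # Annotate each sentence with a sign expressing the Personal Status.
        summary.append(f"{u.speaker}: {text} [{status}]")
    return summary

# Summary1.1 is then MM-Embedded at MLoc1.3 for participants to comment.
print(summarise([SpeechObject("Persona3.1", b"...")]))
```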

11.4.4        Workflow and Actions

 

Table 7 – Virtual Meeting workflow and actions.

Who Does What Where/comment
User1 (Manager) MM-Embeds Object(V)1.1 (Meeting room) MLoc1.1
MM-Embeds Persona1.1 (Virtual Secretary) MLoc1.2
MM-Animates Persona1.1 Animates Virtual Secretary.
User2 (Participant) Tracks Persona(AV)2.1 At MLoc2.1 w/ SA.
MM-Embeds Persona(AV)2.1 At MLoc2.2 w/ SA.
MM-Disables Persona(AV)2.1 From MLoc2.1.
User3 (Participant) Tracks Persona(AV)3.1 At MLoc3.1 w/ SA.
MM-Embeds Persona(AV)3.1 At MLoc3.2 w/ SA.
MM-Disables Persona(AV)3.1 From MLoc3.1.
MM-Sends Object(A)3.1 (Speaks).
User2 (Participant) Authenticates Object(A)3.1.
Interprets Object(A)3.1 (Requests translation).
MM-Embeds Model2.1 At MLoc2.3 (3D presentation).
MM-Animates Model2.1.
Virtual Secretary Interprets Object(A)3.1 (With Personal Status).
Produces Summary1.1.
MM-Embeds Summary1.1 At MLoc1.3 (Meeting room).
User1 (Manager) MM-Disables Persona(AV)1.1 At MLoc1.2.
User2 (Participant) MU-Sends Event2.1 To Address2.1.
MM-Embeds Persona(AV)2.1 At MLoc2.1 (home).
MM-Disables Persona(AV)2.1 From MLoc2.2.
User3 (Participant) MM-Embeds Persona(AV)3.1 At MLoc3.1 (home).
MM-Disables Persona(AV)3.1 From MLoc3.2.

11.4.5        Actions, Items, and Data Types

Table 8 gives the list of Actions, Items, and Data Types used by the Virtual Meeting Use Case. For simplicity, the Actions implied by the Track Action have not been added to the Table.

 

Table 8 – Virtual Meeting Actions, Items, and Data Types.

Actions Items Data Types
Authenticate Event Coordinates
Interpret Object(AV) Orientation
MM-Animate Object(V) Position
MM-Disable Persona(AV) Spatial Attitude
MM-Embed Summary
MM-Send
Track

11.5        Hybrid working

11.5.1        Description

A company applies a mixed in-presence and remote working policy.

  1. Some Workers (R-Workers) attend Company physically.
  2. Some Workers (V-Workers) attend Company virtually.
  3. All Workers
    • Are Authenticated.
    • Are present in the Virtual office.
    • Communicate by sharing AV messages (Communication of R-Workers’ Personae is also mapped to the M-Environment).
    • Participate in Virtual meetings.

11.5.2        Processes, Items, and Locations

 

User1 Company Manager
Object(V)1.1 Office space MLoc1.1 (Location of company office).
Persona(AV)1.1 Gatekeeper MLoc1.2 (Location of company gateway).
Process1 Animates Persona(AV)1.1
User2 R-Worker #1
Persona(AV)2.1 R-Worker #1’s Persona MLoc2.1 (Participant’s home)
MLoc2.2 (Office desk)
Model(AVH)2.1 Whiteboard MLoc2.3 (Location in Meeting room)
Process2 Animates whiteboard
User3 V-Worker #1
Persona(AV)3.1 V-Worker #1’s Persona MLoc3.1 (Participant’s home).
  MLoc3.2 (Office desk).
  MLoc3.3 (Close to R-Worker’s desk).
  MLoc3.4 (Meeting room).
Object(A)3.1 Speech segment

11.5.3        Detailed workflow

  1. User1 (Manager):
    • Authors Object(V)1 (Virtual office).
    • MM-Embeds Object(V)1 at MLoc1.1.
    • MM-Embeds Persona(AV)1 at MLoc1.2 (Office gateway).
    • MM-Animates Persona1 with Process1 to act as gatekeeper.
  2. User2 (R-Worker #1):
    • MM-Adds Persona(AV)1 at MLoc2.1.
  3. R-Worker #1:
    • Comes to real office.
  4. Process1 (Manager):
    • Authenticates R-Worker #1.
  5. User2 (R-worker):
    • MM-Embeds Persona(AV)1 at MLoc2.2 (Office desk).
  6. User3 (V-worker):
    • Tracks Persona(AV)1 at MLoc3.1
  7. Process1 (Gatekeeper):
    • Authenticates User3 (V-worker).
  8. User3 (V-worker):
    • MM-Embeds Persona(AV)1 at MLoc3.2 (Office desk).
    • MM-Disables Persona(AV)1 at MLoc3.1
    • MM-Sends Object(A)1 to User2 (R-worker).
    • MM-Embeds Persona(AV)1 at MLoc3.3 (close to R-worker’s desk).
    • MM-Disables Persona(AV)1 at MLoc3.2 (own office desk).
    • MM-Embeds Persona(AV)1 at MLoc3.4 (Meeting room).
    • MM-Disables Persona(AV)1 at MLoc3.3
  9. User2 (R-worker):
    • MM-Embeds Persona(AV)1 at MLoc3.4 (Meeting room).
    • MM-Disables Persona(AV)1 at MLoc2.2 (Office desk).
    • MM-Embeds Model(AVH)1 (Whiteboard) at MLoc2.3 (Meeting room).
    • MM-Animates Whiteboard with Process2.
    • MM-Disables Persona(AV)1 at MLoc3.4 (leaves Meeting room).
  10. User3 (V-worker):
    • MM-Embeds Persona(AV)1 at MLoc3.1 (Home).
    • MM-Disables Persona(AV)1 at MLoc3.4 (Meeting room).

11.5.4        Workflow and Actions

 

Table 9 – Hybrid Working workflow and actions.

Who Does What Where/comment
User1 (Manager) MM-Embeds Object(V)1.1 MLoc1.1 (Company Office)
MM-Embed Persona(AV)1.1 MLoc1.2 (Gatekeeper)
MM-Animates Persona(AV)1.1 MLoc1.2
human2 (Enters company)
User2 (R-Worker) Tracks Persona(AV)2.1 MLoc2.1 (home)
User1 (Gatekeeper) Authenticates Object(AV)1.1 (AV of R-Worker #1)
User2 (R-Worker) MM-Embeds Persona(AV)2.1 MLoc2.2 (Office desk)
User3 (V-Worker) Tracks Persona(AV)3.1 MLoc3.1 (home)
MM-Embeds Persona3.1 MLoc3.2 w/ SA (Office desk)
MM-Sends Objects(A)3.1 To Persona(AV)2.1
MM-Embeds Persona(AV)3.1 MLoc3.3 (talk “in person”)
MM-Disables Persona(AV)3.1 MLoc3.2
MM-Embeds Persona3.1 MLoc3.4 (Meeting room)
MM-Disables Persona(AV)3.1 MLoc3.3
User2 (R-Worker) MM-Embeds Persona(AV)2.1 MLoc3.4 (Meeting room)
MM-Disables Persona(AV)2.1 MLoc2.2 (Office desk)
MM-Embeds Model(AVH)2.1 MLoc2.3 (Whiteboard)
MM-Animates Model(AVH)2.1 (Operates Whiteboard)
MM-Disables Persona(AV)2.1 From MLoc3.4
User3 (V-Worker) MM-Embeds Persona(AV)3.1 MLoc3.1 (back home)
MM-Disables Persona(AV)3.1 From MLoc3.4

11.5.5        Actions, Items, and Data Types

 

Table 10 – Hybrid Working Actions, Items, and Data Types

Actions Items Data Types
Authenticate Object(V) Coordinates
MM-Animate M-Location Orientation
MM-Disable Object(A) Position
MM-Embed Object(AVH) Spatial Attitude
MM-Send Persona(AV)
Track

11.6        eSports Tournament

11.6.1        Description

  1. A site manager
    1. Develops a game landscape.
    2. Makes it available to a game manager.
  2. The game manager
    1. Deploys autonomous characters.
    2. Places virtual cameras and microphones in the landscape.
  3. Captured AV from the game landscape is displayed on a dome screen and streamed online.

11.6.2        Processes, Items, and Locations

User1 Site Manager
Object(AVH)1.1 Game landscape MLoc1.1 (Location of Game landscape).
User2 Game manager
Personae2.i Autonomous characters M-Loc2.i Location in Game landscape
Scene2.1 Game’s Scene
Userj Players
Personaej.1 Players’ characters M-Locj.1 Location in Game landscape
Process2.i Autonomous char. animation
Service1 Camera/microphone control
Device1 Dome screen
Devicek Device of online human.

11.6.3        Detailed workflow

  1. User1 (Site Manager)
    • Authors Object(AVH)1 (game landscape).
    • MM-Embeds Object(AVH)1 (game landscape) at M-Loc1.1.
  2. User2 (Game Manager)
    • MM-Animates Object(AVH)1 with Process1.
    • MM-Embeds Personae2.i (Autonomous characters) with SA at M-Loc2.i.
    • Calls Process2.i to provide role-specific:
      • Costumes (e.g., magician, warrior).
      • Forms, physical features, and abilities (e.g., cast spells, shoot, fly, jump).
    • Calls Service1 (virtual camera/microphone control).
  3. Userj (Player) Tracks Personaj.1(AVH) at MLocj.1 with SA.
  4. User2 (Game Manager):
    • MU-Sends Scene1, composed of
      • Animated Object1 (game landscape),
      • Personae2.i (Autonomous characters),
      • Personaej.1 (Players),
    • to
      • Device1 (Dome screen),
      • Devicek (online viewers via streaming).
  5. Device1 MU-Renders Scene1.
  6. Devicek MU-Renders Scene1.

11.6.4        Workflow

 

Table 11 – eSports Tournament workflow and actions.

User1 (Site Manager) Authors Object(AVH)1.1 (Game landscape)
MM-Embeds Object(AVH)1.1 (Landscape) at M-Loc1.1
User2 (Game Manager) Transacts Value2.1 To User1
MM-Embeds Personae2.i (AC) At M-Loc2.i w/SA
MM-Animates Personae2.i (AC) At M-Loc2.i
Userj (Player) Tracks Personaj.1 At MLocj.1 w/ SA
Service1 Controls Camera/microphone
User2 (Game Manager) MU-Renders Scene(AVH)2.1 U-Loc1.1 (via Dome screen) and U-Lock.1 (via streaming).

11.6.5        Actions, Items, and Data Types

 

Table 12 – eSports Tournament Actions, Items, and Data Types.

Actions Items Data Types
Author Object(AVH) Amount
MM-Animate Persona (AVH) Coordinates
MM-Embed Scene(AVH) Currency
MU-Render M-Location Orientation
Track U-Location Position
Transact Value Spatial Attitude

11.7        Virtual performance

11.7.1        Description

  1. Impresario:
    • Acquires the Rights to a Parcel.
    • Authors the Auditorium.
    • Embeds the Auditorium on the Parcel.
  2. Participant
    • Buys a ticket for an event with the right to stay close to the performance stage for 5 minutes.
    • Utters a private speech to another participant.
  3. Impresario:
    • Collects participants’ preferences.
    • Interprets participants’ mood (Participants Status).
    • Generates special effects based on preferences and Participants Status.

11.7.2        Processes, Items, and Locations

 

User1 Impresario
Value1.1 Payment for MLoc1.1 MLoc1.1 (Location of Auditorium).
Value1.2 Payment for Object1.1
Value1.3 Consideration for Performance
Object(V)1.1 Auditorium
Object(A)1.1 SFX
Service1 Collects Preferences
User2 Performer
Persona2 Performer’s Persona M-Loc2.1 (Home)
  M-Loc2.2 (Stage in Auditorium)
User3 Participant #1
Persona3 User3 Persona M-Loc3.1 (Home)
M-Loc3.2 (Seat in Auditorium)
M-Loc3.3 (Close to stage)
Object(A)3.1 Message to Participant #2
Value3.1 Ticket
User4 Participant #2
Persona4 User4 Persona M-Loc4.1 (Home)
M-Loc4.2 (Seat in Auditorium)
Value4.1 Ticket

11.7.3        Detailed workflow

  1. User1 (Impresario)
    • Transacts Value1 (to get Rights to MLoc1.1 (Parcel)).
    • Authors Object1 (Auditorium).
    • Transacts Value2 (to get Rights to Object1.1 (Auditorium)).
    • MM-Embeds Object1 at MLoc1.1.
    • Calls Service1 (to collect Users’ Preferences).
  2. User2 (Performer)
    • Tracks Persona1 at MLoc2.1 (Home)
    • Embeds Persona1 (AV) at MLoc2.2 (in Auditorium) w/ SA.
    • MM-Disables Persona1 from MLoc2.1.
  3. User3 (Participant #1)
    • Tracks Persona1 at MLoc3.1 (at home).
    • Transacts Value1 (buys ticket).
    • Embeds Persona1 (AV) w/ SA at MLoc3.2 (in Auditorium).
    • MM-Disables Persona1 (AV) from MLoc3.1.
  4. User4 (Participant #2)
    • Tracks Persona1 at MLoc4.1 (at home).
    • Transacts Value1 (buys ticket).
    • Embeds Persona1 (AV) w/ SA at MLoc4.2 (in Auditorium).
    • MM-Disables Persona1 (AV) from MLoc4.1.
  5. User3 (Participant #1)
    • MM-Sends Object(A)1 to Persona4.1 (Participant #2).
    • Calls Service1 (expresses preferences).
    • MM-Adds Persona1 at MLoc3.3 (close to stage for 5 minutes).
  6. User1 (Impresario)
    • MM-Disables Persona1 from MLoc3.3 (5 minutes passed).
    • Interprets Participants Status (of all participants).
    • MM-Embeds Object(A)1 (SFX).
    • Transacts Value3 to User2 (performance fees).
  7. User2 (Performer)
    • MM-Embeds Persona1 (AV) to MLoc2.1.
    • MM-Disables Persona1 from MLoc2.2.
  8. User3 (Participant #1)
    • MM-Embeds Persona1 (AV) to MLoc3.1.
    • MM-Disables Persona1 from MLoc3.2.
  9. User4 (Participant #2)
    • MM-Embeds Persona1 (AV) to MLoc4.1.
    • MM-Disables Persona1 from MLoc4.2.
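Step 6 turns the collected preferences and the Interpreted Participants Status into special effects. A minimal sketch follows; the mood labels, data structures, and the choose_sfx rule are assumptions for illustration, not specified behaviour:

```python
# Illustrative aggregation of Participants Status and preferences into an
# SFX choice. All labels and rules are assumptions.
from collections import Counter

def interpret_participants_status(statuses: list[str]) -> str:
    # Majority vote over the Personal Statuses shown by the Participants.
    return Counter(statuses).most_common(1)[0][0]

def choose_sfx(mood: str, preferences: Counter) -> str:
    # Pick the most requested effect and tune it to the prevailing mood.
    favourite = preferences.most_common(1)[0][0]
    style = "bright" if mood == "excited" else "soft"
    return f"{favourite}-{style}"

# Service1 has collected preferences; the Impresario Interprets the mood
# and MM-Embeds the resulting Object(A)1.1 (SFX).
prefs = Counter({"fireworks": 12, "laser": 7})
mood = interpret_participants_status(["excited", "excited", "calm"])
print("SFX to MM-Embed:", choose_sfx(mood, prefs))
```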

11.7.4        Workflow and Actions

 

Table 13 – Virtual Event workflow and actions.

User1 (Impresario) Transacts Value1.1 (Parcel of MLoc1.1)
Authors Object(AV)1.1 (Auditorium)
Transacts Value1.2 (Auditorium)
MM-Embeds Object(V)1.1 MLoc1.1
Calls Service1 (Collect Users’ Preferences)
User2 (Performer) Tracks Persona(AV)2.1 MLoc2.1
Embeds Persona(AV)2.1 MLoc2.2 (Auditorium) w/ SA.
MM-Disables Persona(AV)2.1 At MLoc2.1.
MM-Sends Object(A)2.1 (Performs)
User3 (Participant) Tracks Persona(AV)3.1 MLoc3.1 (at home).
Transacts Value3.1 (Buys ticket).
Embeds Persona(AV)3.1 MLoc3.2 (in Auditorium) w/ SA.
MM-Disables Persona(AV)3.1 MLoc3.1.
User4 (Participant) Tracks Persona(AV)4.1 MLoc4.1 (at home).
Transacts Value4.1 (Buys ticket).
Embeds Persona(AV)4.1 MLoc4.2 (in Auditorium) w/ SA.
MM-Disables Persona(AV)4.1 MLoc4.1.
User3 (Participant) MM-Sends Object(A)3.1 Persona4.1 (Participant #2).
Calls Service1 (Expresses preferences).
MM-Adds Persona(AV)3.1 MLoc3.3 (close to stage).
User1 (Impresario) MM-Disables Persona(AV)3.1 MLoc3.3 (after 5 min).
Calls Service1 (Collects preferences).
Interprets Participants Status.
MM-Embeds Object(A)1.1 (SFX).
Transacts Value1.3 User2 (performance fees).
User2 (Performer) MM-Embeds Persona(AV)2.1 MLoc2.1.
MM-Disables Persona2.1 MLoc2.2.
User3 (Participant) MM-Embeds Persona(AV)3.1 MLoc3.1.
MM-Disables Persona3.1 MLoc3.2.
User4 (Participant) MM-Embeds Persona(AV)4.1 MLoc4.1.
MM-Disables Persona4.1 MLoc4.2.

11.7.5        Actions, Items, and Data Types

 

Table 14 – Virtual Event Actions, Items, and Data Types.

Actions Items Data Types
Author Object (A) Amount
Interpret Object (AV) Coordinates
MM-Disable Persona (AV) Currency
MM-Embed M-Location Orientation
MM-Send Value Participants Status
Track Position
Transact Spatial Attitude

11.8        AR Tourist Guide

11.8.1        Description

In this Use Case, human3 engages the following humans:

  1. human1 to cause their User1 to buy a virtual parcel and develop a virtual landscape suitable for a tourist application.
  2. human2 to cause their User2 to develop scenes and autonomous agents for the different places of the landscape.
  3. human4 to create an app that alerts the holder of the smartphone on which the app is installed when they reach a key U-Location.
  4. human5 holding a smart phone with the app to perceive Entities and interact with Personae MM-Embedded at M-Locations and MM-Animated by autonomous agents (AA).

11.8.2        Processes, Items, and Locations

 

User1 Land developer
Object(V)1.1 Landscape MLoc1.1 parcel
Value1.1 Payment for MLoc1.1 MLoc1.1
User2 Object developer
Objects(AV)2.i Objects for landscape MLoc2.i Key ULoc twin
Value2.1 For MLoc1.1, Object(V)1.1, and Objects(AV)2.i
User3 Tourist application developer
Persona3.k Persona to be MM-Animated MLoc3.k Key ULoc twin
human4 Software developer
Map
Value4.1 For Map and App
human5 human holding Device with App
Device1 Held by human5 ULoc5.1
App1 Installed in Device1
Message5.1 From App1 to Device1

11.8.3        Detailed workflow

  1. User1
    • Buys MLoc1 (parcel) in an M-Environment.
    • Authors Object(V)1 (landscape suitable for a virtual path through n sub-MLocs).
    • Embeds Object(V)1 (landscape) at MLoc1.1 (parcel).
    • Sells Object(V)1 (landscape) and MLoc1.1 (parcel) to User2.
  2. User2
    • Authors n Object(AV)i for the MLocs.
    • MM-Embeds Object(AV)i at MLoc2.i (n places)
    • Sells parcel, landscape, and the n Object(AV)i to User3.
  3. human4
    • Develops
      • Map recording the pairs MLoci – U-Loc2.i
      • App alerting human5 that they have reached a key U-Loc.
    • Sells Map and App to human3.
  4. User3 MM-Embeds Persona(AV)j at n MLoc3.j places.
  5. human5 reaches key U-Loc1 corresponding to MLoc2.k.
  6. App1 MM-Sends Message1 to Device1.
  7. Device1
    • MM-Sends Message1 to User3.
  8. User3
    • MU-Renders Object(AV)k (MM-Embedded at MLoc2.k) at U-Loc5.1.
    • MM-Animates Persona(AV)k.
    • MU-Renders Persona(AV)k at U-Loc5.1.
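The Map and the App hinge on detecting that human5’s Device has reached a key U-Location paired with an M-Location. The sketch below implements such a check with a simple geofence test; it is illustrative only, and the coordinates, radii, and function names are assumptions:

```python
# Illustrative Map lookup and geofence check for the AR Tourist Guide.
import math

# Map recording the pairs MLoc2.i – key U-Loc (lat, lon, radius in metres).
MAP = {
    "MLoc2.1": (45.0703, 7.6869, 30.0),
    "MLoc2.2": (45.0735, 7.6850, 30.0),
}

def distance_m(lat1, lon1, lat2, lon2) -> float:
    # Haversine distance between two WGS84 points, in metres.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def key_mloc_reached(lat: float, lon: float) -> str | None:
    # Returns the M-Location whose key U-Location the Device has reached.
    for mloc, (klat, klon, radius) in MAP.items():
        if distance_m(lat, lon, klat, klon) <= radius:
            return mloc
    return None

# App1 checks the Device position and, on a match, Messages User3 so that
# it can MU-Render the Entities MM-Embedded at that M-Location.
reached = key_mloc_reached(45.07032, 7.68691)
if reached:
    print(f"Message5.1: render Entities MM-Embedded at {reached}")
```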

11.8.4        Workflow

 

Table 15 – AR Tourist Guide workflow.

Who Does What Where/comment
User1 Transacts Value1.1 MLoc1.1’s parcel.
Authors Object(V)1.1 (landscape of MLocs).
Embeds Object(V)1.1 (Parcel).
Transacts Value1.1 User2 (MLoc1.1 & Object(V)1.1).
User2 Authors Object(AV)2.1 to Object(AV)2.n (to be MM-Embedded)
Embeds Object(AV)2.1 to Object(AV)2.n MLoc2.1-2.n
Transacts Value2.1 User3 (all Object(AV)2.1-2.n)
human4 develops Map Of MLocs & ULocs
develops App
sells Map and App To human3.
User3 MM-Embeds Personae MLoc2.1-2.n.
MM-Animates Personae MLoc2.1-2.n.
human5 comes to key U-Loc2.i.
App1 MM-Sends Message5.1 Device1
Device1 MM-Sends Message5.1 User3
User3 MU-Renders Object(AV)2.k At key U-Loc5.1.
MM-Animates Persona(AV)3.k At key U-Loc5.1.
MU-Renders Persona(AV)3.k At key U-Loc5.1.

11.8.5        Actions, Items, and Data Types

 

Table 16 – AR Tourist Guide Actions, Items, and Data Types.

Actions Items Data Types
Author Object(AV) Amount
MM-Animate Object(V) Coordinates
MM-Embed Map Currency
MM-Send Message Orientation
MU-Render M-Location Position
Transact Persona Spatial Attitude
Service
U-Location
Value

11.9        Virtual Dance

11.9.1        Description

This Use Case envisages that:

  1. A dance teacher places a virtual secretary, animated by an autonomous agent, in the dance school.
  2. Student #1:
    • Shows up at school.
    • Greets the secretary.
  3. Virtual secretary reciprocates greetings.
  4. Dance teacher:
    • Places a haptic Persona of theirs in the dance school.
    • Dances with student #1.
  5. Student #2:
    • Is at home.
    • Shows up at school.
  6. The teacher:
    • Places their haptic Persona close to student #2.
    • Places another haptic Persona of theirs where the first was dancing.
    • Animates the new haptic Persona with an autonomous agent so that it keeps dancing with student #1.
    • Dances with student #2.

11.9.2        Processes, Items, and Locations

 

User1 Dance teacher
Persona(AVH)1.1 Dancing persona MLoc1.1 Teacher’s Office
Persona(AVH)1.2 Virtual Secretary MLoc1.2 Dance School
Persona(AVH)1.3 Another dancing persona MLoc1.3 Dance place
Object(A)1.1 Response to greetings MLoc1.4 Another dance place
User2 Dance student #1
Persona(AVH)2.1 Student’s Persona MLoc2.1 Student’s home
MLoc2.2 Place in dance school
Object(A)2.1 Student’s greetings
User3 Dance student #2
Persona(AVH)3.1 Student’s Persona MLoc3.1 Student’s home
  MLoc3.2 Place in dance school

11.9.3        Detailed description

  1. User1 (dance teacher)
    • Tracks Persona(AVH)1 at MLoc1.1
    • MM-Embeds Persona(AV)2 (another of its Personae) at MLoc1.2.
    • MM-Animates Persona(AV)2 (as virtual secretary attending to students coming to school).
  2. User2 (dance student #1):
    • MM-Embeds its Persona(AVH)1 at MLoc2.1 (its “home”).
    • MM-Embeds Persona(AVH)1 at MLoc2.2 (close to virtual secretary).
    • MM-Sends Object(A)1 to Persona1.2 (greets virtual secretary).
    • MM-Disables Persona(AVH)1 from MLoc2.1.
  3. User1 (Persona(AVH)2):
    • MM-Sends Object(A)1 (to student #1 reciprocating greetings).
    • MM-Sends Object(A)2 (calling teacher’s Persona1.1).
  4. Dance teacher (Persona(AVH)1):
    • MM-Embeds Persona(AVH)1 at MLoc1.3 (classroom).
    • UM-Animates Persona(AVH)1 (dances with student #1).
  5. While Persona(AVH)1 (student #1) and Persona(AVH)1.1 (teacher) dance, User3 (dance student #2):
    • MM-Embeds Persona(AVH)1 at MLoc3.1 (its “home”).
    • MM-Embeds Persona(AVH)1 at MLoc3.2 (place in classroom).
    • MM-Disables Persona(AVH)1 from MLoc3.1.
  6. After a while, User1 (dance teacher):
    • MM-Embeds Persona(AVH)1 at MLoc1.4 (close to student #2’s position).
    • MM-Disables Persona(AVH)1 from MLoc1.3 (where it was dancing with student #1).
    • MM-Embeds Persona(AVH)3 at MLoc1.3.
    • MM-Animates Persona(AVH)3 with autonomous agent (to dance with student #1).
    • UM-Animates Persona(AVH)1 at MLoc3.2 (dances with student #2).

11.9.4        Workflow

 

Table 17 – Virtual Dance workflow.

User1 (teacher) Tracks Persona(AVH)1.1 MLoc1.1
MM-Embeds Persona(AV)1.2 MLoc1.2.
MM-Animates Persona(AV)1.2 (As VS for students).
User2 (student1) Tracks Persona(AVH)2.1 MLoc2.1 (its “home”).
MM-Embeds Persona(AVH)2.1 MLoc2.2 (close to VS).
MM-Sends Object(A)2.1 Persona(AVH)1.2 (greets VS).
MM-Disables Persona(AVH)2.1 from MLoc2.1.
User1 (Persona1.2) MM-Sends Object(A)1.1 (Responds to student #1).
User1 (Persona1.1) MM-Embeds Persona(AVH)1.1 MLoc1.3 (Dance place).
UM-Animates Persona(AVH)1.1 (Dances with student #1).
User3 (student2) Tracks Persona(AVH)3.1 MLoc3.1 (its “home”).
MM-Embeds Persona(AVH)3.1 MLoc3.2 (close to VS).
MM-Disables Persona(AVH)3.1 from MLoc3.1.
User1 (teacher) MM-Embeds Persona(AVH)1.1 MLoc1.4 (near student #2).
MM-Disables Persona(AVH)1.1 From MLoc1.3.
MM-Embeds Persona(AVH)1.3 At MLoc1.3.
MM-Animates Persona(AVH)1.3 (w/ AA to dance with student #1).
UM-Animates Persona(AVH)1.1 (Dances with student #2).

11.9.5        Actions, Items, and Data Types

 

Table 18 – Virtual Dance Actions, Items, and Data Types.

Actions Items Data Types
MM-Animate M-Location Orientation
MM-Disable Object (A) Position
MM-Embed Persona (AV) Spatial Attitude
MM-Send Persona (AVH)
Track

11.10   Virtual Car Showroom

11.10.1    Description

This Use Case envisages that:

  1. A car dealer MM-Embeds an MM-Animated Persona in the car showroom (as attendant).
  2. A customer:
    • MM-Embeds its Persona in the car showroom.
    • Greets the showroom attendant.
  3. The Showroom attendant reciprocates the greeting.
  4. The dealer:
    • UM-Animates the attendant.
    • Converses with the customer.
    • Embeds a 3D AVH model of a car.
  5. The customer
    • Has a virtual test drive.
    • Buys the car.
    • Returns home.

11.10.2    Processes, Items, and Locations

 

User1 Car dealer
Persona(AV)1.1 Car dealer MLoc1.1 Car dealer’s Office
Persona(AV)1.2 Attendant MLoc1.2 Attendant’s place in showroom
Object(A)1.1 Response to greetings M-Loc1.3 Place in the showroom
Model(AVH)1.1 Model car M-Loc1.4 Location of model car
User2 Customer
Persona(AV)2.1 Customer’s Persona M-Loc2.1 Customer’s home
Object(A)2.1 Greetings M-Loc2.2 Place in showroom
Persona(AVH)2.1 User2’s Persona in test driving M-Loc2.3 Location of virtual car
Value2.1 Payment for car

11.10.3    Detailed workflow

  1. User1 (car dealer):
    • Tracks Persona(AV)1 at M-Loc1.1 (“office”).
    • MM-Embeds Persona(AV)2 at M-Loc1.2 (“showroom”) with SA.
    • MM-Animates Persona(AV)2.
  2. User2 (customer):
    • Tracks Persona(AV)1 at M-Loc2.1 (“home”).
    • MM-Embeds Persona(AV)1 at M-Loc2.2 (“in the showroom”).
    • MM-Sends Object(A)1 to Persona1.2 (greets showroom attendant).
    • MM-Disables Persona(AV)1 at M-Loc2.1 (“home”).
  3. User1 (Persona(AV)2):
    • MM-Sends Object(A)1 to Persona2.1 (responds to greetings).
  4. User1 (Persona(AV)1)
    • MM-Embeds Persona(AV)1 at M-Loc1.3 (“in the showroom”).
    • MM-Sends Object(A)2 to Persona2.1 (engages in conversation).
    • MM-Embeds Model(AVH)1 at M-Loc1.4 (model car “in the showroom”).
    • MM-Animates Model(AVH)1 (“animate model car”).
  5. User2 (customer)
    • MM-Embeds Persona(AVH)1 at M-Loc2.3 (location of virtual car).
    • UM-Animates Persona(AVH)1.
    • Transacts Value1 (buys car).
    • MM-Disables Persona(AVH)1 at M-Loc2.3.
    • MM-Embeds Persona(AV)1 at M-Loc2.1 (“at home”).

11.10.4    Workflow

Table 19 – Virtual Car Showroom workflow.

User1 (car dealer) Tracks Persona(AV)1.1 M-Loc1.1 (“office”).
MM-Embeds Persona(AV)1.2 M-Loc1.2 (“showroom”) w/ SA1.1
MM-Animates Persona(AV)1.2 (Showroom attendant).
User2 (customer) Tracks Persona(AV)2.1 M-Loc2.1 (“home”).
MM-Embeds Persona(AV)2.1 M-Loc2.2 (“showroom”).
MM-Sends Object(A)2.1 Persona1.2 (greets attendant).
MM-Disables Persona(AV)2.1 M-Loc2.1 (“home”).
User1 (Persona1.2) MM-Sends Object(A)1.1 Persona2.1 (responds to greetings).
MM-Sends Object(A)1.2 Persona1.1 (“attend customer”).
User1 (Persona1.1) MM-Embeds Persona(AVH)1.1 M-Loc1.3 (“showroom”).
MM-Sends Object(A)1.2 Persona2.1 (converses).
MM-Embeds Model(AVH)1.1 M-Loc1.4 (“in showroom”).
MM-Animates Model(AVH)1.1 (“Animate model car”).
User2 (customer) MM-Embeds Persona(AVH)2.1 M-Loc2.3 (in virtual car)
UM-Animates Persona(AVH)2.1 (Drives virtual car)
Transacts Value2.1 (Buys car).
MM-Disables Persona(AVH)2.1 M-Loc2.3.
MM-Embeds Persona(AV)2.1 M-Loc2.1 (“at home”).

11.10.5    Actions, Items, and Data Types

 

Table 20 – Virtual Car Showroom Actions, Items, and Data Types.

Actions Items Data Types
MM-Animate Object (A) Amount
MM-Disable Persona(AV) Currency
MM-Embed Persona(AVH) Orientation
MM-Send Scene (AVH) Position
Track Value Spatial Attitude
Transact
UM-Animate

11.11   Drive a Connected Autonomous Vehicle

11.11.1    Description

This Use Case considers some of the steps made by a human having rights to an implementation of Technical Specification: Connected Autonomous Vehicle (MPAI-CAV) – Architecture [6]. Chapter 7 of Annex 1 – MPAI Basic provides a high-level summary of the specification.

 

A CAV rights holder Registers with the CAV to access the CAV-created M-Instance by providing:

  1. The requested subset of their Personal profile.
  2. Two User Processes required to operate a CAV:
    • User1 to operate the Human-CAV Interaction Subsystem.
    • User2 to operate the Autonomous Motion Subsystem.
  3. User1’s Personae.

 

For simplicity, the Use Case assumes that there are two CAVs, CAVA and CAVB, and that the CAVA rights holder (UserA.1) wants to see the CAVB Environment in the CAVB M-Instance:

  1. User1
    • Authenticates the human’s voice.
    • Interprets driving instructions from human.
    • Communicates driving instructions to User2.
  2. User2
    • Gets information about CAVA
    • Gets travel options from Route Planner.
    • Communicates travel options to User1.
  3. User1
    • Produces Speech Object with travel options.
  4. human utters selected option to User1.
  5. User1
    • Interprets driving instructions from human.
    • Communicates driving instructions to User2.
  6. User2
    • Gets the Basic Environment Representation from its ESS.
    • Authenticates its peer UserB.2.
    • Gets elements of the Basic Environment Representation from UserB.2.
    • Produces Full Environment Representation.
    • Sends a command to the Ego CAV’s Motion Actuation Subsystem.
  7. User1
    • Authenticates its peer UserB.1.
    • Watches CAVB’s Environment.

11.11.2    Processes, Items, and Locations

 

UserA.1 CAVA’s HCI
humanA CAVA’s rights holder MLocA.1.1 Corresponding to ULocA.1.1
Object(A)A.1.1 human utterance #1 ULocA.1.1 Close to CAVA
Object(A)A.1.2 UserA.1’s utterance #1 MLocA.1.2 Inside CAVA
HCI-AMSCommandA.1.1 Travel request
Object(A)A.1.3 UserA.1’s utterance #2
Object(A)A.1.4 human utterance #2
HCI-AMSCommandA.1.2 Travel selection
Request-AuthenticateA.1.1 UserB.1 Authentication
Response-AuthenticateA.1.1 UserB.1 Authentication
UserA.2 CAVA’s AMS
AMS-HCIResponseA.2.1 Route selection SceneA.2.1 CAVA’s Environment
Request-AuthenticateA.2.1 UserB.2 Authentication
Response-AuthenticateA.2.1 UserB.2 Authentication
UserB.2 CAVB’s AMS SceneB.2.1 CAVB’s Environment
UserB.1 CAVB’s HCI

11.11.3    Detailed workflow

  1. humanA Registers with CAVA.
  2. User1
    • Tracks Persona1.1 at M-LocA.1.1 (connects CAVA’s M-LocA.1.1 with U-LocA.1.1).
    • Authenticates Object(A)1.1 (humanA’s request to travel).
    • Interprets Object(A)1.1.
    • MM-Sends HCI-AMSCommand1.1 to UserA.2.
  3. User2
    • MM-Sends ESS’s Scene2.1 to the Route Planner.
    • MM-Sends Route2.1 to UserA.1.
  4. User1
    • MU-Renders Object(A)1.2 (to humanA).
    • UM-Renders Object(A)1.3 (humanA’s Route selection).
    • Interprets Object(A)1.3 (understand Route).
    • MM-Sends HCI-AMSCommand1.2 to UserA.2.
  5. User2
    • Authenticates UserB.2.
    • MM-Sends
      • ESS’s Scene2.2 to the Environment Representation Fusion (ERF).
      • Scene2.3 at M-LocA.2.1 (in CAVB’s M-Instance) to the ERF.
      • Path2.1 to the Motion Planner.
      • Trajectory2.1 to the Obstacle Avoider.
      • Trajectory2.1 to the Command Issuer.
      • AMS-MASCommand2.1 to the Motion Actuation Subsystem.
    • Receives MAS-AMSResponse2.1 from the Motion Actuation Subsystem.
  6. User1
    • Authenticates its peer UserB.1.
    • MU-Renders Scene(AV)1.4 (CAVB’s Environment received from UserB.1).
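The exchange between the two Users can be pictured as typed messages. The following sketch is non-normative: the message names follow the Items of this Use Case (HCI-AMSCommand, AMS-HCIResponse, AMS-MASCommand, MAS-AMSResponse), while the payload fields are assumptions rather than the MPAI-CAV data formats:

```python
# Illustrative HCI–AMS–MAS message flow. Field contents are assumptions.
from dataclasses import dataclass

@dataclass
class HCI_AMSCommand:        # e.g., HCI-AMSCommandA.1.x
    kind: str                # "travel request" or "travel selection"
    payload: str             # e.g., destination or selected Route id

@dataclass
class AMS_HCIResponse:       # e.g., AMS-HCIResponseA.2.1
    routes: list[str]        # travel options from the Route Planner

@dataclass
class AMS_MASCommand:        # e.g., AMS-MASCommandA.2.1
    trajectory: str          # Trajectory issued by the Command Issuer

@dataclass
class MAS_AMSResponse:       # e.g., MAS-AMSResponseA.2.1
    status: str              # execution report from the MAS

# UserA.1 (HCI) interprets humanA's utterance and commands UserA.2 (AMS):
cmd = HCI_AMSCommand("travel request", "railway station")
resp = AMS_HCIResponse(routes=["Route2.1", "Route2.2"])   # options to render
sel = HCI_AMSCommand("travel selection", resp.routes[0])  # humanA's choice
mas_cmd = AMS_MASCommand(trajectory="Trajectory2.1")
mas_resp = MAS_AMSResponse(status="executing")
print(cmd, sel, mas_cmd, mas_resp, sep="\n")
```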

11.11.4    Workflow

Table 21 – Drive a Connected Autonomous Vehicle workflow.

Who Does What Where/(comment)
humanA Registers (With CAVA).
UserA.1 Authenticates Object(A)A.1.1 (Recognises humanA’s voice).
Interprets ObjectA.1.1(A) (humanA’s request to go).
MM-Sends HCI-AMSCmdA.1.1 UserA.2.
UserA.2 MM-Sends ESS’s SceneA.2.1 Route Planner.
MM-Sends AMS-HCIRespA.2.1 RouteA.2.1 to UserA.1
UserA.1 MU-Renders ObjectA.1.2 (A) (To humanA).
UM-Renders ObjectA.1.3 (A) (Route selection).
Interprets ObjectA.1.3 (A) (Understand Route).
MM-Sends HCI-AMSCmdA.1.2 UserA.2
UserA.2 Authenticates UserB.2
MM-Sends ESS’s SceneA.2.2 (To ERF).
PathA.2.1 Motion Planner.
TrajectoryA.2.1 Obstacle Avoider.
TrajectoryA.2.1 Command Issuer.
AMS-MASCmdA.2.1 MAS.
MAS-AMSRespA.2.1 (From MAS).
UserA.1 Authenticates UserB.1.
MU-Renders SceneA.1.1 (CAVB’s Environment).

11.11.5    Actions, Items, and Data Types

Note: The MPAI-CAV specific Items are included.

 

Table 22 – Drive a Connected Autonomous Vehicle Actions, Items, and Data Types.

Actions Items Data Types
Authenticate AMS-HCIResponse Spatial Attitude
Interpret AMS-MASCommand Coordinates
MM-Embed Environment Representation Orientation
MM-Send HCI-AMSCommand Position
MU-Render MAS-AMSResponse
Register M-Location
Request Object (A)
Track Path
UM-Render Persona
Route
Scene
Trajectory

 

 
