
1. Definition

The focused status data, obtained by observing the individual and collective behaviour of Participants in the Real Environment and by interpreting the Descriptors relevant to the current Cue Point, covering behaviour, Emotion, Cognitive State, and Social Attitude, as needed by:

  1. The Action Generation AIM to service the current Cue Point.
  2. The Cue Point Identification AIM to trigger the next Cue Point.

This data is derived from Components of the Real Environment and describes the relevant Participants in the Real Environment.

2. Functional Requirements

  1. Gesture descriptors:
    1. Raising arms.
    2. Waving.
    3. Jumping.
    4. Pointing in a direction.
    5. Dancing.
  2. Visual activity:
    1. Hands waving left to right/right to left.
    2. Participants standing or sitting.
    3. Participants clapping.
  3. Social clustering (see the sketch after this list):
    1. Coordinates of cluster centroids.
    2. Variances along the three principal axes.
    3. Percentage of total Participants in each cluster.
    4. Identity of individual Participant within each cluster.
    5. Distance of individual Participant from the centroid.
  4. Objects in field of vision and gaze direction:
    1. List of objects/performers either present or represented in the Real Environment, or their components, that are being observed (granularity of target set by Script).
    2. Percent of Participants observing a particular object/performer/component.
  5. Participant audio activity:
    1. Number of Participants doing a certain activity per area/zone:
      1. Laughing, Clapping, Booing, Shouting, Singing.
      2. Uttering a Text.
    2. Intensity of the activity per Participant per area/zone.
    3. Number of Participants uttering a particular phrase/text per area/zone.
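
Requirement 3 mixes several geometric quantities, so a worked sketch may help. The Python fragment below is illustrative only: the function and field names are inventions of this sketch, not of this specification, and it assumes 3D Participant positions whose cluster membership has already been computed.

import numpy as np

def cluster_descriptors(positions, labels):
    """positions: (N, 3) array of Participant coordinates in the Real
    Environment; labels: (N,) array of cluster indices per Participant."""
    positions = np.asarray(positions, dtype=float)
    labels = np.asarray(labels)
    total = len(positions)
    result = {}
    for cluster_id in np.unique(labels):
        member_idx = np.flatnonzero(labels == cluster_id)
        members = positions[member_idx]
        centroid = members.mean(axis=0)
        # Variances along the three principal axes are the eigenvalues of
        # the covariance matrix (assumes at least two members per cluster).
        cov = np.cov(members, rowvar=False)
        principal_variances = np.sort(np.linalg.eigvalsh(cov))[::-1]
        result[int(cluster_id)] = {
            "centroid": centroid,                        # requirement 3.1
            "principal_variances": principal_variances,  # requirement 3.2
            "percentage": 100.0 * len(members) / total,  # requirement 3.3
            "member_distances": {                        # requirements 3.4, 3.5
                int(i): float(np.linalg.norm(positions[i] - centroid))
                for i in member_idx
            },
        }
    return result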

3. Syntax
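
The normative Syntax is not reproduced here. Purely as an informal illustration of how the fields of the Semantics table below nest, the following Python sketch models Participant Status as dataclasses; every field rendering and type choice is an assumption of this sketch, not part of the specification.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ClusterParticipant:
    participant_id: str              # ParticipantID
    position: List[float]            # ParticipantPosition (x, y, z assumed)

@dataclass
class SocialCluster:
    cluster_id: str                  # ClusterID
    participants: List[ClusterParticipant]
    centroid: List[float]            # ClusterCentroidCoord
    # NoOfClusterParticipants is derivable as len(participants).

@dataclass
class Target:
    target_id: str                   # TargetID
    location: str                    # Location that includes the Target
    percentage: float                # Percent of Participants observing it

@dataclass
class ActivityByLocation:
    location_id: str                 # LocationID (area/zone)
    participant_intensity: float     # ParticipantIntensity
    text: str                        # Text uttered in the area/zone

@dataclass
class ParticipantStatus:
    m_instance_id: str               # MInstanceID
    participant_status_id: str       # ParticipantStatusID
    space_time: str                  # SpaceTime
    gesture_attributes: str          # e.g. "Waving"
    hands_waving: bool               # VisualActivity sub-fields
    stand_or_sit: str
    clapping: bool
    social_clustering: List[SocialCluster] = field(default_factory=list)
    targets: List[Target] = field(default_factory=list)
    audio_attributes: str = ""       # e.g. "Singing"
    uttered_speech: str = ""
    activity_by_location: List[ActivityByLocation] = field(default_factory=list)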

4. Semantics

Label Size Description
Header N1 Bytes Header of the Participant Status data.
– Standard 9 Bytes The characters “XRV-RTS-V”.
– Version N2 Bytes Major version – 1 or 2 characters.
– Dot-separator 1 Byte The character “.”.
– Subversion N3 Bytes Minor version – 1 or 2 characters.
MInstanceID N4 Bytes Identifier of M-Instance.
ParticipantStatusID N5 Bytes Identifier of Participant Status.
SpaceTime N6 Bytes Space-Time info of Participant Status.
GestureAttributes N7 Bytes One of: Raising arms, Waving, Jumping, Pointing in a direction, Dancing.
VisualActivity N8 Bytes Data related to Visual Activity.
– HandsWaving N9 Bytes Hands waving left to right/right to left.
– StandOrSit N10 Bytes Participants standing or sitting.
– Clapping N11 Bytes Participants clapping.
SocialClustering[] N12 Bytes Data related to Social Clustering.
– ClusterID N13 Bytes ID of Cluster.
– Participants[] N14 Bytes Data of Participants in Cluster.
  – ParticipantID N15 Bytes ID of Participant.
  – ParticipantPosition N16 Bytes Position of Participant.
– NoOfClusterParticipants N17 Bytes Number of Participants in Cluster.
– ClusterCentroidCoord N18 Bytes Coordinates of Cluster centroid.
GazeDirection N19 Bytes Data related to Participants’ gaze.
– Objects[] N20 Bytes Objects present/represented in the Real Environment, or their components, that are being observed.
  – ObjectID N21 Bytes ID of individual Object.
– Entities[] N22 Bytes Entities present/represented in the Real Environment, or their components, that are being observed.
  – EntityID N23 Bytes ID of individual Entity.
– Targets[] N24 Bytes Objects/performers/components being observed.
  – TargetID N25 Bytes Specific object/performer/component being observed.
  – Location N26 Bytes Location that includes the Target.
  – Percentage N27 Bytes Percent of Participants observing TargetID.
AudioAttributes N28 Bytes One of: Speaking, Laughing, Clapping, Booing, Shouting, Singing.
UtteredSpeech N29 Bytes Speech uttered by Participants.
ActivityByLocation[] N30 Bytes Audio activity data per area/zone.
– LocationID N31 Bytes ID of the area/zone.
– ParticipantIntensity N32 Bytes Intensity of Participant activity in the area/zone.
– Text N33 Bytes Text/phrase uttered by Participants in the area/zone.
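
Because Version and Subversion are each 1 or 2 characters long, the Header cannot be split at fixed offsets past the 9-byte magic string. As a minimal sketch, assuming the Header is presented on its own, the following Python function (hypothetical, non-normative) recovers the two version numbers:

def parse_header(header: bytes) -> tuple[int, int]:
    """Split a Header of the form b"XRV-RTS-V" + Major + b"." + Minor."""
    magic = b"XRV-RTS-V"                                      # Standard, 9 Bytes
    if not header.startswith(magic):
        raise ValueError("not a Participant Status Header")
    major, sep, minor = header[len(magic):].partition(b".")   # Dot-separator
    if sep != b"." or not (1 <= len(major) <= 2 and 1 <= len(minor) <= 2):
        raise ValueError("Version/Subversion must be 1 or 2 characters")
    return int(major), int(minor)

assert parse_header(b"XRV-RTS-V2.1") == (2, 1)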