1. Definition
2. Functional Requirements
3. Syntax
4. Semantics

1. Definition

The status data obtained by interpreting the Descriptors of the individual and collective behaviour of Participants in the Real Environment, as needed by the Action Generation AIM to service the current Cue Point.

2. Functional Requirements

  1. Collective clustering:
    1. Spatial info
      1. Coordinates of cluster centroids.
      2. Variances along the three principal axes.
      3. Percentage of total Participants in each cluster.
      4. Identity of each individual Participant within the cluster.
      5. Distance of each individual Participant from the centroid.
    2. Cluster properties
      1. Raising arms
      2. Stadium wave
      3. Jumping
      4. Pointing to a direction
      5. Dancing
      6. Hands waving left to right/right to left
      7. Participants standing or sitting
      8. Participants clapping
      9. Laughing, Booing, Shouting, Singing.
      10. Uttering a Text
      11. Relevant objects in gaze direction per the Script.
  2. Individual Status
    1. Participant ID
    2. Participant behaviour:
      1. Raising arms
      2. Jumping
      3. Pointing to a direction
      4. Dancing
      5. Hands waving left to right/right to left
      6. Standing or sitting
      7. Clapping
      8. Laughing, Booing, Shouting, Singing.
      9. Uttering a Text
      10. Waving
    3. Behaviour Intensity
    4. Relevant objects in gaze direction per the Script.
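The Individual Status and Collective Clustering requirements above can be sketched as data structures. This is a non-normative illustration: all field names, types, and the 0-to-1 intensity scale are assumptions for readability, not identifiers defined by the standard.

```python
from dataclasses import dataclass, field

# Non-normative sketch of the Functional Requirements above.
# All names and the 0..1 intensity scale are illustrative assumptions.

@dataclass
class IndividualStatus:
    participant_id: str                                     # 2.1 Participant ID
    behaviours: list[str] = field(default_factory=list)     # 2.2 e.g. "Jumping", "Clapping"
    behaviour_intensity: float = 0.0                        # 2.3 assumed normalised to 0..1
    gazed_objects: list[str] = field(default_factory=list)  # 2.4 relevant objects per the Script

@dataclass
class Cluster:
    centroid: tuple[float, float, float]        # 1.1.1 coordinates of the cluster centroid
    axis_variances: tuple[float, float, float]  # 1.1.2 variances along the three principal axes
    participant_share: float                    # 1.1.3 percentage of total Participants
    # 1.1.4 / 1.1.5: ParticipantID -> distance from the centroid
    member_distances: dict[str, float] = field(default_factory=dict)
    properties: list[str] = field(default_factory=list)     # 1.2 e.g. "StadiumWave", "Dancing"
```

An Action Generation AIM consuming Participant Status would then read, per Cue Point, one such record per Participant plus the current list of Clusters.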

3. Syntax

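The normative syntax is not reproduced in this excerpt. As an orientation aid, the sketch below assembles an illustrative JSON-style instance using labels from the Semantics table; every value (identifiers, version numbers, coordinates) is a made-up example, not normative data.

```python
import json

# Illustrative instance only: labels follow the Semantics table,
# but every value here is a made-up example, not normative data.
participant_status = {
    "Header": {"Standard": "XRV-RTS-V", "Version": "2", "Subversion": "0"},
    "MInstanceID": "mi-0001",            # example identifier
    "ParticipantStatusID": "ps-0042",    # example identifier
    "GestureAttributes": "Raising arms", # one of the listed gesture values
    "SocialClustering": [
        {
            "ClusterID": "c-01",
            "Participants": [
                {"ParticipantID": "p-001", "ParticipantIDPosition": [1.0, 0.0, 2.0]}
            ],
            "NoOfClusterParticipant": 1,
            "ClusterCentroidCoord": [1.0, 0.0, 2.0],
        }
    ],
}

encoded = json.dumps(participant_status)
```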
4. Semantics

Label Size Description
Header N1 Bytes Header
– Standard- 9 Bytes The characters “XRV-RTS-V”
– Version N2 Bytes Major version – 1 or 2 characters
– Dot-separator 1 Byte The character “.”
– Subversion N3 Bytes Minor version – 1 or 2 characters
MInstanceID N4 Bytes Identifier of M-Instance.
ParticipantStatusID N5 Bytes Identifier of Participant Status.
SpaceTime N6 Bytes Space-Time info of Participant Status.
GestureAttributes N7 Bytes oneOf: Raising arms, Waving, Jumping, Pointing to a direction, Dancing.
VisualActivity N8 Bytes Visual Activity related data.
– HandsWaving N9 Bytes Hands waving left to right/right to left
– StandOrSit N10 Bytes Participants standing or sitting
– Clapping N11 Bytes Participants clapping
SocialClustering[] N12 Bytes Social Clustering related data.
– ClusterID N13 Bytes ID of Cluster
– Participants[] N14 Bytes Data of Participants in Cluster
  – ParticipantID N15 Bytes ID of Participant
  – ParticipantIDPosition N16 Bytes Position of Participant
– NoOfClusterParticipant N17 Bytes Number of Participants in Cluster
– ClusterCentroidCoord N18 Bytes Coordinates of ClusterCentroid
GazeDirection N19 Bytes Data related to Participants’ gaze.
– Objects[] N20 Bytes Objects Present/Represented in the RE, or their components, that are being observed.
  – ObjectID N21 Bytes ID of individual Object.
– Entities[] N22 Bytes Entities Present/Represented in the RE, or their components, that are being observed.
  – EntityID N23 Bytes ID of individual Entity.
– Targets[] N24 Bytes Objects/performers/components being observed.
  – TargetID N25 Bytes Specific object/performer/component being observed.
  – Location N26 Bytes Location including the Target.
  – Percentage N27 Bytes Percentage of Participants observing the TargetID.
AudioAttributes N28 Bytes oneOf: Speaking, Laughing, Clapping, Booing, Shouting, Singing.
UtteredSpeech N29 Bytes Speech uttered by Participants.
ActivityByLocation[] N30 Bytes Activity data organised by Location.
– LocationID N31 Bytes ID of Location.
– ParticipantIntensity N32 Bytes Intensity of Participant activity at the Location.
– Text N33 Bytes Text uttered by Participants at the Location.
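Per the Header rows of the table, a Participant Status instance begins with the characters “XRV-RTS-V”, a Major version, the dot separator, and a Minor version. A minimal sketch of building and parsing that header follows; the function names are mine, and since the widths N2 and N3 are variable (1 or 2 characters), the parser splits on the dot rather than assuming fixed field sizes.

```python
# Hedged sketch of the Header layout in the table above:
# "XRV-RTS-V" + Version + "." + Subversion.
STANDARD = b"XRV-RTS-V"

def build_header(major: int, minor: int) -> bytes:
    """Serialise the Header fields into bytes."""
    return STANDARD + str(major).encode() + b"." + str(minor).encode()

def parse_header(buf: bytes) -> tuple[int, int]:
    """Recover (Version, Subversion); N2/N3 are variable, so split on the dot."""
    if not buf.startswith(STANDARD):
        raise ValueError("not a Participant Status header")
    major, _, minor = buf[len(STANDARD):].partition(b".")
    return int(major), int(minor)
```

For example, `build_header(2, 0)` yields the 12-byte sequence `XRV-RTS-V2.0`.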