(Informative)
In an online Authoritative Multiplayer Game, each player uses a client to send control data to a server. The server updates the current Game State with the data from all clients and then broadcasts it to all clients. The data originating from a client may be properly or maliciously generated, and may be properly received or not received at all. When the data is maliciously generated or not received, the Game State that clients receive from the server does not describe a correct and consistent situation.
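As an informal illustration of this loop, the following Python sketch shows an authoritative server tick under simplified assumptions; all names, the two-dimensional position model, and the per-tick structure are hypothetical and are not defined by this document. The server applies the control data received from each client and produces the Game State that is then broadcast.

from dataclasses import dataclass, field

@dataclass
class GameState:
    tick: int = 0
    positions: dict = field(default_factory=dict)   # player_id -> (x, y)

def server_tick(state: GameState, inputs: dict) -> GameState:
    # inputs: player_id -> (dx, dy) control data received from clients for this tick
    for player_id, (dx, dy) in inputs.items():
        x, y = state.positions.get(player_id, (0.0, 0.0))
        state.positions[player_id] = (x + dx, y + dy)
    state.tick += 1
    return state    # this updated Game State is broadcast to all clients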
Among the most widespread game network architectures, Authoritative Servers maintain consistency among all connected clients (i.e., the game instances executed locally by players) by acting as the central arbiter and controller of the Game State.
Authoritative Server architectures are not immune to the challenges posed by network problems, especially Latency, i.e., the delay incurred by data transmitted from a player instance to the game server, which can disrupt the seamless flow of gameplay.
There are several techniques [1] currently used to mitigate this situation. In Client Prediction [2, 3], the client Game State is updated locally using predicted or interpolated data while waiting for the server data; in Time Delay [4, 5, 6], the server buffers the Game State updates to synchronise all clients; and in Time Warp [7, 3, 6] the server rolls back the Game State to the time when the control data was sent by a client and acts as if the action had been taken then, reconciling this new Game State with the current Game State. These methods have shortcomings: Client Prediction causes perceptible delay, Time Delay affects responsiveness, and Time Warp disadvantages other players because the new Game State likely differs from the previous one.
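As an informal illustration of the Time Warp approach mentioned above, the sketch below rolls the Game State back to the tick at which late control data was generated and re-simulates forward to reconcile it with the current Game State. The function name, the history structures, and the step() callback are hypothetical and are not defined by this document.

def time_warp(state_history, input_history, current_tick, input_tick, player_id, control, step):
    # state_history: tick -> Game State at the start of that tick
    # input_history: tick -> {player_id: control} applied during that tick
    # step(state, inputs) advances the simulation by one tick and returns the new state
    input_history[input_tick][player_id] = control   # place the late control data at its original tick
    state = state_history[input_tick]                # roll back to the Game State at that tick
    for tick in range(input_tick, current_tick):
        state = step(state, input_history[tick])     # re-simulate forward with the corrected inputs
        state_history[tick + 1] = state              # overwrite the snapshots that followed
    return state                                     # reconciled Game State, broadcast to all clients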
Technical Report: Server-based Predictive Multiplayer Gaming (MPAI-SPG) – Mitigation of Data Loss Effects (SPG-MDL) V1.0 describes the steps of a methodology that relies on server-level prediction techniques. Specifically, when Latency-affected clients are detected, the server uses a predictive model to forecast player actions based on historical data and the current Game State. These predictions are then shared with all clients, ensuring a continuous and unified gaming experience. The approach leverages Machine Learning (ML) algorithms, an area of research that has only recently been explored in the context of Latency mitigation strategies for online multiplayer games. Furthermore, the server could also identify possible cheating attempts by comparing predictions with the current Game State, especially when clients have greater control over their local instances (i.e., Client Prediction).
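By way of illustration only, the sketch below shows where such a predictive model could sit in the server loop: when control data from a Latency-affected client is missing, a predicted action is substituted before the normal authoritative update. The predict_action model and all other names are hypothetical assumptions; SPG-MDL does not mandate this interface.

def collect_inputs(connected_players, received, history, game_state, predict_action):
    # received: player_id -> control data that arrived in time for the current tick
    # history: player_id -> sequence of past control data
    # predict_action(history, game_state): a trained ML model returning a forecast action
    inputs = {}
    for player_id in connected_players:
        if player_id in received:
            inputs[player_id] = received[player_id]
        else:
            # Latency-affected client: forecast its action from historical data and the current Game State
            inputs[player_id] = predict_action(history[player_id], game_state)
    return inputs   # fed to the normal authoritative update, whose result is broadcast to all clients

The same predictions could later be compared with the control data that eventually arrives, which is one way the cheating check described above could be realised.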
Capitalised Terms used in all Chapters and Sections are defined in Table 1 if they are specific to this Technical Report. All MPAI-defined Terms are accessible online.