
(Informative)

In an online Authoritative Multiplayer Game, each player uses a client to send control data to a server. The server updates the current Game State with the data from all clients and then broadcasts it to all clients. The data originating from a client, however, may be maliciously rather than properly generated, or it may not be received at all. In either case, the Game State broadcast by the server does not describe a correct and consistent situation.

Among the most widespread game network architectures, Authoritative Servers maintain consistency among all connected clients (i.e., the game instances executed locally by players), acting as the central arbiter and controller of the Game State. By processing the data received from all players, the server computes a new Game State, which is then distributed to all clients. This model ensures consistent game progress and gameplay integrity across all clients.
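The loop below is a minimal sketch of the authoritative-server model just described, written in Python for illustration only: the server collects Client Data from every connected client, advances the Game State, and broadcasts the result. All names (GameState, ClientData, AuthoritativeServer, poll_input, send) are assumptions of this sketch and are not defined by this Technical Specification.

```python
# Illustrative sketch of an authoritative-server tick: collect Client Data,
# advance the Game State, and broadcast the new state to every client.
# All names and fields are assumptions of this sketch, not normative.
from dataclasses import dataclass, field


@dataclass
class ClientData:
    player_id: int
    controls: dict              # e.g. {"move_x": 1.0, "fire": True}
    tick: int = 0               # client tick at which the input was produced


@dataclass
class GameState:
    tick: int = 0
    players: dict = field(default_factory=dict)   # player_id -> per-player data

    def apply(self, cd: ClientData) -> None:
        # Update the portion of the state owned by this player.
        player = self.players.setdefault(cd.player_id, {"x": 0.0})
        player["x"] += cd.controls.get("move_x", 0.0)


class AuthoritativeServer:
    def __init__(self, clients):
        self.clients = clients          # objects exposing poll_input() and send()
        self.state = GameState()

    def tick(self) -> None:
        # 1. Gather whatever Client Data has arrived for this tick.
        for client in self.clients:
            cd = client.poll_input()    # may return None if nothing arrived
            if cd is not None:
                self.state.apply(cd)
        # 2. Advance the simulation, then 3. broadcast the new Game State.
        self.state.tick += 1
        for client in self.clients:
            client.send(self.state)
```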

Authoritative Server architectures are not immune to the challenges posed by network problems, especially Latency, i.e., the delay incurred by data transmitted from a player's game instance to the game server. Latency is particularly critical because it can disrupt the seamless flow of gameplay, introducing inconsistencies that adversely affect the gaming experience.

Many network games employ a technique called Client Prediction, in which the client updates the game using predicted data while waiting for a response from the server. The Client uses a prediction system, or interpolates previous server information, to replace the missing data. The player perceives no Latency until communication with the server is restored and the Client's Game State is synchronised with the server's. This solution is acceptable for the player experiencing network delays, but all the other clients still perceive that player's Latency. On the server, the state of the player experiencing Latency is frozen until new information is received.
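The following sketch illustrates, reusing the illustrative GameState and ClientData types from the previous sketch, how Client Prediction is commonly realised: the client applies its own inputs immediately, keeps the unacknowledged ones in a buffer, and re-applies them after receiving an authoritative Game State (often called reconciliation). PredictingClient, pending, and last_acked_tick are names assumed only for this illustration.

```python
# Illustrative sketch of Client Prediction with reconciliation; it assumes
# the GameState and ClientData types defined in the previous sketch.
class PredictingClient:
    def __init__(self, network):
        self.network = network          # object exposing send(ClientData)
        self.local_state = GameState()
        self.pending = []               # inputs sent but not yet acknowledged

    def on_local_input(self, cd: ClientData) -> None:
        self.network.send(cd)           # forward the Client Data to the server
        self.pending.append(cd)
        self.local_state.apply(cd)      # prediction: apply the input locally now

    def on_server_state(self, server_state: GameState, last_acked_tick: int) -> None:
        # Replace the local prediction with the authoritative Game State, then
        # re-apply the inputs the server has not processed yet (reconciliation).
        self.local_state = server_state
        self.pending = [cd for cd in self.pending if cd.tick > last_acked_tick]
        for cd in self.pending:
            self.local_state.apply(cd)
```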

Latency has two main consequences. First, the players affected by it experience a disruption in responsiveness: they must wait for server responses to update their local Game State (GS), which leads to perceptible delays that disrupt the smooth flow of gameplay. Second, as the server receives delayed Client Data (CD), the Game State on the server becomes inconsistent, compromising the game experience for the clients unaffected by Latency.

As described in the Introduction, the most widely used technique for Latency mitigation is Client Prediction. It reduces the Latency perceived by the affected players, but the unaffected ones still receive an inconsistent Game State from the server due to the missing data from one client. A well-known method for tackling this issue is Time Delay [1]. This technique buffers Game State updates to synchronise all clients, fostering a more uniform gaming experience. While Time Delay has demonstrated its effectiveness in eliminating state inconsistencies [2], [3], it is also acknowledged that this approach can result in decreased responsiveness [4].
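A minimal sketch of the Time Delay idea follows, assuming a fixed delay expressed in ticks: Game State updates are queued and released only after the delay has elapsed, so that all clients observe the same update at approximately the same time. DELAY_TICKS and the queue-based design are illustrative choices of this sketch, not part of this Technical Specification.

```python
# Illustrative sketch of Time Delay: buffer Game State updates and release
# them only after a fixed number of ticks, keeping all clients in step.
from collections import deque

DELAY_TICKS = 6                          # e.g. 6 ticks ~= 100 ms at 60 Hz


class TimeDelayBuffer:
    def __init__(self, delay_ticks: int = DELAY_TICKS):
        self.delay = delay_ticks
        self.queue = deque()             # entries: (release_tick, game_state)

    def push(self, current_tick: int, state) -> None:
        # Schedule the update to become visible only after the delay.
        self.queue.append((current_tick + self.delay, state))

    def pop_ready(self, current_tick: int):
        # Release every update whose scheduled tick has been reached.
        ready = []
        while self.queue and self.queue[0][0] <= current_tick:
            ready.append(self.queue.popleft()[1])
        return ready
```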

Another common solution is Time Warp. When the server receives messages from a client, it analyses the impact of that player’s actions by reverting the Game State to the one perceived by the client when the message was sent. However, there are instances where the new Game State may differ significantly from the previous one, resulting in an unfair disadvantage for other players.
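The sketch below illustrates the Time Warp idea under the assumption that the server keeps a bounded history of Game State snapshots: a late message is applied to the snapshot the client actually perceived, and the intervening ticks are then re-simulated. It builds on the illustrative AuthoritativeServer above; HISTORY_LENGTH and the helper inputs_recorded_at() are placeholders introduced only for this illustration.

```python
# Illustrative sketch of Time Warp (rollback): keep past Game State snapshots,
# rewind to the state the client saw, apply its late input, and re-simulate.
import copy

HISTORY_LENGTH = 120                     # keep ~2 s of history at 60 Hz


class TimeWarpServer(AuthoritativeServer):
    def __init__(self, clients):
        super().__init__(clients)
        self.history = {}                # tick -> snapshot of the Game State

    def snapshot(self) -> None:
        self.history[self.state.tick] = copy.deepcopy(self.state)
        self.history.pop(self.state.tick - HISTORY_LENGTH, None)

    def on_late_input(self, cd: ClientData) -> None:
        past = self.history.get(cd.tick)
        if past is None:
            return                       # too old to rewind: drop the input
        # Rewind to the state the client perceived, apply its action,
        # then re-simulate up to the present tick.
        rewound = copy.deepcopy(past)
        rewound.apply(cd)
        for tick in range(cd.tick + 1, self.state.tick + 1):
            for replay_cd in self.inputs_recorded_at(tick):
                rewound.apply(replay_cd)
            rewound.tick = tick
        self.state = rewound

    def inputs_recorded_at(self, tick: int):
        # Placeholder: a real server would store the inputs applied at each tick.
        return []
```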

This document describes the steps and illustrates the application of “Server-based Predictive multiplayer Gaming (SPG)”, a methodology that involves server-level prediction techniques. Specifically, when Latency-affected clients are detected, the server uses a predictive model to forecast player actions based on historical data and the current Game State. These predictions are then shared with all clients, ensuring a continuous and unified gaming experience even when real-time data is not available at the expected time. The approach leverages Machine Learning (ML) algorithms, an area of research that has only recently been explored in the context of Latency mitigation strategies for online multiplayer games. Furthermore, the server can also identify possible cheating attempts by comparing its predictions with the current Game State, especially when clients have greater control over their local instances (i.e., with Client Prediction).
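As an illustration of the server-level prediction just described, the sketch below substitutes a model-generated input whenever Client Data is missing at the expected tick and compares reported inputs with model predictions as a simple plausibility check. The PlayerModel interface, the naive prediction it performs, the assumption that clients expose a player_id, and the cheat-detection threshold are all assumptions of this sketch; SPG does not mandate a specific ML model, and this sketch does not represent the normative SPG workflow.

```python
# Illustrative sketch of server-side prediction in the spirit of SPG; it
# reuses the illustrative AuthoritativeServer, GameState and ClientData above.
class PlayerModel:
    """Placeholder predictive model trained on a player's historical inputs."""

    def predict(self, history: list, state: GameState) -> ClientData:
        # Placeholder: a real model (e.g. a recurrent network) would infer
        # the next input; here we simply repeat the last observed one.
        last = history[-1]
        return ClientData(last.player_id, dict(last.controls), state.tick)


class SPGServer(AuthoritativeServer):
    def __init__(self, clients, models):
        super().__init__(clients)
        self.models = models             # player_id -> PlayerModel
        self.history = {pid: [] for pid in models}

    def tick(self) -> None:
        for client in self.clients:      # clients assumed to expose player_id
            cd = client.poll_input()
            if cd is None and self.history[client.player_id]:
                # Latency detected for this player: substitute a predicted input.
                cd = self.models[client.player_id].predict(
                    self.history[client.player_id], self.state)
            if cd is not None:
                self.history[client.player_id].append(cd)
                self.state.apply(cd)
        self.state.tick += 1
        for client in self.clients:
            client.send(self.state)      # predictions reach all clients

    def check_for_cheating(self, player_id: int, reported: ClientData) -> bool:
        # Compare the reported input with what the model considers plausible;
        # a large deviation may indicate tampering (threshold is illustrative).
        predicted = self.models[player_id].predict(
            self.history[player_id], self.state)
        deviation = abs(reported.controls.get("move_x", 0.0)
                        - predicted.controls.get("move_x", 0.0))
        return deviation > 5.0
```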

In all Chapters and Sections, Terms beginning with a capital letter are defined in Table 1 if they are specific to this Technical Specification. All MPAI-defined Terms are accessible online.
