GameStory: The 2019 Video Game Analytics Challenge

Task description
In this task, participants analyze multi-view multimedia data captured at a Counter-Strike: Global Offensive event. The data includes audio and video streams, commentaries, game data and statistics, interaction traces, and viewer-to-viewer communication. We ask participants to develop systems capable of multi-stream synchronization, replay detection, and, ultimately, summarization towards a GameStory.
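As a starting point for the multi-stream synchronization subtask, the offset between two streams can be estimated by cross-correlating their audio tracks. The sketch below is purely illustrative and not part of the task definition; the function name and the FFT-based approach are our assumptions, and it presumes both signals share a sample rate.

```python
import numpy as np

def estimate_offset(sig_a, sig_b, sample_rate):
    """Estimate the delay (in seconds) of sig_b relative to sig_a
    via FFT-based cross-correlation. Illustrative sketch only."""
    n = len(sig_a) + len(sig_b) - 1          # zero-pad to avoid circular wrap-around
    # Cross-correlation: corr[k] = sum_t sig_a[t] * sig_b[t + k]
    spec = np.conj(np.fft.rfft(sig_a, n)) * np.fft.rfft(sig_b, n)
    corr = np.fft.irfft(spec, n)
    lag = int(np.argmax(corr))
    if lag > n // 2:                          # indices past n/2 encode negative lags
        lag -= n
    return lag / sample_rate
```

A real submission would of course need to cope with noise, drift, and missing segments across the many streams of a match; this only shows the basic lag-estimation idea.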

Task motivation and background
Video and computer games have long since reached the masses. Game streaming, that is, watching other people play video games, has likewise far outgrown its small beginnings, and game streams, be they live or recorded, are today viewed by millions. E-sports has emerged from organized leagues and tournaments in which players compete in controlled environments and viewers can experience the matches, discuss, and criticize, just as in physical sports. Already in 2013, concurrent viewers of a single event, the League of Legends Championship, exceeded eight million. In 2018, approximately 222 million viewers watched e-sports streams frequently. E-sports broadcasting and streaming offers a rich bouquet of data, including audio and video streams, commentaries, game data and statistics, interaction traces, viewer-to-viewer communication, and many more channels. However, as in traditional sports, e-sports matches may be long and contain less interesting parts, introducing the challenge of producing well-directed summaries and highlights.

One of the key challenges today is to render the results of e-sports events more promotable. It is notoriously difficult to search, e.g., for exciting highlights, because of the huge amount of video recorded for each match and its homogeneity: every match of League of Legends, for example, looks roughly the same. Much depends on the manual selection and curation of supplementary material for promotion on an event's website. This way, individual matches of e-sports events become more easily accessible to those attending on-site or by stream, albeit at the cost of significant manual overhead. At the same time, these efforts also enable the event and its highlights to be retrieved or recommended later on.

Training and test data for the 2018 edition of the task were provided in cooperation with ZNIPE.TV, an e-sports streaming platform. The data will be enriched with a ground truth on stream synchronization and event detection. Evaluation is twofold: on the one hand, we evaluate submissions against the ground truth with objective measures for synchronization and event detection; on the other hand, summary submissions will be evaluated by an expert panel. The expert panel will include professionals from the game industry, amateur players and frequent viewers, and players and researchers from the fields of game engineering, game studies, and narratives in video games. The exact criteria for evaluating submissions will be made available to participants in the in-depth task description.
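To make the objective side of the evaluation concrete, one plausible measure for the synchronization subtask is the mean absolute error between predicted and ground-truth stream offsets. The metric below is an assumption for illustration, not the official task criterion.

```python
def sync_mae(predicted, ground_truth):
    """Mean absolute offset error (seconds) over streams keyed by stream id.
    Hypothetical metric; the official criteria are published separately."""
    errors = [abs(predicted[s] - ground_truth[s]) for s in ground_truth]
    return sum(errors) / len(errors)
```

For example, a system that is 0.5 s off on one of two streams and exact on the other would score a mean error of 0.25 s under this measure.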

Target group
The target group of researchers for this task is diverse and broad. It includes researchers from multimedia, web science, data science, machine learning, information retrieval, natural language processing, and game- and entertainment-related fields. We also encourage interdisciplinary research involving people from the media sciences, the cultural sciences, and the humanities, discussing what would be part of such a story and what the expectations of the audience are. There are multiple angles from which one can approach this task, namely classical video and audio summaries, natural language generation, multimedia analysis, emotion recognition, non-linear stories, multimodal data analysis, user interaction analysis, retrieval and classification of relevant segments, multi-view videos, weighting and ordering of events and video segments, and many more. In any case, regardless of the research background, it will help to have a basic understanding of gaming culture or, ideally, to enlist dedicated gamers.

Data
Training and test data for the task are provided in cooperation with ZNIPE.TV. The available video and event streams cover a Counter-Strike: Global Offensive (CS:GO) tournament in the ESL One league, held in Katowice in 2018. Each match includes video streams from all ten players (five per team), a spectator video stream with commentator audio, a video stream with the map overview, and structured data on game statistics and game highlights.

Ground truth and evaluation
More information available soon.

Recommended reading
[1] J. Bryce and J. Rutter. "Spectacle of the Deathmatch: Character and Narrative in First Person Shooters." In G. King and T. Krzywinska (Eds.), ScreenPlay: Cinema/Videogames/Interfaces, Wallflower Press, 2002, pp. 66-80.
[2] R. R. Shah et al. "EventBuilder: Real-time Multimedia Event Summarization by Visualizing Social Media." In Proceedings of the 23rd ACM International Conference on Multimedia, ACM, 2015.
[3] W.-Y. Lee, W. H. Hsu, and S. Satoh. "Learning From Cross-Domain Media Streams for Event-of-Interest Discovery." IEEE Transactions on Multimedia 20.1 (2018): 142-154.
[4] M. Seif El-Nasr, A. Drachen, and A. Canossa (Eds.). Game Analytics. Springer, 2013.
[5] M. Lux, M. Riegler, D.-T. Dang-Nguyen, M. Larson, M. Potthast, and P. Halvorsen. "GameStory Task at MediaEval 2018." In Working Notes Proceedings of the MediaEval 2018 Workshop, 2018.

Task organizers
Duc-Tien Dang-Nguyen, duc-tien.dang-nguyen at, Dublin City University
Marcus Larson, ZNIPE.TV
Martin Potthast, Universität Leipzig
Mathias Lux, mlux at, Alpen-Adria-Universität Klagenfurt
Michael Riegler, SIMETRIC & University of Oslo, michael at
Pål Halvorsen, Simula Research Laboratory & University of Oslo

Task schedule
Development data release: 31 May
Test data release: 15 June
Runs due: 20 September
Results returned: 23 September
Working Notes paper due: 30 September
MediaEval 2019 Workshop (in France, near Nice): 27-29 October 2019