IEEE BigData 2021 Cup: Predicting Victories in Video Games

Predicting Victories in Video Games is a data mining challenge organized in association with the IEEE BigData 2021 conference (http://bigdataieee.org/BigData2021/). The task is to predict winners in Tactical Troops: Anthracite Shift (http://tacticaltroops.net) based on game logs. The competition is sponsored by QED Software (http://qed.pl/).

Overview

Emotions are part of the fun when you play games, and one of the greatest excitements is predicting who is going to win. QED Software (qed.pl) came up with this challenge, which is related to the Tactical Troops: Anthracite Shift video game (released in April 2021, tacticaltroops.net/) and organized in association with the 2021 IEEE International Conference on Big Data (IEEE BigData 2021).

Find a way to predict victories by analyzing logs from the game! More details regarding the task and the description of the challenge data sets can be found in the Task description section.

Special session at IEEE BigData 2021: A special session devoted to the challenge will be held at the IEEE BigData 2021 conference. We will invite authors of selected challenge reports to extend them for publication in the conference proceedings (after reviews by Organizing Committee members) and presentation at the conference. The publications will be indexed in the same way as regular conference papers. The invited teams will be chosen based on their final rank, innovativeness of their approach, and quality of the submitted report.

 

Task description

The task in this challenge is to come up with an efficient algorithm for predicting winners in Tactical Troops: Anthracite Shift (http://tacticaltroops.net) based on game log data. The available data sets correspond to multiplayer games played on a variety of maps. In particular, we provide data in several formats, differing in the granularity of game descriptions. Detailed descriptions of each data format are available in separate files in the Data files section.

Tabular format: The simplest representation of played games consists of an aggregated vector description of the situation on a map. Games in this representation are stored in tabular format, as two CSV files, i.e., training_data_tabular_format.csv and test_data_tabular_format.csv. Each row in those tables corresponds to a single game with an identifier in the id column. Files holding the other data representations use this identifier as their file name. A detailed description of this format can be found in the Representation_1.rtf file.
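As an illustration, the tabular files can be read with standard CSV tooling. Only the id column below comes from the description above; the map and feature columns are hypothetical stand-ins for the real feature vector:

```python
import csv
import io

# Hypothetical two-row sample mimicking training_data_tabular_format.csv;
# the real file has many more feature columns per game.
sample = io.StringIO(
    "id,map_name,player_1_units,player_2_units\n"
    "126332,anthracite_01,4,4\n"
    "126333,anthracite_02,3,4\n"
)

rows = list(csv.DictReader(sample))
# The 'id' column is what links each row to its ID.json / ID.jpg counterparts.
ids = [row["id"] for row in rows]
print(ids)  # ['126332', '126333']
```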

Flattened logs format: This representation is more granular than the Tabular format, yet simpler to parse than the Truncated logs format. In contrast to the latter, it captures only the current state of the game at the moment when the prediction is to be made. Each game is represented as a JSON file named ID.json, with many fields shared with the Truncated logs format; refer to that format's description for the meaning of these fields. A detailed description of this format can be found in the Representation_2.rtf file.

Truncated logs format: This data collection contains slightly modified JSON log files generated by the Tactical Troops: Anthracite Shift game. Each file represents one match between two players, and its name is the game's ID, the same as in the remaining representations. The files do not contain the full game description; they are truncated after a number of turns, and some summary statistics of a match are removed. Naturally, information about the winner has also been removed, and other game metadata such as player IDs are anonymized. Each JSON file contains a number of data structures, mostly in the form of lists. A detailed description of this format can be found in the Representation_3.rtf file.
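A minimal sketch of loading one such log. Only the 'Players' key and its 'c' color field are taken from the descriptions above; the 'Turns' list and all concrete values are illustrative:

```python
import json

# Synthetic fragment of an ID.json log; only the 'Players' key and the 'c'
# color field come from the competition description, the rest is made up.
log_text = """
{
  "Players": [
    {"id": "anon_1", "c": {"r": 255, "g": 0, "b": 0, "hex": "#ff0000"}},
    {"id": "anon_2", "c": {"r": 0, "g": 0, "b": 255, "hex": "#0000ff"}}
  ],
  "Turns": [{"n": 1, "actions": []}, {"n": 2, "actions": []}]
}
"""

log = json.loads(log_text)
# Map each anonymized player ID to its marker color.
team_colors = {p["id"]: p["c"]["hex"] for p in log["Players"]}
n_turns = len(log["Turns"])
print(team_colors, n_turns)
```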

Screenshots format: This data format contains screenshots generated by the Replay application of the Tactical Troops: Anthracite Shift game (see an example at https://sensei.tacticaltroops.net/analysis/replay/126332/). Each JPEG file follows the naming scheme ID.jpg and represents the state of the game at the decision timestamp. The dimensions of each image are compatible with the binary maps found in the Metadata. Additionally, we provide base color map images for each game version, which can be found in the base_color_images.7z file. In the screenshots, players’ units, captured control points, and generators are color-marked in accordance with the values found in the Flattened and Truncated logs formats under the ‘Players’ key in the ‘c’ field. For convenience, those values can also be found in the team_colors.csv file, which follows the same scheme as the Tabular format (colors.TEAM_ID.{r,g,b,hex}).
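Since the screenshots are color-marked with the per-team values from team_colors.csv, one plausible way to attribute a marked pixel to a team is a nearest-color lookup. The helper and the color values below are purely illustrative, not part of the competition package:

```python
# Hypothetical helper: assign a screenshot pixel to the team whose marker
# color (as listed in team_colors.csv) is nearest in squared RGB distance.
def nearest_team(pixel, team_colors):
    """pixel: (r, g, b) tuple; team_colors: {team_id: (r, g, b)}."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(team_colors, key=lambda team: dist2(pixel, team_colors[team]))

colors = {1: (255, 0, 0), 2: (0, 0, 255)}  # illustrative team colors
print(nearest_team((250, 10, 5), colors))  # 1
```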

Units, gadgets, maps, and other metadata: Each game version comes with slightly different parameters of various items, changed as the game evolved. We provide such metadata for each version of the game. The details can be found in the Metadata.pdf file.

Remarks:

  • There is a large variety of weapons and weapon types with varying properties, e.g., plasma guns are not teleported (they fly over teleports), and some guns shoot two bullets at once. Some projectiles have high piercing power, e.g., sniper bullets can fly through objects while inflicting damage (one shot can kill two units). Some weapons, e.g., heavy machine guns, restrict the movement range of a unit.
  • There is a wide array of gadgets in the TT:AS game. Some are destructive and can inflict damage (grenades, mines), some can be selected only once and have a passive effect (armors), and some decrease movement range (armors, cloaks). Some last only one round (two consecutive turns, one of each player).
  • Units (soldiers) come in four kinds: Assault, Heavy, Support, and Scout. They vary in their movement range, initial HP, and special abilities (called ‘perks’). For example, Heavy units can bash through boxes, and Support units can replenish the HP of any nearby unit.

Solution format: The predictions for test instances from the test_data_tabular_format.csv table should be submitted to the online evaluation system as a text file. The file should have exactly 20000 lines, and each line should contain exactly one number from the [0,1] interval indicating the probability that player 1 wins the game. The ordering of predictions must match the ordering of instances in the test_data_tabular_format.csv table.
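The required file can be produced with a few lines of code. The predictions below are dummy values; a real submission needs exactly 20000 of them, ordered as in test_data_tabular_format.csv:

```python
# Write one win probability per line, in test-set order; three dummy values
# here, whereas a real submission must contain exactly 20000 lines.
predictions = [0.73, 0.41, 0.98]

with open("submission.txt", "w") as f:
    for p in predictions:
        f.write(f"{p:.6f}\n")
```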

Evaluation: The quality of submissions will be evaluated using the AUC measure. Solutions will be evaluated online, and the preliminary results will be published on the public leaderboard. The preliminary score will be computed on a small subset of the test records, fixed for all participants. The final evaluation will be performed after the completion of the competition using the remaining test records, and those results will also be published online. Note that only teams that submit a report describing their approach before the end of the challenge will qualify for the final evaluation.
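For a local sanity check before submitting, AUC can be computed from its rank-statistic definition (the probability that a randomly chosen positive is scored above a randomly chosen negative). This small sketch ignores tied scores:

```python
# AUC via the Mann-Whitney rank statistic; labels are 0/1 (1 = player 1 won),
# scores are the predicted win probabilities. Tied scores are not handled.
def auc(labels, scores):
    pairs = sorted(zip(scores, labels))
    rank_sum = sum(i + 1 for i, (_, y) in enumerate(pairs) if y == 1)
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print(auc([1, 0, 1, 0], [0.9, 0.2, 0.8, 0.4]))  # 1.0
```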

Rank | Team Name | Score | Submission Date
1 | henryvu | 0.8926 | 2021-09-18 04:04:17
2 | Cyan | 0.8922 | 2021-08-19 08:57:50
3 | tks | 0.8920 | 2021-07-31 14:35:13
4 | GameOver | 0.8889 | 2021-09-17 19:35:07
5 | ML | 0.8881 | 2021-08-29 17:24:30
6 | Dymitr | 0.8870 | 2021-07-19 00:22:51
7 | Sparked | 0.8837 | 2021-06-07 18:11:26
8 | MapReduceEverything | 0.8809 | 2021-06-10 22:08:32
9 | PKAW | 0.8803 | 2021-06-11 23:53:01
10 | Mathurin ACHE | 0.8781 | 2021-09-17 08:59:02
11 | msjj | 0.8773 | 2021-05-18 19:52:26
12 | sink | 0.8765 | 2021-05-03 21:09:14
13 | Rafał Czarnecki | 0.8764 | 2021-06-19 01:09:36
14 | Krzysztof Witczyński | 0.8747 | 2021-06-07 16:10:00
15 | ryszard | 0.8740 | 2021-06-12 20:40:55
16 | player_2 | 0.8739 | 2021-06-11 21:00:29
17 | grzybola | 0.8739 | 2021-06-11 21:08:10
18 | baseline solution | 0.8733 | 2021-04-30 22:13:57
19 | Krzysztof Grabiec | 0.8731 | 2021-06-05 20:18:05
20 | PFTW | 0.8720 | 2021-06-11 15:37:45
21 | AndrewB | 0.8720 | 2021-06-11 20:12:54
22 | Aidan | 0.8689 | 2021-05-26 17:58:14
23 | bs_dm | 0.8681 | 2021-06-12 14:45:14
24 | i_hacked_test_engine | 0.8680 | 2021-06-12 02:54:25
25 | Pietruszek team | 0.8675 | 2021-06-02 13:38:00
26 | DM42 | 0.8663 | 2021-06-11 22:54:40
27 | Debojit Mandal | 0.8654 | 2021-08-28 15:28:55
28 | td | 0.8627 | 2021-08-05 13:12:32
29 | Piotr Grabowski | 0.8624 | 2021-06-13 15:20:34
30 | DeepTeam | 0.8546 | 2021-05-30 22:02:16
31 | JG+AK | 0.8508 | 2021-06-13 10:12:56
32 | sn_twlo | 0.8507 | 2021-07-18 20:26:28
33 | Kasia | 0.8469 | 2021-06-11 18:21:32
34 | 唐牛才是食神 | 0.8225 | 2021-06-07 10:46:56
35 | CryptoCrackers | 0.7829 | 2021-07-09 19:27:48
36 | lgac | 0.7796 | 2021-08-18 23:30:26
37 | nowy team | 0.7773 | 2021-06-11 00:54:05
38 | RandomPredictorGoBrrrrrrrrrrrrrrrrrrrrrrrrrrrrr | 0.7698 | 2021-06-10 23:04:03
39 | DU | 0.7673 | 2021-05-04 13:16:34
40 | hubert | 0.7659 | 2021-05-11 17:47:16
41 | TJP2 | 0.7638 | 2021-08-25 13:18:57
42 | berendkatya | 0.5000 | 2021-06-02 15:09:01
43 | stograd | 0.5000 | 2021-07-23 17:12:41
44 | NehzUx | 0.5000 | 2021-08-31 10:17:41

The competition timeline:

  • April 30, 2021: start of the competition, datasets become available,
  • September 30, 2021: deadline for submitting the solutions,
  • October 3, 2021: deadline for sending the reports, end of the competition,
  • October 4-6, 2021: online publication of the final results, sending invitations for submitting papers to the associated workshop at the IEEE BigData 2021 conference,
  • October 25, 2021: deadline for submitting invited papers,
  • November 1, 2021: notification of paper acceptance,
  • November 15, 2021: camera-ready versions of accepted papers due.

Authors of the top-ranked solutions (based on the final evaluation scores) will be awarded prizes funded by QED Software:

  • 1000 USD for the winning solution (+ the cost of one IEEE BigData 2021 registration)
  • 500 USD for the 2nd place solution (+ the cost of one IEEE BigData 2021 registration)
  • 250 USD for the 3rd place solution (+ the cost of one IEEE BigData 2021 registration)

The award ceremony will take place during the special track at the IEEE BigData 2021 conference (online).

Organizing Committee:

  • Andrzej Janusz, QED Software & University of Warsaw
  • Maciej Świechowski, QED Software & Warsaw University of Technology
  • Maciej Matraszek, QED Software & University of Warsaw
  • Dominik Ślęzak, QED Software & University of Warsaw

In case of any questions, please post on the competition forum or send an email to contact {at} knowledgepit.ml.

QED Software is an AI products company and a technology enablement partner for its clients, pioneers in their fields. Its products reduce the uncertainty of machine learning models to align modeling with business understanding, prioritize enterprise data efforts, support experts in the ML loop, and reduce the data footprint and compress models for use in Internet of Things applications.

This forum is for all users to discuss matters related to the competition. Good manners apply!
Discussion | Author | Replies | Last post
Updated submission system | Maciej | 0 | by Maciej, Monday, September 13, 2021, 12:35:03
Competition schedule update | Maciej | 0 | by Maciej, Monday, August 23, 2021, 18:50:46
Short maintenance break | Andrzej | 0 | by Andrzej, Monday, July 19, 2021, 11:42:08
Size of the preliminary evaluation set | Dymitr | 1 | by Andrzej, Tuesday, July 06, 2021, 17:17:53
Missing data | | 1 | by Andrzej, Saturday, June 26, 2021, 11:44:46
Game ordering | Timothy | 4 | by Timothy, Saturday, June 19, 2021, 13:27:50
New data representation available | Maciej | 0 | by Maciej, Wednesday, June 02, 2021, 00:08:03
Metadata files updated | Maciej | 0 | by Maciej, Wednesday, May 05, 2021, 16:22:05
sample for results. | ngọc phương | 1 | by Maciej, Saturday, May 01, 2021, 19:37:11
Some doubt | jayesh | 1 | by Maciej, Saturday, May 01, 2021, 19:34:27