RMIT University

Measuring player skill using dynamic difficulty adjustment

conference contribution
posted on 2024-10-31, 22:13 authored by Simon Demediuk, Marco Tamassia, William Raffe, Fabio Zambetta, Florian Mueller, Xiaodong Li
Video games have a long history of use for educational and training purposes, as they provide increased motivation and learning for players. One limitation of using video games in this manner is that players still need to be tested outside the game environment to assess their learning outcomes. Traditionally, determining a player's skill level in a competitive game requires players to compete directly with each other. Through the application of the Adaptive Training Framework, this work presents a novel method to determine a player's skill level after each interaction with the video game. This is done by measuring the effort of a Dynamic Difficulty Adjustment agent, without the need for direct competition between players. The experiments conducted in this research show that by measuring each player's Heuristic Value Average, we can obtain the same ranking of players as state-of-the-art ranking systems, without the need for direct competition.
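The core idea in the abstract can be sketched in a few lines: average the heuristic values recorded for the Dynamic Difficulty Adjustment agent across a player's interactions, then rank players by that average. The function name, the interpretation of "higher average means more skilled", and all sample data below are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the Heuristic Value Average (HVA) ranking idea
# described in the abstract. Player names and values are invented for
# illustration; the actual metric definition comes from the paper itself.

from statistics import mean

def rank_by_hva(heuristic_values):
    """Rank players by the mean heuristic value recorded for the DDA
    agent across their interactions (assumed here: a higher average
    indicates a more skilled player)."""
    hva = {player: mean(values) for player, values in heuristic_values.items()}
    return sorted(hva, key=hva.get, reverse=True)

# Invented sample data: per-player heuristic values over three interactions.
observed = {
    "alice": [0.80, 0.90, 0.85],
    "bob":   [0.40, 0.50, 0.45],
    "carol": [0.60, 0.70, 0.65],
}

print(rank_by_hva(observed))  # ['alice', 'carol', 'bob']
```

No head-to-head matches are compared anywhere in this sketch, which mirrors the paper's claim that the ranking is obtained without direct competition between players.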

Funding

Enhancing the Australian theme park experience by harnessing virtual-physical play

Australian Research Council


History

Start page

1

End page

7

Total pages

7

Outlet

Proceedings of the Australasian Computer Science Week Multiconference ACSW '18

Name of conference

ACSW '18

Publisher

Association for Computing Machinery

Place published

Brisbane, Queensland, Australia

Start date

2018-01-29

End date

2018-02-02

Language

English

Copyright

© 2018 Association for Computing Machinery

Former Identifier

2006085408

Esploro creation date

2020-06-22

Fedora creation date

2018-09-19
