Specification and evaluation of an assessment engine for educational games

empowering educators with an assessment editor and a learning analytics dashboard

Yaelle Chaudy, Thomas Connolly

Research output: Contribution to journal › Article

Abstract

Assessment is a crucial aspect of any teaching and learning process. Educational games offer promising advantages for assessment: personalised feedback to students and an automated assessment process. However, while many teachers agree that educational games increase motivation, learning and retention, few are ready to fully trust them as an assessment tool. We believe there are two main reasons for this lack of trust: educators are not given sufficient information about gameplay, and many educational games are distributed as black boxes that teachers cannot modify. This paper presents an assessment engine, EngAGe, designed to separate a game from its assessment. It allows teachers to modify a game’s assessment after distribution and to visualise gameplay data via a learning analytics dashboard. The engine was evaluated quantitatively by 31 educators. Findings were overall very positive: both the assessment editor and the learning analytics dashboard were rated useful and easy to use. The evaluation also indicates that, with access to EngAGe, educators would be more likely to trust a game’s assessment. This paper concludes that educators can use EngAGe effectively to modify an educational game’s assessment and visualise gameplay data, and that it helps increase their trust in educational games as an assessment tool.
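
The idea of separating a game from its assessment can be illustrated with a short sketch. The Python below is a minimal, hypothetical model (not EngAGe's actual API) in which the game only emits gameplay events, while a separate engine holds teacher-editable scoring rules and keeps a gameplay log that a dashboard could visualise; the names AssessmentEngine, AssessmentRule, update_rule and record_event are assumptions made for illustration.

# Illustrative sketch only: a toy assessment engine kept separate from the game.
# All names here are hypothetical and do not reflect EngAGe's real interface.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class AssessmentRule:
    """A teacher-editable rule mapping a gameplay event to points and feedback."""
    event_type: str
    points: int
    feedback: str


@dataclass
class AssessmentEngine:
    """Receives gameplay events from a game and assesses them against rules
    that can be edited after the game has been distributed."""
    rules: Dict[str, AssessmentRule] = field(default_factory=dict)
    log: List[dict] = field(default_factory=list)  # raw gameplay data for a dashboard

    def update_rule(self, rule: AssessmentRule) -> None:
        # The assessment changes here; the game itself is untouched.
        self.rules[rule.event_type] = rule

    def record_event(self, student: str, event_type: str) -> str:
        # The game only reports what happened; scoring is decided by the engine.
        self.log.append({"student": student, "event": event_type})
        rule = self.rules.get(event_type)
        if rule is None:
            return ""  # event logged but not assessed
        return f"{student}: +{rule.points} ({rule.feedback})"


if __name__ == "__main__":
    engine = AssessmentEngine()
    engine.update_rule(AssessmentRule("fraction_solved", 10, "Well done!"))

    # The game emits an event and receives personalised feedback for the student.
    print(engine.record_event("alice", "fraction_solved"))

    # A teacher later adjusts the assessment without redistributing the game.
    engine.update_rule(AssessmentRule("fraction_solved", 5, "Correct, now try a harder one."))
    print(engine.record_event("alice", "fraction_solved"))

Because the game never encodes scoring logic itself, a teacher can change the rules after the game has been distributed without modifying or redistributing the game.
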
Original language: English
Article number: ENTCOM269
Pages (from-to): 209-224
Number of pages: 16
Journal: Entertainment Computing
Volume: 27
DOI: 10.1016/j.entcom.2018.07.003
Publication status: Published - 26 Jul 2018

Keywords

  • Educational games
  • assessment
  • learning analytics
  • assessment editor
  • assessment engine

Cite this

@article{24e52e71f9e54eab9cd763f4ff45d1d1,
title = "Specification and evaluation of an assessment engine for educational games: empowering educators with an assessment editor and a learning analytics dashboard",
abstract = "Assessment is a crucial aspect of any teaching and learning process. Educational games offer promising advantages for assessment: personalised feedback to students and an automated assessment process. However, while many teachers agree that educational games increase motivation, learning and retention, few are ready to fully trust them as an assessment tool. We believe there are two main reasons for this lack of trust: educators are not given sufficient information about gameplay, and many educational games are distributed as black boxes that teachers cannot modify. This paper presents an assessment engine, EngAGe, designed to separate a game from its assessment. It allows teachers to modify a game’s assessment after distribution and to visualise gameplay data via a learning analytics dashboard. The engine was evaluated quantitatively by 31 educators. Findings were overall very positive: both the assessment editor and the learning analytics dashboard were rated useful and easy to use. The evaluation also indicates that, with access to EngAGe, educators would be more likely to trust a game’s assessment. This paper concludes that educators can use EngAGe effectively to modify an educational game’s assessment and visualise gameplay data, and that it helps increase their trust in educational games as an assessment tool.",
keywords = "Educational games, assessment, learning analytics, assessment editor, assessment engine",
author = "Yaelle Chaudy and Thomas Connolly",
year = "2018",
month = "7",
day = "26",
doi = "10.1016/j.entcom.2018.07.003",
language = "English",
volume = "27",
pages = "209--224",
journal = "Entertainment Computing",
issn = "1875-9521",
publisher = "Elsevier B.V.",

}
