RMIT University

Weakly supervised setting for learning concept prerequisite relations using multi-head attention variational graph auto-encoders

journal contribution
posted on 2024-11-02, 20:13 authored by Juntao Zhang, Hai Lan, Xiandi Yang, Shuaichao Zhang, Wei Song, Zhiyong Peng
An increasing number of learners benefit from the educational resources in Massive Open Online Courses (MOOCs) through self-regulated learning. However, it is difficult for learners to organize a suitable learning path through these resources if they do not know the prerequisite relations between concepts. Manually labeling prerequisite relations between concepts is time-consuming and requires significant domain knowledge, and labeling becomes especially difficult for large concept datasets. How should learners start learning when they face massive numbers of knowledge concepts in MOOCs? To address these problems, we propose an end-to-end graph network-based model called Multi-Head Attention Variational Graph Auto-Encoders (MHAVGAE), which automatically labels prerequisite relations between concepts via a resource-concept graph. First, we model a resource-concept graph from learning resources, concepts, and their relations, and introduce a multi-head attention mechanism to compute the hidden representation of each vertex over the resource-concept graph; the aim is to reduce the cognitive differences of manual labeling and to account for the mutual influence between vertices in the resource-concept graph. Second, we design a gated fusion mechanism that fuses features from the resource and concept graphs to enrich the concept features. Third, we propose a metric named Resource Prerequisite Reference Distance (RPRD), which generates inaccurate concept prerequisite relations to reduce manual labeling, and we then extend MHAVGAE with a weakly supervised setting for learning concept prerequisite relations. Finally, we conduct extensive experiments comparing MHAVGAE with state-of-the-art methods across multiple widely used metrics. The results show that MHAVGAE outperforms almost all baseline methods and that its weakly supervised setting is beneficial.
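The gated fusion step described in the abstract can be sketched as a per-dimension gate that decides how much of the resource-graph feature versus the concept-graph feature flows into the fused concept representation. The following is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the weight shapes, the sigmoid gate, and the elementwise convex-combination form are assumptions inferred from the abstract's description.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(h_resource, h_concept, W, b):
    """Fuse two feature views of the same concepts.

    The gate g (one value per feature dimension, in (0, 1)) is computed
    from the concatenated views; the output is an elementwise convex
    combination g * h_resource + (1 - g) * h_concept.
    """
    g = sigmoid(np.concatenate([h_resource, h_concept], axis=-1) @ W + b)
    return g * h_resource + (1.0 - g) * h_concept

d = 8
h_r = rng.normal(size=(5, d))        # 5 concepts, resource-graph features (toy data)
h_c = rng.normal(size=(5, d))        # same 5 concepts, concept-graph features
W = rng.normal(size=(2 * d, d)) * 0.1  # hypothetical gate parameters
b = np.zeros(d)

h_fused = gated_fusion(h_r, h_c, W, b)
print(h_fused.shape)  # (5, 8)
```

Because the gate lies strictly in (0, 1), each fused value stays between the two input values for that dimension, so neither view can be entirely discarded; this is one common way such a fusion is parameterized, chosen here purely for illustration.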

History

Related Materials

  1. DOI - Is published in 10.1016/j.knosys.2022.108689
  2. ISSN - Is published in 0950-7051

Journal

Knowledge-Based Systems

Volume

247

Number

108689

Start page

1

End page

14

Total pages

14

Publisher

Elsevier

Place published

Netherlands

Language

English

Copyright

© 2022 Elsevier B.V. All rights reserved.

Former Identifier

2006116677

Esploro creation date

2022-10-21
