Transformer for Knowledge Tracing Implemented with Trax

Intro

This notebook trains a transformer model on the EdNet dataset using the google/trax library. The EdNet dataset is a large set of student responses to multiple-choice questions related to English language learning. A recent Kaggle competition, Riiid! Answer Correctness Prediction, provided a subset of this data consisting of 100 million responses to 13 thousand questions from 300 thousand students.

The state-of-the-art result, detailed in SAINT+: Integrating Temporal Features for EdNet Correctness Prediction, achieves an AUC ROC of 0.7914, and the winning solution in the Riiid! Answer Correctness Prediction competition achieved an AUC ROC of 0.820. This notebook achieves an AUC ROC of 0.776 after 25,000 training steps with an approach similar to SAINT+. It demonstrates several techniques that may be useful to those getting started with the google/trax library or with deep learning in general, showing how to:

  • Use BigQuery to perform feature engineering (see the BigQuery sketch below)
  • Create TFRecords with multiple sequences per record (see the TFRecord sketch below)
  • Modify the trax Transformer model to accommodate a knowledge tracing dataset:
    • Utilize multiple encoder and decoder embeddings, aggregated by either concatenation or summation (see the embedding sketch below)
    • Include a custom metric, AUC ROC (see the metric sketch below)
    • Utilize a combined padding and future mask (see the mask sketch below)
  • Use trax's gin-config integration to specify training parameters
  • Display training progress using trax's tensorboard integration
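
The sketches below illustrate several of the items above. First, the BigQuery feature-engineering step: the project, dataset, and table names are placeholders rather than the notebook's actual schema, and the query only shows the kind of per-student window-function features (lag time, prior interaction count) that SAINT+-style models consume.

```python
from google.cloud import bigquery

# Assumes credentials and a default GCP project are already configured.
client = bigquery.Client()

QUERY = """
SELECT
  user_id,
  content_id,
  answered_correctly,
  -- milliseconds since the student's previous interaction (a SAINT+-style temporal feature)
  timestamp - LAG(timestamp) OVER (PARTITION BY user_id ORDER BY timestamp) AS lag_time,
  -- running count of the student's prior interactions
  ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY timestamp) - 1 AS prior_interactions
FROM `my-project.riiid.train`  -- placeholder table
WHERE content_type_id = 0      -- keep question rows only
"""

features = client.query(QUERY).to_dataframe()
```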
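
Next, a minimal TFRecord sketch: several aligned sequences for one student are packed into a single example and read back. The feature names and the fixed sequence length are assumptions, not the notebook's exact layout.

```python
import tensorflow as tf

SEQ_LEN = 128  # assumed fixed sequence length after padding/truncation

def _int64_list(values):
    return tf.train.Feature(int64_list=tf.train.Int64List(value=values))

def make_example(question_ids, responses, elapsed_times):
    """Pack three aligned sequences for one student into one record."""
    features = {
        "question_ids": _int64_list(question_ids),
        "responses": _int64_list(responses),
        "elapsed_times": _int64_list(elapsed_times),
    }
    return tf.train.Example(features=tf.train.Features(feature=features))

# Write a toy record.
with tf.io.TFRecordWriter("train.tfrecord") as writer:
    ex = make_example([5, 17, 42] + [0] * (SEQ_LEN - 3),
                      [1, 0, 1] + [0] * (SEQ_LEN - 3),
                      [3000, 12000, 7000] + [0] * (SEQ_LEN - 3))
    writer.write(ex.SerializeToString())

# Parse records back into a dict of fixed-length tensors.
feature_spec = {
    "question_ids": tf.io.FixedLenFeature([SEQ_LEN], tf.int64),
    "responses": tf.io.FixedLenFeature([SEQ_LEN], tf.int64),
    "elapsed_times": tf.io.FixedLenFeature([SEQ_LEN], tf.int64),
}
dataset = tf.data.TFRecordDataset("train.tfrecord").map(
    lambda record: tf.io.parse_single_example(record, feature_spec))
```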
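
The embedding sketch: the two aggregation strategies can be expressed with trax combinators roughly as below. Vocabulary sizes and widths are illustrative, and in the notebook these blocks feed the encoder and decoder rather than standing alone.

```python
import trax.layers as tl

d_model = 64  # illustrative model width

# Concatenation: each stream gets its own (smaller) embedding, the pieces are
# concatenated along the feature axis and projected back to d_model.
concat_embedder = tl.Serial(
    tl.Parallel(
        tl.Embedding(13523, 32),  # question id (vocab size assumed)
        tl.Embedding(8, 16),      # question part
        tl.Embedding(3, 16),      # prior response token
    ),
    tl.Concatenate(n_items=3, axis=-1),
    tl.Dense(d_model),
)

# Summation: every stream is embedded to d_model and the embeddings are added.
# tl.Add combines two tensors, so it is applied twice to sum three streams.
sum_embedder = tl.Serial(
    tl.Parallel(
        tl.Embedding(13523, d_model),
        tl.Embedding(8, d_model),
        tl.Embedding(3, d_model),
    ),
    tl.Add(),
    tl.Add(),
)
```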
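
The metric sketch: one way to express AUC ROC as a trax layer is with tl.Fn over a pure function, as below. It assumes the model output has already been reduced to a per-position probability that the answer is correct, uses the padding weights to ignore padded positions, and computes the pairwise (Mann-Whitney) form, which is O(n²) but acceptable for evaluation batches; it is not the notebook's exact implementation.

```python
from trax import layers as tl
from trax.fastmath import numpy as jnp

def _auc_roc(probs, targets, weights):
    """Pairwise (Mann-Whitney) AUC over the non-padded positions."""
    probs = probs.reshape(-1)
    labels = targets.reshape(-1).astype(jnp.float32)
    w = weights.reshape(-1).astype(jnp.float32)
    pos = labels * w          # weight of each positive (correct) example
    neg = (1.0 - labels) * w  # weight of each negative (incorrect) example
    greater = (probs[:, None] > probs[None, :]).astype(jnp.float32)
    ties = (probs[:, None] == probs[None, :]).astype(jnp.float32)
    numerator = jnp.sum(pos[:, None] * neg[None, :] * (greater + 0.5 * ties))
    denominator = jnp.sum(pos) * jnp.sum(neg) + 1e-9
    return numerator / denominator

# A layer with three inputs (model output, targets, weights) and one scalar output,
# usable alongside the loss in trax.supervised.training.EvalTask via its
# metrics/metric_names arguments.
AUCROC = tl.Fn('AUCROC', _auc_roc, n_out=1)
```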
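
Finally, the mask sketch: the combined mask is simply the logical AND of a padding mask and a lower-triangular future mask. A NumPy version, with shapes chosen to broadcast against (batch, heads, queries, keys) attention scores:

```python
import numpy as np

def combined_mask(token_ids, pad_id=0):
    """True where attention is allowed: not padding and not a future position."""
    batch, seq_len = token_ids.shape
    padding = (token_ids != pad_id)[:, None, None, :]          # (batch, 1, 1, seq)
    future = np.tril(np.ones((seq_len, seq_len), dtype=bool))  # (seq, seq)
    return padding & future[None, None, :, :]                  # (batch, 1, seq, seq)
```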

Open In Colab
