[Figure: RCL contrastive learning modules from the SURCL paper]

Knowledge Tracing with Contrastive Learning

Proposed the ACCL and RCL contrastive learning methods at RIIID, achieving state-of-the-art results in student modeling across 6 benchmarks (dropout prediction, knowledge tracing). Deployed to the Santa TOEIC platform.

October 15, 2022 · 2 min · Jungbae Park
ML Infrastructure: Model Registry & Pipeline (RIIID)

ML Model Registry, Dataset Pipeline & Infrastructure at RIIID (뤼이드)

Built an ML model registry (MLflow) and dataset pipelines (Airflow, Athena, BigQuery) at RIIID, serving 4+ products including Santa TOEIC, IVYGlobal SAT, CASA GRANDE, and INICIE.

September 15, 2021 · 1 min · Jungbae Park
ML Pipeline Acceleration: GPU Utilization 25% to 95%

ML Pipeline Acceleration & Multi-GPU Training at RIIID (뤼이드)

Introduced RIIID’s first multi-GPU training, boosting GPU utilization from 25% to 95% and cutting initialization time from 1 hour to 10 seconds. Built CI/CD pipelines with GitHub Actions.

December 15, 2020 · 1 min · Jungbae Park