May 26, 2022, 1:11 a.m. | Vasu Goel, Dhruv Sahnan, Venktesh V, Gaurav Sharma, Deep Dwivedi, Mukesh Mohania

cs.CL updates on arXiv.org arxiv.org

Online education platforms are powered by various NLP pipelines that use models like BERT to aid in content curation. Since the inception of pre-trained language models like BERT, there have been many efforts to adapt them to specific domains. However, to the best of our knowledge, no model has been adapted specifically for the education domain (particularly K-12) across subjects. In this work, we propose to train
a language model on a corpus of …
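The abstract is truncated before any methodological detail, but domain adaptation of BERT-style models typically means continued pretraining with the masked-language-modeling (MLM) objective on in-domain text. As a minimal, self-contained sketch of the BERT-style MLM corruption step (the tokens, vocabulary, and function names here are illustrative assumptions, not from the paper):

```python
import random

MASK = "[MASK]"
# Hypothetical in-domain (K-12 education) vocabulary for random replacement.
VOCAB = ["student", "algebra", "fraction", "teacher", "lesson"]

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style MLM corruption: each token is selected with probability
    mask_prob; a selected token is replaced by [MASK] 80% of the time,
    by a random vocabulary token 10%, and left unchanged 10%.
    Returns (corrupted, labels), where labels holds the original token
    at selected positions and None elsewhere."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # model must predict this token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)
            elif r < 0.9:
                corrupted.append(rng.choice(VOCAB))
            else:
                corrupted.append(tok)
        else:
            labels.append(None)  # position not used in the MLM loss
            corrupted.append(tok)
    return corrupted, labels

sentence = "the student solved the algebra problem".split()
corrupted, labels = mask_tokens(sentence)
```

In practice this corruption is applied to batches of in-domain text (here, a K-12 corpus), and the model is trained to recover the original tokens at the selected positions.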

arxiv bert education k-12 k-12 education
