May 7, 2024, 4:50 a.m. | Hieu Ngo Trung, Duong Tran Ham, Tin Huynh, Kiem Hoang

cs.CL updates on arXiv.org

arXiv:2405.02573v1 Announce Type: new
Abstract: Recently, many studies have shown the effectiveness of Bidirectional Encoder Representations from Transformers (BERT) in various Natural Language Processing (NLP) tasks. In particular, the English spelling correction task, which uses an Encoder-Decoder architecture and takes advantage of BERT, has achieved state-of-the-art results. However, to our knowledge, there is no such implementation for Vietnamese yet. Therefore, in this study, a combination of the Transformer architecture (state-of-the-art for Encoder-Decoder models) and BERT is proposed to deal with Vietnamese spelling correction. The …
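To make the proposed setup concrete, here is a minimal sketch of warm-starting a Transformer Encoder-Decoder from a BERT-style checkpoint for spelling correction, using Hugging Face's `EncoderDecoderModel`. The checkpoint name `vinai/phobert-base` and the example sentence pair are assumptions for illustration only; the paper's exact model, tokenizer, and training setup are not shown in this excerpt.

```python
# Sketch: BERT-initialized Encoder-Decoder for Vietnamese spelling correction.
# Assumption: a Vietnamese BERT-style checkpoint ("vinai/phobert-base") stands
# in for whatever pretrained model the paper actually used.
from transformers import AutoTokenizer, EncoderDecoderModel

checkpoint = "vinai/phobert-base"  # assumed checkpoint, not from the paper
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# Warm-start both encoder and decoder from the pretrained weights; the
# decoder's cross-attention layers are randomly initialized and learned
# during fine-tuning.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(checkpoint, checkpoint)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# Toy fine-tuning step on one hypothetical (misspelled, corrected) pair:
# the model learns to map noisy text to its corrected form.
misspelled = "Toi di hoc muon"   # hypothetical noisy input
corrected = "Tôi đi học muộn"    # hypothetical corrected target
inputs = tokenizer(misspelled, return_tensors="pt")
labels = tokenizer(corrected, return_tensors="pt").input_ids
outputs = model(**inputs, labels=labels)
outputs.loss.backward()  # in practice, loop over many such sentence pairs
```

In this warm-start pattern, pretraining supplies the language knowledge in both encoder and decoder, so only the cross-attention and task-specific behavior must be learned from correction data.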
