Oct. 28, 2022, 1:16 a.m. | Ju-Hyung Lee, Dong-Ho Lee, Eunsoo Sheen, Thomas Choi, Jay Pujara, Joongheon Kim

cs.CL updates on arXiv.org

While semantic communication is expected to bring unprecedented communication
efficiency compared to classical communication, many challenges must be
resolved before its potential can be realized. In this work, we present a realistic
semantic network, dubbed seq2seq-SC, which is compatible with 5G NR and can work
with generalized text datasets using a pre-trained language model. We also
employ a performance metric (SBERT) that accurately measures semantic
similarity, and we show that seq2seq-SC achieves superior performance while
extracting semantically meaningful information.
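The abstract evaluates recovered text with an SBERT-based semantic similarity score. Below is a minimal sketch of how such a score could be computed with the sentence-transformers library; the model checkpoint and the example sentences are illustrative assumptions, not the exact setup used in the paper.

```python
# Sketch: scoring semantic similarity between a transmitted sentence and the
# sentence reconstructed by a seq2seq-SC-style receiver, using SBERT embeddings.
# Model name and sentences below are assumptions for illustration only.
from sentence_transformers import SentenceTransformer, util

sbert = SentenceTransformer("all-MiniLM-L6-v2")  # any SBERT checkpoint

sent_tx = "the weather in seoul is sunny today"   # input to the semantic encoder
sent_rx = "today the weather is sunny in seoul"   # text recovered at the receiver

emb_tx, emb_rx = sbert.encode([sent_tx, sent_rx], convert_to_tensor=True)

# Cosine similarity of the SBERT embeddings: close to 1.0 when the meaning is
# preserved, even if the exact wording differs after transmission and decoding.
score = util.cos_sim(emb_tx, emb_rx).item()
print(f"SBERT semantic similarity: {score:.3f}")
```

In contrast to word-level measures such as BLEU, this embedding-level comparison rewards reconstructions that preserve meaning rather than exact surface form, which is the behavior the abstract highlights.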

Tags: arxiv, communication, language, language model, semantic, seq2seq, systems
