Seq2Seq-SC: End-to-End Semantic Communication Systems with Pre-trained Language Model. (arXiv:2210.15237v1 [eess.SP])
cs.CL updates on arXiv.org
While semantic communication is expected to bring unprecedented communication
efficiency in comparison to classical communication, many challenges must be
resolved to realize its potential. In this work, we provide a realistic
semantic network dubbed seq2seq-SC, which is compatible with 5G NR and can
work with generalized text datasets by utilizing a pre-trained language model. We also
utilize a performance metric (SBERT) which can accurately measure semantic
similarity and show that seq2seq-SC achieves superior performance while
extracting semantically meaningful information.
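SBERT-based metrics of this kind score semantic similarity as the cosine similarity between sentence embeddings produced by a pre-trained model. A minimal sketch of that scoring step, using stand-in embedding vectors (a real evaluation would obtain embeddings from a sentence-transformers model; the vectors and variable names here are illustrative assumptions, not from the paper):

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Stand-in 4-d "sentence embeddings" (real SBERT embeddings are much higher
# dimensional and come from a pre-trained sentence-transformers model).
emb_ref = np.array([0.10, 0.90, 0.30, 0.20])  # reference (transmitted) sentence
emb_out = np.array([0.12, 0.85, 0.30, 0.25])  # decoded (received) sentence

# A score near 1.0 indicates the decoded sentence preserved the meaning.
score = cosine_similarity(emb_ref, emb_out)
```

The appeal of such a metric over bit-level measures is that a decoded sentence can differ word-for-word from the transmitted one yet still score highly if its meaning is preserved.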