March 18, 2024, 4:42 a.m. | Ahcen Aliouat, Elsa Dupraz

cs.LG updates on arXiv.org

arXiv:2403.10202v1 Announce Type: cross
Abstract: In goal-oriented communications, the objective of the receiver is often to apply a Deep-Learning model rather than to reconstruct the original data. In this context, direct learning over compressed data, without any prior decoding, holds promise for enhancing the time-efficient execution of inference models at the receiver. However, conventional entropic-coding methods such as Huffman and Arithmetic coding break the data structure, rendering them unsuitable for learning without decoding. In this paper, we propose an alternative approach in which entropic …
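The abstract's claim that variable-length entropy codes break data structure can be made concrete with a minimal sketch (not from the paper; the toy images, symbol values, and helper names below are illustrative assumptions): two 4x4 "images" that differ in a single pixel are Huffman-coded with a shared table, and the one-symbol change shifts every subsequent bit, so pixel positions no longer map to fixed offsets in the bitstream.

```python
# Illustrative sketch only (not the paper's method): why variable-length
# entropy codes such as Huffman break the positional structure that a
# learning model would need when reading compressed data directly.
import heapq
from collections import Counter

def huffman_table(symbols):
    """Build a Huffman code table {symbol: bitstring} for a symbol sequence."""
    freq = Counter(symbols)
    if len(freq) == 1:                      # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    # Heap items: (frequency, unique tie-breaker, {symbol: partial code})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

def encode(symbols, table):
    return "".join(table[s] for s in symbols)

# Two toy 4x4 "images" (quantized pixel values) that differ only at index 9.
img_a = [0, 0, 1, 0,  0, 1, 2, 0,  0, 3, 1, 0,  2, 0, 1, 0]
img_b = [0, 0, 1, 0,  0, 1, 2, 0,  0, 0, 1, 0,  2, 0, 1, 0]

table = huffman_table(img_a + img_b)        # one shared code table
bits_a, bits_b = encode(img_a, table), encode(img_b, table)

print("code table :", table)
print("bitstream A:", bits_a, f"({len(bits_a)} bits)")
print("bitstream B:", bits_b, f"({len(bits_b)} bits)")
# The differing pixel maps to codewords of different lengths, so every bit
# after it is shifted: pixel k no longer sits at a fixed bitstream offset,
# which is why inference directly on Huffman- or arithmetic-coded streams
# is problematic without prior decoding.
```

Running the sketch shows the two bitstreams agree only up to the changed pixel and end with different lengths (26 vs. 24 bits here); this positional mismatch is the structural breakage the paper's alternative entropic-coding approach is meant to avoid.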

Subjects: cs.LG, cs.AI, cs.CV, cs.IT, eess.IV, math.IT
