Aug. 11, 2023, 6:45 a.m. | Ilker Oguz, Junjie Ke, Qifei Wang, Feng Yang, Mustafa Yildirim, Niyazi Ulas Dinc, Jih-Liang Hsieh, Christophe Moser, Demetri Psaltis

cs.LG updates on arXiv.org

Neural networks (NN) have demonstrated remarkable capabilities in various
tasks, but their computation-intensive nature demands faster and more
energy-efficient hardware implementations. Optics-based platforms, using
technologies such as silicon photonics and spatial light modulators, offer
promising avenues for achieving this goal. However, training multiple trainable
layers in tandem with these physical systems poses challenges, as they are
difficult to fully characterize and describe with differentiable functions,
hindering the use of the error backpropagation algorithm. The recently
introduced Forward-Forward Algorithm (FFA) eliminates the …
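
Since the abstract centers on training layered systems without end-to-end backpropagation, a minimal sketch of the layer-local Forward-Forward objective (Hinton, 2022) may help illustrate the idea. The layer sizes, learning rate, threshold, and synthetic data below are assumptions for illustration, not the architecture or setup used in this paper.

```python
# Illustrative sketch of the Forward-Forward idea: each layer is trained with
# a purely local "goodness" objective, so no gradients ever need to flow back
# through a hard-to-differentiate (e.g. optical) layer.
import torch
import torch.nn.functional as F

class FFLayer(torch.nn.Module):
    def __init__(self, d_in, d_out, lr=0.03, threshold=2.0):
        super().__init__()
        self.fc = torch.nn.Linear(d_in, d_out)
        self.opt = torch.optim.SGD(self.parameters(), lr=lr)
        self.threshold = threshold

    def forward(self, x):
        # Normalize the input so goodness from earlier layers cannot leak
        # through; then apply the affine map and a ReLU nonlinearity.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return F.relu(self.fc(x))

    def train_step(self, x_pos, x_neg):
        # Goodness = mean squared activation; push it above the threshold
        # for positive samples and below it for negative samples.
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        loss = (F.softplus(self.threshold - g_pos) +
                F.softplus(g_neg - self.threshold)).mean()
        self.opt.zero_grad()
        loss.backward()  # gradient stays local to this single layer
        self.opt.step()
        # Detach outputs so the next layer trains on fixed inputs
        # (no end-to-end backpropagation through the stack).
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

# Toy usage: two layers trained greedily, layer by layer, on placeholder data.
layers = [FFLayer(784, 256), FFLayer(256, 256)]
x_pos = torch.randn(32, 784)  # stand-in for "positive" (real) data
x_neg = torch.randn(32, 784)  # stand-in for "negative" data
for layer in layers:
    x_pos, x_neg = layer.train_step(x_pos, x_neg)
```

Because each layer only needs its own forward pass and a scalar goodness measure, a physical (optical) layer that cannot be described by a differentiable model can still sit inside the stack, which is the motivation highlighted in the abstract.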

