Learning in Convolutional Neural Networks Accelerated by Transfer Entropy
April 5, 2024, 4:41 a.m. | Adrian Moldovan, Angel Cațaron, Răzvan Andonie
cs.LG updates on arXiv.org (arxiv.org)
Abstract: Recently, there has been growing interest in applying Transfer Entropy (TE) to quantify the effective connectivity between artificial neurons. In a feedforward network, TE can be used to quantify the relationships between pairs of neuron outputs located in different layers. Our focus is on how to include TE in the learning mechanism of a Convolutional Neural Network (CNN) architecture. We introduce a novel training mechanism for CNN architectures which integrates TE feedback connections. …
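The abstract does not give the TE estimator the authors use, but the standard definition for two discrete time series (with history length 1) is TE(Y→X) = Σ p(x_{t+1}, x_t, y_t) · log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]. As a minimal sketch of how TE between two neuron-output sequences could be estimated (plug-in probabilities, binarized outputs; all names here are illustrative, not from the paper):

```python
import numpy as np
from collections import Counter

def transfer_entropy(y, x, base=2):
    """Plug-in estimate of TE(Y -> X) for two equal-length discrete
    series, with history length 1:
        TE = sum_t p(x_{t+1}, x_t, y_t)
             * log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]
    """
    x = np.asarray(x)
    y = np.asarray(y)
    n = len(x) - 1                                # number of transitions
    triples = Counter(zip(x[1:], x[:-1], y[:-1])) # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))       # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))        # (x_{t+1}, x_t)
    singles = Counter(x[:-1].tolist())            # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * np.log(p_cond_xy / p_cond_x)
    return te / np.log(base)   # convert nats -> bits for base=2
```

For binarized neuron outputs where one series drives the other (e.g. `x[t+1] = y[t]`), the estimate approaches 1 bit; for independent series it stays near zero, which is how TE can rank the effective connectivity of neuron pairs across layers.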