April 19, 2024, 4:42 a.m. | Emanuele La Malfa, Gabriele La Malfa, Giuseppe Nicosia, Vito Latora

cs.LG updates on arXiv.org (arxiv.org)

arXiv:2404.11172v2 Announce Type: replace
Abstract: Deep Neural Networks (DNNs) can be represented as graphs whose links and vertices iteratively process data and solve tasks sub-optimally. Complex Network Theory (CNT), merging statistical physics with graph theory, provides a method for interpreting neural networks by analysing their weights and neuron structures. However, classic works adapt CNT metrics that only permit a topological analysis as they do not account for the effect of the input data. In addition, CNT metrics have been applied …
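As a rough illustration of the purely topological analysis the abstract refers to, the sketch below treats the weight matrices of a small feed-forward network as a weighted directed graph and computes node strength, a classic CNT metric. The layer sizes, random weights, and the choice of metric are all assumptions made for illustration; this is not the paper's proposed method, and it exhibits exactly the limitation the authors point out, since it never looks at input data.

import numpy as np

rng = np.random.default_rng(0)
layer_sizes = [4, 8, 3]  # hypothetical MLP widths (illustrative assumption)
# weights[l] has shape (layer_sizes[l], layer_sizes[l+1]): links from layer l to l+1
weights = [rng.normal(size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

# Node strength: sum of |weight| over all links incident to a neuron.
# A purely topological CNT metric -- it ignores the effect of the input data,
# which is the shortcoming of classic approaches noted in the abstract.
for l, size in enumerate(layer_sizes):
    s = np.zeros(size)
    if l > 0:                           # incoming links from layer l-1
        s += np.abs(weights[l - 1]).sum(axis=0)
    if l < len(layer_sizes) - 1:        # outgoing links to layer l+1
        s += np.abs(weights[l]).sum(axis=1)
    print(f"layer {l}: mean node strength = {s.mean():.3f}")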
