March 21, 2024, 4:42 a.m. | Samy Blusseau (CMM)

cs.LG updates on arXiv.org

arXiv:2403.12975v1 Announce Type: cross
Abstract: Morphological neural networks, or layers, can be a powerful tool for advancing mathematical morphology, both on theoretical aspects such as the representation of complete lattice operators, and in the development of image processing pipelines. However, these architectures turn out to be difficult to train when they contain more than a few morphological layers, at least within popular machine learning frameworks that rely on gradient-descent-based optimization algorithms. In this paper we investigate …
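
To make the notion of a "morphological layer" concrete, here is a minimal, hypothetical sketch of a dense dilation layer in PyTorch, where the usual sum-product of a linear layer is replaced by a max-plus operation. The class name and shapes are illustrative assumptions, not the architecture studied in the paper; the sparse gradients of the max operation hint at why stacks of such layers can be hard to train with gradient descent.

```python
import torch
import torch.nn as nn


class DilationLayer(nn.Module):
    """Illustrative fully connected morphological dilation: y_j = max_i (x_i + w_ji).

    The max-plus algebra replaces the sum-product of nn.Linear. Only the
    argmax input receives a gradient for each output, so gradients are
    sparse, which is one intuition for the training difficulty mentioned
    in the abstract (this layer is a sketch, not the paper's model).
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Additive structuring element, initialized flat (all zeros).
        self.weight = nn.Parameter(torch.zeros(out_features, in_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features) -> broadcast to (batch, out, in), max over inputs.
        return (x.unsqueeze(1) + self.weight).max(dim=-1).values


if __name__ == "__main__":
    layer = DilationLayer(8, 4)
    x = torch.randn(2, 8)
    print(layer(x).shape)  # torch.Size([2, 4])
```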
