r-softmax: Generalized Softmax with Controllable Sparsity Rate. (arXiv:2304.05243v3 [cs.LG] UPDATED)
cs.LG updates on arXiv.org
Artificial neural network models now achieve remarkable results across many disciplines. Functions that map a model's representation to a probability distribution are an integral part of deep learning solutions. Although softmax is the commonly accepted probability mapping function in the machine learning community, it cannot return sparse outputs: it always spreads positive probability across all positions. In this paper, we propose r-softmax, a modification of softmax that outputs a sparse probability distribution with a controllable sparsity rate. In contrast …
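To make the contrast concrete, here is a minimal sketch of the behavior the abstract describes: standard softmax gives every position positive probability, while a sparse variant with a tunable sparsity rate can zero some entries. The `sparse_softmax` below is a hypothetical top-k illustration, not the paper's actual r-softmax formulation.

```python
import numpy as np

def softmax(x):
    # Standard softmax: always assigns strictly positive
    # probability to every position.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def sparse_softmax(x, sparsity_rate):
    # Hypothetical sparse variant for illustration only (NOT the
    # paper's r-softmax): keep only the largest (1 - sparsity_rate)
    # fraction of logits, zero the rest, and renormalize.
    x = np.asarray(x, dtype=float)
    k = max(1, int(round(len(x) * (1.0 - sparsity_rate))))
    keep = np.argsort(x)[-k:]          # indices of the k largest logits
    out = np.zeros_like(x)
    e = np.exp(x[keep] - np.max(x[keep]))
    out[keep] = e / e.sum()
    return out

logits = np.array([2.0, 1.0, 0.1, -1.0])
print(softmax(logits))                 # all four entries strictly positive
print(sparse_softmax(logits, 0.5))     # half the entries are exactly zero
```

With `sparsity_rate=0.5`, half of the output probabilities are exactly zero while the rest still sum to one, which is the kind of controllable sparsity that plain softmax cannot provide.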