Web: http://arxiv.org/abs/2201.05809

Jan. 24, 2022, 2:11 a.m. | Qiushi Shi, Ponnuthurai Nagaratnam Suganthan, Rakesh Katuwal

cs.LG updates on arXiv.org

In this paper, we first introduce batch normalization to the edRVFL network.
This re-normalization method helps the network avoid divergence of the hidden
features. We then propose novel variants of the Ensemble Deep Random Vector
Functional Link (edRVFL) network. Weighted edRVFL (WedRVFL) uses weighting
methods to give training samples different weights in different layers
according to how confidently they were classified in the previous layer,
thereby increasing the ensemble's diversity and accuracy. Furthermore, a
pruning-based edRVFL (PedRVFL) has also been …
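The abstract outlines the core mechanism: stacked layers of random hidden features, batch normalization applied to those features to keep them from diverging, and a per-layer classifier whose outputs are ensembled. Below is a minimal sketch of that idea in Python/NumPy, assuming a ReLU activation, closed-form ridge regression for the output weights, and simple averaging of the per-layer predictions. The function names (`edrvfl_train`, `edrvfl_predict`, `batch_norm`) and hyperparameters are illustrative, not the authors' implementation, and the weighting (WedRVFL) and pruning (PedRVFL) steps are not shown since the abstract is truncated here.

```python
import numpy as np

def batch_norm(H, eps=1e-5):
    # Re-normalize hidden features per unit (zero mean, unit variance).
    # Simplification: uses the statistics of the current batch only.
    return (H - H.mean(axis=0)) / (H.std(axis=0) + eps)

def ridge_fit(Z, Y, lam=1e-2):
    # Closed-form ridge regression for the output weights of one layer.
    return np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ Y)

def edrvfl_train(X, Y, n_layers=3, n_hidden=64, lam=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    D, layers = X, []
    for _ in range(n_layers):
        # Random (untrained) hidden weights and biases for this layer.
        W = rng.standard_normal((D.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        H = batch_norm(np.maximum(D @ W + b, 0.0))  # ReLU + batch norm
        Z = np.hstack([H, X])                        # direct link to the input
        beta = ridge_fit(Z, Y, lam)                  # per-layer output weights
        layers.append((W, b, beta))
        D = np.hstack([H, X])                        # feed features to next layer
    return layers

def edrvfl_predict(X, layers):
    D, scores = X, []
    for W, b, beta in layers:
        H = batch_norm(np.maximum(D @ W + b, 0.0))
        Z = np.hstack([H, X])
        scores.append(Z @ beta)
        D = np.hstack([H, X])
    # Ensemble: average the per-layer outputs.
    return np.mean(scores, axis=0)
```

For classification, `Y` would be a one-hot label matrix and the final prediction the argmax of the averaged scores; WedRVFL would additionally reweight the training samples in `ridge_fit` layer by layer, and PedRVFL would drop some hidden units before fitting `beta`.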

Tags: arxiv, classification, data, data classification, deep, ensemble, network, random, tabular data
