Migrate Demographic Group For Fair GNNs
March 26, 2024, 4:44 a.m. | YanMing Hu, TianChi Liao, JiaLong Chen, Jing Bian, ZiBin Zheng, Chuan Chen
cs.LG updates on arXiv.org arxiv.org
Abstract: Graph Neural Networks (GNNs) have been applied in many scenarios due to the superior performance of graph learning. However, fairness is often ignored when designing GNNs. As a consequence, biased information in the training data can easily affect vanilla GNNs, causing biased results toward particular demographic groups (divided by sensitive attributes, such as race and age). There have been efforts to address the fairness issue. However, existing fair techniques generally divide the demographic groups by raw …
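The "biased results toward particular demographic groups" the abstract describes is commonly quantified with a demographic-parity gap: the difference in positive-prediction rates between groups defined by a sensitive attribute. A minimal sketch (not from the paper; function name and toy data are illustrative):

```python
import numpy as np

def demographic_parity_gap(y_pred, sensitive):
    """Absolute gap in positive-prediction rates between two demographic groups.

    y_pred:    binary predictions (0/1), e.g. from a GNN node classifier
    sensitive: binary sensitive attribute (0/1) splitting nodes into two groups
    """
    y_pred = np.asarray(y_pred, dtype=float)
    sensitive = np.asarray(sensitive)
    rate_g0 = y_pred[sensitive == 0].mean()  # positive rate in group 0
    rate_g1 = y_pred[sensitive == 1].mean()  # positive rate in group 1
    return abs(rate_g0 - rate_g1)

# Toy example: a classifier that favors group 1
preds = [1, 0, 0, 0, 1, 1, 1, 1]
groups = [0, 0, 0, 0, 1, 1, 1, 1]
gap = demographic_parity_gap(preds, groups)  # 0.25 vs 1.0 -> gap 0.75
```

A gap near zero means both groups receive positive predictions at similar rates; fair-GNN methods aim to shrink this (and related metrics) without sacrificing accuracy.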