Learning Multi-Task Gaussian Process Over Heterogeneous Input Domains. (arXiv:2202.12636v2 [stat.ML] UPDATED)
Web: http://arxiv.org/abs/2202.12636
June 17, 2022, 1:12 a.m. | Haitao Liu, Kai Wu, Yew-Soon Ong, Chao Bian, Xiaomo Jiang, Xiaofang Wang
stat.ML updates on arXiv.org
The multi-task Gaussian process (MTGP) is a well-known non-parametric Bayesian
model for learning correlated tasks effectively by transferring knowledge
across tasks. However, current MTGPs are usually limited to multi-task
scenarios defined over a single shared input domain and cannot handle the
heterogeneous case, i.e., where the input features vary across tasks. To this
end, this paper presents a novel heterogeneous stochastic variational linear
model of coregionalization (HSVLMC) for simultaneously learning tasks with
varied input domains. Particularly, …
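Since the abstract is truncated, the sketch below only illustrates the standard (homogeneous) linear model of coregionalization kernel that multi-task GPs such as HSVLMC build on; it is not the paper's heterogeneous stochastic variational method. The function names (rbf, lmc_kernel) and the toy two-task data are illustrative assumptions, written in plain NumPy.

# Minimal sketch of an LMC kernel for multi-task GP regression.
# NOT the paper's HSVLMC; names and toy data are illustrative assumptions.
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential base kernel k_q(x, x')."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def lmc_kernel(X1, t1, X2, t2, B_list, kernel_params):
    """LMC covariance: K((x, i), (x', j)) = sum_q B_q[i, j] * k_q(x, x').

    X*: (n, d) inputs, t*: (n,) integer task indices,
    B_list: one (T, T) coregionalization matrix per latent GP q,
    kernel_params: one dict of base-kernel hyperparameters per latent GP q.
    """
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for B, params in zip(B_list, kernel_params):
        K += B[np.ix_(t1, t2)] * rbf(X1, X2, **params)
    return K

# Toy usage: two tasks sharing one input domain (the homogeneous setting).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
tasks = rng.integers(0, 2, size=40)
y = np.sin(X[:, 0]) + 0.3 * tasks + 0.05 * rng.standard_normal(40)

# One latent GP; B = w w^T + diag(kappa) keeps B positive semi-definite.
w = np.array([[1.0], [0.8]])
B = w @ w.T + np.diag([0.1, 0.1])

K = lmc_kernel(X, tasks, X, tasks, [B], [dict(lengthscale=1.0, variance=1.0)])
K += 0.05 * np.eye(len(X))                      # observation noise
alpha = np.linalg.solve(K, y)                   # GP posterior mean weights

# Predict the posterior mean for task 0 on a small test grid.
Xs = np.linspace(-3, 3, 5)[:, None]
Ks = lmc_kernel(Xs, np.zeros(5, dtype=int), X, tasks, [B],
                [dict(lengthscale=1.0, variance=1.0)])
print(Ks @ alpha)

The coregionalization matrices B_q encode inter-task correlations, which is the mechanism an MTGP uses to transfer knowledge across tasks; the heterogeneous setting the paper targets additionally has to reconcile input domains whose features differ across tasks.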