Web: http://arxiv.org/abs/2205.02460

May 6, 2022, 1:11 a.m. | Yongqi Zhang, Zhanke Zhou, Quanming Yao, Yong Li

cs.LG updates on arXiv.org

While hyper-parameters (HPs) are important for knowledge graph (KG) learning,
existing methods fail to search them efficiently. To address this problem, we
first analyze the properties of different HPs and measure their transferability
from a small subgraph to the full graph. Based on this analysis, we propose
KGTuner, an efficient two-stage search algorithm that explores HP
configurations on a small subgraph in the first stage and transfers the
top-performing configurations for fine-tuning on the large full graph in the
second …
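The two-stage idea described above can be sketched in a few lines. This is a hedged illustration, not the authors' implementation: the configuration list, the evaluation callables, and the `top_k` cutoff are all hypothetical stand-ins for KGTuner's actual subgraph sampling and fine-tuning procedures.

```python
def two_stage_search(configs, eval_on_subgraph, eval_on_full_graph, top_k=3):
    """Two-stage HP search sketch: rank all configurations cheaply on a
    small subgraph, then re-evaluate only the shortlist on the full graph.

    configs            -- iterable of candidate HP configurations
    eval_on_subgraph   -- cheap scoring function (higher is better)
    eval_on_full_graph -- expensive scoring function (higher is better)
    """
    # Stage 1: score every configuration on the small subgraph (cheap).
    ranked = sorted(configs, key=eval_on_subgraph, reverse=True)
    shortlist = ranked[:top_k]
    # Stage 2: re-evaluate only the top performers on the full graph (costly).
    return max(shortlist, key=eval_on_full_graph)

# Toy usage: candidate HPs are plain numbers, and the subgraph score is
# assumed to correlate with the full-graph score (the transferability that
# the paper measures).
best = two_stage_search(
    range(10),
    eval_on_subgraph=lambda c: -(c - 3) ** 2,
    eval_on_full_graph=lambda c: -(c - 3) ** 2,
)
print(best)  # → 3
```

The payoff is that `eval_on_full_graph` runs only `top_k` times instead of once per candidate, which is where the efficiency claim in the abstract comes from.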

