RPLKG: Robust Prompt Learning with Knowledge Graph. (arXiv:2304.10805v1 [cs.AI])
cs.LG updates on arXiv.org arxiv.org
Large-scale pre-trained models are known to be transferable and to
generalize well to unseen datasets. Recently, multimodal pre-trained
models such as CLIP have shown significant performance improvements
across diverse experiments. However, when the labeled dataset is limited,
generalizing to a new dataset or domain remains challenging. To improve
generalization performance in few-shot learning, diverse approaches have
been proposed, such as prompt learning and adapters. However, the current
few-shot adaptation methods are not interpretable, and they …
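For context on the prompt-learning approach the abstract refers to, here is a minimal sketch of CoOp-style prompt learning in PyTorch. This is an illustration of the general technique, not the paper's RPLKG method: a small set of learnable context vectors is prepended to frozen class-name embeddings, and only those context vectors are optimized during few-shot adaptation. The class and parameter names are hypothetical, and the frozen embeddings are stand-ins for a real text encoder's output.

```python
import torch
import torch.nn as nn

class PromptLearner(nn.Module):
    """Sketch of prompt learning: a few learnable context vectors
    (the "soft prompt") are prepended to each class-name embedding.
    Only the context is trained; the encoder side stays frozen."""

    def __init__(self, n_classes: int, n_ctx: int = 4, dim: int = 512):
        super().__init__()
        # Learnable context shared across all classes.
        self.ctx = nn.Parameter(torch.randn(n_ctx, dim) * 0.02)
        # Stand-in for frozen class-name token embeddings
        # (a real system would take these from CLIP's text encoder).
        self.register_buffer("cls_emb", torch.randn(n_classes, 1, dim))

    def forward(self) -> torch.Tensor:
        n_classes = self.cls_emb.shape[0]
        # Broadcast the shared context to every class.
        ctx = self.ctx.unsqueeze(0).expand(n_classes, -1, -1)
        # Result: [n_classes, n_ctx + 1, dim] prompt token sequences,
        # which would then be fed through the frozen text encoder.
        return torch.cat([ctx, self.cls_emb], dim=1)

learner = PromptLearner(n_classes=10)
prompts = learner()
print(prompts.shape)  # torch.Size([10, 5, 512])
```

In few-shot training, a cross-entropy loss over image-text similarities backpropagates into `self.ctx` alone, which is what keeps the adaptation cheap when labels are scarce.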