ProTeCt: Prompt Tuning for Taxonomic Open Set Classification
March 29, 2024, 4:45 a.m. | Tz-Ying Wu, Chih-Hui Ho, Nuno Vasconcelos
cs.CV updates on arXiv.org arxiv.org
Abstract: Visual-language foundation models, like CLIP, learn generalized representations that enable zero-shot open-set classification. Few-shot adaptation methods, based on prompt tuning, have been shown to further improve performance on downstream datasets. However, these methods do not fare well in the taxonomic open set (TOS) setting, where the classifier is asked to make predictions from label sets across different levels of semantic granularity. Frequently, they infer incorrect labels at coarser taxonomic class levels, even when the inference …
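To make the taxonomic open set (TOS) setting concrete, here is a minimal toy sketch of the failure mode the abstract describes: a fine-grained prediction whose parent class disagrees with an independently made coarse-level prediction. The taxonomy and all scores below are hypothetical placeholders; in a real pipeline they would come from a vision-language model such as CLIP.

```python
# Toy illustration of the taxonomic open set (TOS) setting.
# The taxonomy and the similarity scores are hypothetical; a real system
# would compute scores with a vision-language model such as CLIP.

taxonomy = {"animal": ["dog", "cat"], "vehicle": ["car", "bus"]}

# Hypothetical per-image scores from independent fine- and coarse-level
# classifiers (e.g., softmaxed image-text similarities).
fine_scores = {"dog": 0.40, "cat": 0.35, "car": 0.15, "bus": 0.10}
coarse_scores = {"animal": 0.45, "vehicle": 0.55}

def predict(scores):
    """Return the label with the highest score."""
    return max(scores, key=scores.get)

fine_pred = predict(fine_scores)      # fine-level prediction
coarse_pred = predict(coarse_scores)  # coarse-level prediction

# Look up the parent of the fine prediction and check hierarchical
# consistency: the coarse prediction should equal that parent.
parent = next(p for p, kids in taxonomy.items() if fine_pred in kids)
consistent = (parent == coarse_pred)
print(fine_pred, coarse_pred, consistent)  # -> dog vehicle False
```

Here the fine-level classifier says "dog" but the coarse-level one says "vehicle", so the two predictions are taxonomically inconsistent, which is the kind of error the paper reports prompt-tuned models making at coarser levels.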