April 17, 2023, 8:19 p.m. | Yaohua Zha, Jinpeng Wang, Tao Dai, Bin Chen, Zhi Wang, Shu-Tao Xia

cs.CV updates on arXiv.org arxiv.org

Recently, pre-trained point cloud models have found extensive applications in
downstream tasks such as object classification. However, these tasks often
require full fine-tuning of the models, a storage-intensive procedure that
limits the practical use of pre-trained models. Inspired by the success of
visual prompt tuning (VPT) in vision, we explore prompt tuning, an efficient
alternative to full fine-tuning for large-scale models, and apply it to point
cloud pre-trained models to reduce storage costs. However, it is non-trivial …
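The storage argument behind prompt tuning can be sketched in a few lines: the backbone is frozen, and only a small set of learnable prompt tokens (prepended to the input token embeddings) is trained and stored per downstream task. The sizes below are hypothetical, chosen only to illustrate the order-of-magnitude gap; they do not come from the paper.

```python
import numpy as np

# Hypothetical transformer backbone sizes (illustrative only, not from the paper).
embed_dim = 384
num_layers = 12
# Rough per-layer parameter count: ~4*d^2 for attention projections, ~8*d^2 for the MLP.
backbone_params = num_layers * (4 * embed_dim**2 + 8 * embed_dim**2)

# Full fine-tuning: every backbone weight must be updated and stored per task.
full_ft_trainable = backbone_params

# Prompt tuning: backbone stays frozen; only a few prompt tokens are trainable.
num_prompts = 10
prompts = np.random.randn(num_prompts, embed_dim)  # the only trainable tensor
prompt_trainable = prompts.size

print(full_ft_trainable, prompt_trainable)
```

With these illustrative sizes, per-task storage drops from millions of parameters to a few thousand, which is the efficiency motivation the abstract appeals to.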

