PTQ4SAM: Post-Training Quantization for Segment Anything
May 7, 2024, 4:44 a.m. | Chengtao Lv, Hong Chen, Jinyang Guo, Yifu Ding, Xianglong Liu
cs.LG updates on arXiv.org
Abstract: Segment Anything Model (SAM) has achieved impressive performance in many computer vision tasks. However, as a large-scale model, the immense memory and computation costs hinder its practical deployment. In this paper, we propose a post-training quantization (PTQ) framework for Segment Anything Model, namely PTQ4SAM. First, we investigate the inherent bottleneck of SAM quantization attributed to the bimodal distribution in post-Key-Linear activations. We analyze its characteristics from both per-tensor and per-channel perspectives, and propose a Bimodal …
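To see why the bimodal activation distribution the abstract describes is a bottleneck for quantization, consider plain symmetric per-tensor uniform quantization (a generic sketch, not the paper's PTQ4SAM method): a second mode far from zero inflates the tensor's dynamic range, so the quantization step grows and the dense cluster of small values loses precision.

```python
import numpy as np

def quantize_uniform(x, num_bits=8):
    """Symmetric per-tensor uniform quantization, then dequantize.

    A single scale covers the whole tensor, so outlying modes
    stretch the step size for every value.
    """
    qmax = 2 ** (num_bits - 1) - 1
    scale = np.abs(x).max() / qmax
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)
    return q * scale  # dequantized values, for measuring error

rng = np.random.default_rng(0)

# Hypothetical bimodal activations: a dense mode near 0
# plus a small second mode far away, as in the abstract.
x_bimodal = np.concatenate([rng.normal(0.0, 1.0, 10000),
                            rng.normal(40.0, 1.0, 100)])

# Unimodal activations with the same per-mode spread, for comparison.
x_unimodal = rng.normal(0.0, 1.0, 10000)

err_bimodal = np.abs(quantize_uniform(x_bimodal) - x_bimodal).mean()
err_unimodal = np.abs(quantize_uniform(x_unimodal) - x_unimodal).mean()
```

With the far mode present, the per-tensor scale is set by values near 40 instead of near 4, so the mean quantization error on the dominant mode grows by roughly an order of magnitude; this is the per-tensor-vs-per-channel tension the authors analyze.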
Tags: arxiv, cs.cv, cs.lg, quantization, segment anything, training