Web: http://arxiv.org/abs/2201.11517

Jan. 28, 2022, 2:11 a.m. | Bhavin Choksi, Milad Mozafari, Rufin VanRullen, Leila Reddy

cs.LG updates on arXiv.org arxiv.org

The human hippocampus possesses "concept cells", neurons that fire when
presented with stimuli belonging to a specific concept, regardless of the
modality. Recently, similar concept cells were discovered in a multimodal
network called CLIP (Radford et al., 2021). Here, we ask whether CLIP can
explain the fMRI activity of the human hippocampus better than a purely visual
(or linguistic) model. We extend our analysis to a range of publicly available
uni- and multi-modal models. We demonstrate that "multimodality" stands out …

