Web: https://www.reddit.com/r/MachineLearning/comments/xgqwvu/r_hydra_attention_efficient_attention_with_many/

Sept. 17, 2022, 4:26 p.m. | /u/Singularian2501

Machine Learning reddit.com

Paper: [https://arxiv.org/abs/2209.07484](https://arxiv.org/abs/2209.07484)


>While transformers have begun to dominate many tasks in vision, applying them to large images is still computationally difficult. A large reason for this is that self-attention scales quadratically with the number of tokens, which in turn, scales quadratically with the image size. On larger images (e.g., 1080p), over 60% of the total computation in the network is spent solely on creating and applying attention matrices. We take a step toward solving this issue by introducing Hydra …
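The linked paper's trick is to use as many attention heads as there are feature channels, which collapses multi-head linear attention into an elementwise gating of one global feature vector, making the cost linear in tokens. A minimal NumPy sketch of that idea (function name and the L2-normalization feature map follow my reading of the paper; treat it as an illustration, not the reference implementation):

```python
import numpy as np

def hydra_attention(q, k, v):
    """Linear-time attention sketch with heads == feature channels.

    q, k, v: arrays of shape (N tokens, D features).
    Cost is O(N * D) instead of the O(N^2 * D) of standard self-attention.
    """
    # L2-normalize queries and keys along the feature dimension
    qn = q / np.linalg.norm(q, axis=-1, keepdims=True)
    kn = k / np.linalg.norm(k, axis=-1, keepdims=True)
    # Aggregate keys and values into a single global feature vector: (D,)
    global_feat = (kn * v).sum(axis=0)
    # Each token gates the shared global feature elementwise
    return qn * global_feat

# Example: 4 tokens, 8 feature channels
out = hydra_attention(np.random.randn(4, 8),
                      np.random.randn(4, 8),
                      np.random.randn(4, 8))
```

Note there is no N x N attention matrix anywhere: the key/value sum is computed once and shared across all queries, which is why compute scales linearly with image size.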

