[D] BitNet 1-b/b1.58 LLMs - is that a threat to nvidia?
March 2, 2024, 10:42 a.m. | /u/tunggad
Machine Learning www.reddit.com
Is that real? It sounds too good to be true, right? If it holds up, it not only reduces the VRAM capacity and bandwidth required to train and run LLMs, it also suggests simplified hardware implementations: with ternary weights there is no need for matmul, only addition (and subtraction).

Isn't that a threat to Nvidia (the stock), and to AMD as well?
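For intuition, here is a minimal sketch (not the paper's actual kernel) of why BitNet b1.58's ternary weights eliminate multiplications: with every weight in {-1, 0, +1}, each output element of a weight-matrix/activation product is just the sum of some activations minus the sum of others.

```python
import numpy as np

def ternary_matvec(W, x):
    """Compute W @ x where W has entries only in {-1, 0, +1},
    using no multiplications: for each row, add the activations
    where the weight is +1 and subtract those where it is -1."""
    out = np.empty(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        out[i] = x[W[i] == 1].sum() - x[W[i] == -1].sum()
    return out

# toy example: 2x3 ternary weight matrix, fp activations
W = np.array([[1, -1, 0],
              [0,  1, 1]])
x = np.array([2.0, 3.0, 5.0])
print(ternary_matvec(W, x))  # same result as W @ x
```

In real hardware the win is that the multiply-accumulate units of a GPU tensor core could, in principle, be replaced by much cheaper adder trees, which is the basis of the simplified-hardware speculation.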