[D] BitNet 1-b/b1.58 LLMs - is that a threat to nvidia?
March 2, 2024, 10:42 a.m. | /u/tunggad
Machine Learning www.reddit.com
Is that real? It sounds too good to be true, right? If it is, it not only reduces the VRAM capacity and bandwidth required to train and run LLMs, it also suggests a simpler hardware implementation: with ternary weights there is no need for matmul, only the + operation.

Is that not a threat to Nvidia (stock), and to AMD as well?
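To make the "no matmul, only +" claim concrete: in BitNet b1.58 each weight is ternary, in {-1, 0, +1}, so a matrix-vector product degenerates into additions and subtractions of activations. The sketch below is a hypothetical illustration (not code from the paper or the post), using a made-up `ternary_matvec` helper, and it checks that the add-only result matches an ordinary matmul:

```python
import numpy as np

def ternary_matvec(W, x):
    """Matrix-vector product where W holds only {-1, 0, 1}.

    For each output row: add the inputs where the weight is +1,
    subtract where it is -1, skip where it is 0. No multiplies
    on activations are needed.
    """
    out = np.empty(W.shape[0], dtype=np.float64)
    for i, row in enumerate(W):
        out[i] = x[row == 1].sum() - x[row == -1].sum()
    return out

rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8))       # ternary weights in {-1, 0, 1}
x = rng.standard_normal(8)                 # activations

# The add/subtract version agrees with the standard matmul.
assert np.allclose(ternary_matvec(W, x), W @ x)
```

This is why the hardware story matters: a multiply-accumulate unit shrinks to an adder tree plus a sign/skip select, which is far cheaper in silicon area and energy than FP16 multipliers.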