March 2, 2024, 7:13 p.m. | /u/Raxume

Deep Learning www.reddit.com

Hey everyone,

I've recently written an article demonstrating some ways to use Apple Silicon's GPU for deep learning tasks. Many folks aren't aware that you can get reasonable inference performance out of these chips. I hope you find it useful.


[https://towardsdatascience.com/3-ways-to-leverage-apple-silicons-gpu-for-deep-learning-2cbb5b268b76?sk=b48e57602164e9bff7dfdeca93812425](https://towardsdatascience.com/3-ways-to-leverage-apple-silicons-gpu-for-deep-learning-2cbb5b268b76?sk=b48e57602164e9bff7dfdeca93812425)
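For anyone who just wants to see the idea in action before reading: here's a minimal sketch of one common route, PyTorch's MPS backend (this is my own illustrative example, not code from the article, which may cover different approaches).

```python
import torch

# Use the Metal Performance Shaders (MPS) backend if this machine is
# Apple Silicon and the backend is available; otherwise fall back to CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# A tiny stand-in model, moved to the Apple Silicon GPU for inference.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
).to(device)
model.eval()

# Run a batch of dummy inputs through the model on the GPU.
with torch.no_grad():
    x = torch.randn(32, 128, device=device)
    logits = model(x)
    print(logits.shape, logits.device)
```

Swapping `"cpu"` for `"mps"` is usually all it takes to move an existing PyTorch inference workload onto the Apple Silicon GPU.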

