all AI news
The Random Transformer
Jan. 10, 2024, 5:09 a.m. | Simon Willison's Weblog (simonwillison.net)
"Understand how transformers work by demystifying all the math behind them" - Omar Sanseviero from Hugging Face meticulously implements the transformer architecture behind LLMs from scratch using Python and numpy. There's a lot to take in here but it's all very clearly explained.
Via Hacker News
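To give a flavour of the kind of from-scratch numpy code the post works through, here is a minimal sketch of scaled dot-product attention, the core operation of the transformer architecture. The function and variable names here are illustrative, not necessarily the ones Sanseviero uses:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # pairwise query-key similarities
    weights = softmax(scores, axis=-1)  # each row is a distribution over keys
    return weights @ V, weights        # weighted sum of value vectors

# Toy example: 3 tokens, embedding dimension 4.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of `w` sums to 1, so every output vector is a convex combination of the value vectors; the full article builds this up alongside multi-head attention, positional encodings, and the rest of the architecture.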