How Hugging Face improved Text Generation performance with XLA
Nov. 28, 2022 | The TensorFlow Blog (blog.tensorflow.org)
Posted by The Hugging Face Team 🤗
Language models have bloomed in the past few years thanks to the advent of the Transformer architecture. Although Transformers can be used in many NLP applications, one is particularly alluring: text generation. It caters to the practical goals of automating verbal tasks and to our dreams of future interactions with chatbots.
Because text generation can significantly impact user experience, optimizing the generation process for throughput and latency is crucial. To that end, …
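The optimization the title refers to is XLA compilation of the generation loop. In 🤗 Transformers' TensorFlow models, the documented pattern is to wrap `model.generate` in a `tf.function` with `jit_compile=True` and pad inputs to a fixed shape so XLA does not retrace and recompile on every new input length. A minimal sketch (the `gpt2` checkpoint and the prompt are illustrative choices, not mandated by the post):

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForCausalLM

# Load a small causal LM; left-padding keeps generated tokens contiguous.
tokenizer = AutoTokenizer.from_pretrained("gpt2", padding_side="left")
tokenizer.pad_token = tokenizer.eos_token
model = TFAutoModelForCausalLM.from_pretrained("gpt2")

# jit_compile=True hands the traced generation graph to XLA.
xla_generate = tf.function(model.generate, jit_compile=True)

# Pad to a fixed length: a stable input shape avoids recompilation,
# which is where the latency win comes from after the first call.
inputs = tokenizer(
    ["TensorFlow is"], return_tensors="tf",
    padding="max_length", max_length=32,
)
outputs = xla_generate(**inputs, max_new_tokens=16)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Note that the first call pays a one-time compilation cost; subsequent calls with the same input shape reuse the compiled program, so this pattern pays off for repeated generation, not single shots.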