all AI news
T5 Fine-tuning for Summarization with multiple GPUs
June 28, 2022, 5:49 a.m. | /u/gulab__jamun
Natural Language Processing www.reddit.com
I hope you are doing well.
I have fine-tuned T5 (t5-small) for summarization on an English dataset of movie reviews, following the script from the Hugging Face course. The link is given below.
https://huggingface.co/course/chapter7/5?fw=pt
The script works perfectly and I am getting good summarization results, but I want to fine-tune T5 for summarization using multiple GPUs.
I have seen that Nvidia Megatron-LM can be used for fine-tuning T5 for summarization on multiple GPUs, but is there any …
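For a model the size of t5-small, Megatron-LM's model parallelism is usually unnecessary; the course's `Seq2SeqTrainer` script can already do multi-GPU *data*-parallel training if it is launched with a distributed launcher, since `Trainer` detects the distributed environment and wraps the model in `DistributedDataParallel` automatically. A minimal sketch, assuming the course script has been saved locally as `train_summarization.py` (a hypothetical filename) and that four GPUs are visible on one machine:

```shell
# Option 1: torchrun (ships with PyTorch). Trainer reads the distributed
# environment variables it sets and switches to DistributedDataParallel.
torchrun --nproc_per_node=4 train_summarization.py

# Option 2: Hugging Face Accelerate. Answer the prompts in
# `accelerate config` once, then launch the same unmodified script:
accelerate launch --num_processes=4 train_summarization.py
```

Note that `per_device_train_batch_size` is per GPU, so the effective global batch size becomes per-device batch × number of GPUs; you may want to lower the per-device batch or adjust the learning rate to keep training comparable to the single-GPU run.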