Please help understand example code from Hugging Face for GPT2
July 20, 2022, 2:15 a.m. | /u/beatleinabox
Natural Language Processing www.reddit.com
`from transformers import GPT2LMHeadModel, GPT2Tokenizer`
`tokenizer = GPT2Tokenizer.from_pretrained("gpt2")`
`model = GPT2LMHeadModel.from_pretrained("gpt2")`
`inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")`
`outputs = model(**inputs, labels=inputs["input_ids"])`
`loss = outputs.loss  # language-modeling cross-entropy loss`
`logits = outputs.logits  # per-position next-token scores`
The code above is from the Hugging Face documentation. Assuming I have an optimizer and then do something like:
`loss.backward()`
`optimizer.step()`
Am I successfully "fine-tuning" the model on one input example? What exactly happens in the line:
`outputs = model(**inputs, labels=inputs["input_ids"])`
Why are the labels the same tensor as the input IDs?
Is it training as follows? Given …
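For reference, on the "why are the labels the input IDs" point: GPT-2 is trained on next-token prediction, so when `labels=input_ids`, the model internally shifts the labels one position before computing cross-entropy, scoring position t's logits against token t+1. A minimal sketch of that alignment, assuming toy token IDs (the function name `next_token_pairs` is my own, not part of the transformers API):

```python
# Hypothetical sketch of the label alignment GPT2LMHeadModel performs
# internally when you pass labels=input_ids.

def next_token_pairs(input_ids):
    """Pair each context position with the token it must predict."""
    # Position t's logits are scored against token t+1, so the final
    # token has no target and the first token is never a target.
    contexts = input_ids[:-1]
    targets = input_ids[1:]
    return list(zip(contexts, targets))

# Illustrative token IDs standing in for tokenizer("Hello, my dog is cute").
ids = [15496, 11, 616, 3290, 318, 13779]
print(next_token_pairs(ids))
```

The cross-entropy loss is then averaged over those (context, target) pairs, so a single `loss.backward()` plus `optimizer.step()` is indeed one gradient update of fine-tuning on that one example.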