all AI news
For text classification, how are these 2 different?
Nov. 16, 2023, 2:35 p.m. | /u/ytu876
Natural Language Processing www.reddit.com
I read somewhere about these two approaches to text classification:
1. BERT + a new head layer: freeze BERT and train only the new head layer.
2. SBERT + logistic regression: use SBERT to generate sentence embeddings, then train a logistic regression as a traditional classifier.
How are these two different? I believe in approach #1 the head layer is just a regular feed-forward layer, which is not much different from logistic regression. Both use an LLM to convert …
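The intuition in the question can be made concrete: when the head is a single linear layer with a sigmoid trained with binary cross-entropy, it is mathematically the same model as logistic regression on the frozen embeddings. A minimal NumPy sketch (the random vectors below stand in for BERT [CLS] or SBERT pooled embeddings, which is an assumption here) shows this:

```python
import numpy as np

# Hypothetical frozen sentence embeddings (in practice these would come
# from BERT's [CLS] token or SBERT's pooled output) plus binary labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))        # 200 "sentences", 8-dim embeddings
true_w = rng.normal(size=8)
y = (X @ true_w > 0).astype(float)   # synthetic binary labels

# Training a single linear layer + sigmoid on frozen features with
# binary cross-entropy IS logistic regression: same model, same loss.
w = np.zeros(8)
b = 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid "head"
    grad_w = X.T @ (p - y) / len(y)         # BCE gradient w.r.t. weights
    grad_b = (p - y).mean()                 # BCE gradient w.r.t. bias
    w -= lr * grad_w
    b -= lr * grad_b

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = ((p > 0.5) == y).mean()
```

The practical differences between the two approaches are therefore mostly about which encoder produces the features (SBERT is fine-tuned so its pooled embeddings are directly useful; vanilla BERT's [CLS] vector is not trained to be a good sentence representation without fine-tuning) and about tooling, not about the classifier itself.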