[D] Why use hard-coded tokenization in NLP instead of a learned tokenization?
Aug. 13, 2022, 3:38 a.m. | /u/QLaHPD
Machine Learning www.reddit.com
Why not use a prior network that receives the raw text as input and generates a learned output for the main network? I believe that letting the neural network itself tokenize the text is the best way to process …
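One way to picture the idea in the question is a small "prior network" that embeds raw bytes and pools them into learned pseudo-tokens for a downstream model, rather than using a fixed tokenizer. The sketch below is a minimal, illustrative assumption of what such a front end could look like (random embeddings, fixed-size mean pooling); all names, sizes, and the pooling scheme are hypothetical, not from the post.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "prior network" front end: embed raw UTF-8 bytes, then pool
# fixed-size windows of byte embeddings into learned pseudo-token vectors
# that a main network would consume. Sizes below are arbitrary assumptions.
EMBED_DIM = 16   # dimensionality of each byte embedding
WINDOW = 4       # number of bytes pooled into one pseudo-token

# In a real system these embeddings would be trained end to end with the
# main network; here they are random placeholders.
byte_embeddings = rng.normal(size=(256, EMBED_DIM))

def learned_tokenize(text: str) -> np.ndarray:
    """Map raw text to a sequence of pseudo-token vectors."""
    raw = np.frombuffer(text.encode("utf-8"), dtype=np.uint8)
    pad = (-len(raw)) % WINDOW            # pad so bytes divide into windows
    raw = np.pad(raw, (0, pad))
    emb = byte_embeddings[raw]            # shape: (n_bytes, EMBED_DIM)
    # mean-pool each window of byte embeddings into one "token" vector
    return emb.reshape(-1, WINDOW, EMBED_DIM).mean(axis=1)

tokens = learned_tokenize("hello world")
print(tokens.shape)  # (3, 16): 11 bytes padded to 12, pooled in windows of 4
```

A trainable version of this front end (e.g. strided convolutions over byte embeddings, learned jointly with the main model) is essentially what byte-level and character-level approaches explore, at the cost of longer input sequences and more compute than a fixed subword tokenizer.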