Table Pre-training: A Survey on Model Architectures, Pretraining Objectives, and Downstream Tasks. (arXiv:2201.09745v2 [cs.CL] UPDATED)
Jan. 28, 2022, 2:10 a.m. | Haoyu Dong, Zhoujun Cheng, Xinyi He, Mengyu Zhou, Anda Zhou, Fan Zhou, Ao Liu, Shi Han, Dongmei Zhang
cs.CL updates on arXiv.org
Since a vast number of tables can be easily collected from web pages,
spreadsheets, PDFs, and various other document types, a flurry of table
pre-training frameworks has been proposed following the success of
pre-training on text and images, and these frameworks have achieved new
state-of-the-art results on various tasks such as table question answering,
table type recognition, column relation classification, table search, and
formula prediction. To fully use the supervision signals in unlabeled tables,
a variety of pre-training objectives have been designed and evaluated, …