Oct. 10, 2022, 1:16 a.m. | Jianyi Zhang, Yiran Chen, Jianshu Chen

cs.CL updates on arXiv.org

Developing neural architectures that are capable of logical reasoning has
become increasingly important for a wide range of applications (e.g., natural
language processing). Towards this grand objective, we propose a symbolic
reasoning architecture that chains many join operators together to model output
logical expressions. In particular, we demonstrate that such an ensemble of
join-chains can express a broad subset of "tree-structured" first-order
logical expressions, named FOET, which is particularly useful for modeling
natural languages. To endow it with differentiable learning …
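
To make the chaining idea concrete, below is a minimal NumPy sketch of one way a join operator over [0, 1]-valued ("probabilistic") predicates could be relaxed and composed. The function names, the sum-then-clip relaxation of the existential quantifier, and the toy dimensions are illustrative assumptions for this sketch, not the paper's exact operator.

```python
import numpy as np

def soft_join(p_xy: np.ndarray, q_y: np.ndarray) -> np.ndarray:
    """One relaxed join step (illustrative assumption, not the paper's operator).

    Approximates r(x) = exists y . p(x, y) AND q(y) over [0, 1]-valued
    predicates via a sum-then-clip relaxation:
        r(x) ~= min(1, sum_y p(x, y) * q(y))

    p_xy: (n, n) matrix of binary-predicate scores p(x, y)
    q_y:  (n,)   vector of unary-predicate scores q(y)
    """
    return np.minimum(1.0, p_xy @ q_y)

def join_chain(relations: list[np.ndarray], q_y: np.ndarray) -> np.ndarray:
    """Chain several join operators end to end, mirroring the abstract's
    description of join-chains; each step consumes one binary relation."""
    out = q_y
    for p_xy in relations:
        out = soft_join(p_xy, out)
    return out

# Toy usage: two chained joins over 4 entities.
rng = np.random.default_rng(0)
rels = [rng.random((4, 4)) for _ in range(2)]
unary = rng.random(4)
print(join_chain(rels, unary))  # 4 scores, one per value of the free variable x
```

Each `soft_join` is structurally a matrix-vector product followed by a nonlinearity, which is why attention-style modules are a natural candidate for learning such steps; an ensemble of these chains then corresponds to running several chains (e.g., one per head) in parallel.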

arxiv, attention head, join, multi-head, multi-head attention, network, reasoning, transformer
