May 2, 2024, 8:44 p.m. | Allen Institute for AI

Allen Institute for AI www.youtube.com

Abstract: Large language models are good at learning semantic latent spaces, and the resulting contextual embeddings from these models serve as powerful representations of information. In this talk, I present two novel uses of semantic distances in these latent spaces. In the first part, I introduce BERTScore, an algorithm designed to measure the similarity between machine translation outputs and human gold standards. BERTScore approximates a form of transport distance to match tokens in the generated and human text. In the …
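The token-matching the abstract describes can be sketched in a few lines. BERTScore's published formulation greedily matches each token to its most similar token on the other side by cosine similarity of contextual embeddings; the toy vectors below stand in for real BERT embeddings, and `bertscore_f1` is an illustrative name, not the library's API.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def bertscore_f1(cand, ref):
    """Greedy-matching F1 over token embeddings.

    cand, ref: lists of embedding vectors (one per token).
    Recall matches each reference token to its best candidate
    token; precision does the reverse.
    """
    recall = sum(max(cosine(r, c) for c in cand) for r in ref) / len(ref)
    precision = sum(max(cosine(c, r) for r in ref) for c in cand) / len(cand)
    return 2 * precision * recall / (precision + recall)

# Toy example: identical token embeddings score a perfect 1.0.
cand = [[1.0, 0.0], [0.0, 1.0]]
ref = [[1.0, 0.0], [0.0, 1.0]]
score = bertscore_f1(cand, ref)
```

Because each token keeps only its best match, this greedy scheme is a cheap stand-in for a full optimal-transport alignment, which is the "form of transport distance" the abstract alludes to.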

