March 14, 2024, 4:48 a.m. | Carlo Nicolini, Jacopo Staiano, Bruno Lepri, Raffaele Marino

cs.CL updates on arXiv.org

arXiv:2403.08739v1 Announce Type: new
Abstract: A substantial gap persists in understanding the reasons behind the exceptional performance of the Transformer architecture in NLP. A particularly unexplored area is the mechanistic description of how the distribution of parameters evolves over the course of training. In this work we suggest that looking at the time evolution of the statistical distribution of model parameters, and specifically at bifurcation effects, can help in understanding model quality, potentially reducing training costs and evaluation efforts and empirically …
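A minimal sketch of the general idea (not the paper's method): snapshot simple summary statistics of the full parameter distribution at each optimization step and watch how they evolve during training. The toy model, data, and the `track_param_stats` helper are illustrative assumptions, not code from the authors.

```python
# Minimal sketch (illustrative, not the paper's method): monitor how the
# distribution of all model parameters evolves over training steps.
import torch
import torch.nn as nn


def track_param_stats(model: nn.Module) -> dict:
    """Flatten all parameters and return basic distribution statistics."""
    flat = torch.cat([p.detach().flatten() for p in model.parameters()])
    std = flat.std()
    centered = flat - flat.mean()
    return {
        "mean": flat.mean().item(),
        "std": std.item(),
        # Excess kurtosis: one heavy-tail indicator that could be tracked over time.
        "kurtosis": ((centered / std) ** 4).mean().item() - 3.0,
    }


# Toy setup: a tiny regression model trained on random data.
torch.manual_seed(0)
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(256, 16), torch.randn(256, 1)

history = []
for step in range(100):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
    history.append(track_param_stats(model))  # one snapshot per step

print(history[0], history[-1])
```

Plotting such statistics against the training step is one simple way to look for qualitative changes (e.g., the bifurcation-like effects the abstract refers to) in the parameter distribution.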

Subjects: cs.CL, cs.AI, cond-mat.dis-nn, cond-mat.stat-mech
