Feb. 12, 2024, 3:08 p.m. | /u/graphbook

r/MachineLearning (www.reddit.com)



[Screenshot](https://preview.redd.it/g4oh19wy56ic1.png?width=5096&format=png&auto=webp&s=fd1fa1350f86281ba56c6daa1609b410f5120c71)

**[Project](https://cerbrec.com) origin:**

My colleague and I were MLEs/applied researchers, and we were constantly frustrated trying to troubleshoot and customize transformers in production NLP use cases. This goes back to when BERT came out on TensorFlow 1 and you couldn't really step through a model at all. To clarify: considerable effort is of course spent purely on cleaning data, but we found we could understand and fix problems much better by digging into the model architecture as well. …
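One common way to "dig into" a transformer rather than treating it as a black box is to capture intermediate activations with forward hooks. The sketch below is illustrative only (it is not the project's implementation): it builds a small `nn.TransformerEncoder` and records each layer's output so you can inspect shapes and values while debugging.

```python
import torch
import torch.nn as nn

# Toy model standing in for a production transformer (hypothetical sizes).
model = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=16, nhead=4, batch_first=True),
    num_layers=2,
)

# Capture each encoder layer's output via forward hooks.
activations = {}

def make_hook(name):
    def hook(module, inputs, output):
        activations[name] = output.detach()
    return hook

for i, layer in enumerate(model.layers):
    layer.register_forward_hook(make_hook(f"layer_{i}"))

x = torch.randn(1, 5, 16)  # (batch, seq_len, d_model)
out = model(x)

# Now intermediate tensors can be inspected layer by layer.
for name, act in activations.items():
    print(name, tuple(act.shape))
```

From here you can check norms, NaNs, or attention behavior per layer, which is the kind of step-through visibility that early TensorFlow 1 graphs made difficult.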

