Feb. 11, 2024, 9:37 a.m. | /u/Substantial-Push-179

r/MachineLearning | www.reddit.com

Hi everyone,

I'd like to share our project on open-type Named Entity Recognition (NER). Our model uses a BERT-like transformer encoder, so its computational overhead is minimal compared to using LLMs. I've put together a demo that runs on CPU in Google Colab. A minimal usage sketch is included after the links below.

Colab Demo: [https://colab.research.google.com/drive/1mhalKWzmfSTqMnR0wQBZvt9-ktTsATHB?usp=sharing](https://colab.research.google.com/drive/1mhalKWzmfSTqMnR0wQBZvt9-ktTsATHB?usp=sharing)

Code: [https://github.com/urchade/GLiNER](https://github.com/urchade/GLiNER)

Paper: [https://arxiv.org/abs/2311.08526](https://arxiv.org/abs/2311.08526)
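
Here is a minimal usage sketch, assuming the `gliner` package from the repo above is installed (e.g. via `pip install gliner`); the checkpoint name and the default threshold are illustrative and may differ from what the demo notebook uses:

```python
from gliner import GLiNER

# Load a pretrained GLiNER checkpoint (BERT-like encoder, runs fine on CPU).
# The exact checkpoint name here is an assumption; see the repo for available models.
model = GLiNER.from_pretrained("urchade/gliner_base")

text = "Steve Jobs co-founded Apple in Cupertino in 1976."

# Open-type NER: entity types are supplied at inference time as plain strings,
# so no retraining is needed to target new label sets.
labels = ["person", "organization", "location", "date"]

entities = model.predict_entities(text, labels, threshold=0.5)
for entity in entities:
    print(entity["text"], "=>", entity["label"])
```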

