Sept. 25, 2023, 12:15 a.m. | /u/mtnwrw

Machine Learning www.reddit.com

I created an OpenGL/OpenGL ES based inference framework a while back which is largely GPU-agnostic and might be a good option for distributing multi-platform ML solutions, with targets ranging from Android through desktop to WebGL(2). Quite recently I added LLM support to it (restricted to 4-bit quantized Llama models for now).
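For readers unfamiliar with what "4-bit quantized" means in practice, here is a minimal illustrative sketch of symmetric per-group 4-bit weight quantization in Python. This is a generic technique sketch, not FyuseNet's actual quantization scheme or API; all names here are hypothetical:

```python
import numpy as np

def quantize_4bit(weights, group_size=32):
    """Symmetric per-group 4-bit quantization: each group of `group_size`
    floats is mapped to integer codes in [-7, 7] plus one fp32 scale."""
    w = weights.reshape(-1, group_size)
    scales = np.abs(w).max(axis=1, keepdims=True) / 7.0
    scales[scales == 0] = 1.0  # guard against all-zero groups
    q = np.clip(np.round(w / scales), -7, 7).astype(np.int8)
    return q, scales

def dequantize_4bit(q, scales):
    """Recover approximate fp32 weights from codes and per-group scales."""
    return (q.astype(np.float32) * scales).reshape(-1)

# Round-trip a random weight vector and measure the worst-case error.
rng = np.random.default_rng(0)
w = rng.standard_normal(256).astype(np.float32)
q, s = quantize_4bit(w)
w_hat = dequantize_4bit(q, s)
err = np.abs(w - w_hat).max()
```

The payoff is memory: each weight shrinks from 32 bits to 4 bits (plus a small per-group scale overhead), which is what makes running Llama-class models feasible on GPU memory budgets like those of phones and browsers.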

The LLM-enabled fork can be found [here](https://github.com/mtnwrw/fyusenet) (compilable sample code inside).

Maybe someone finds this useful. I'm also looking for collaborators to extend the functionality.

