Jan. 30, 2024, 1:21 p.m. | Karol Horosin

Hacker Noon (AI) | hackernoon.com

In this article, I take you through the process of deploying a smaller open-source Large Language Model (LLM) on AWS Lambda. The goal is to experiment with Microsoft Phi-2, a 2.7 billion parameter LLM, and explore its applications in scenarios like processing sensitive data or generating outputs in languages other than English. I walk you through setting up the environment, creating a Dockerized Lambda function, and deploying the LLM. Throughout the tutorial, we delve into performance metrics, cost considerations, and potential …
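For orientation, a deployment like the one described typically centers on a container-image Lambda whose handler loads the model once per cold start and reuses it on warm invocations. The sketch below is a minimal illustration, assuming llama-cpp-python and a GGUF build of Phi-2 baked into the image; the model path, prompt handling, and generation parameters are illustrative assumptions, not details taken from the article.

```python
# handler.py - minimal sketch of a container-image Lambda serving Phi-2.
# Assumes llama-cpp-python is installed in the image and a GGUF build of
# Phi-2 is copied to /opt/model/phi-2.gguf (hypothetical path, not from the article).
import json
from llama_cpp import Llama

# Load once at module import so warm invocations skip the expensive model load.
llm = Llama(model_path="/opt/model/phi-2.gguf", n_ctx=2048)

def handler(event, context):
    prompt = event.get("prompt", "Explain AWS Lambda in one sentence.")
    result = llm(prompt, max_tokens=256, temperature=0.2)
    return {
        "statusCode": 200,
        "body": json.dumps({"completion": result["choices"][0]["text"]}),
    }
```

Keeping the model load at module scope is the usual trade-off here: cold starts get slower, but subsequent requests on the same container avoid reloading the weights.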

