April 16, 2024, 9 a.m. | InfoWorld Machine Learning | www.infoworld.com

In the past two years, I've been involved with generative AI projects using large language models (LLMs) more than with traditional systems, and I've become nostalgic for serverless cloud computing. LLM applications range from enhancing conversational AI to providing complex analytical solutions across industries and functions. Many enterprises deploy these models on cloud platforms because there is a ready-made ecosystem of public cloud providers and it's the path of least resistance. However, it's not cheap.
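To make the cost point concrete, here is a minimal back-of-envelope sketch of token-based spend for a hosted LLM. The request volume, token counts, and per-1,000-token prices below are hypothetical placeholders, not any provider's actual rates; the point is only that per-token charges compound quickly at enterprise traffic levels.

```python
# Back-of-envelope estimate of monthly spend for a hosted LLM deployment.
# All figures are hypothetical placeholders, not real provider pricing.

def monthly_llm_cost(
    requests_per_day: int,
    avg_input_tokens: int,
    avg_output_tokens: int,
    price_per_1k_input: float,   # USD per 1,000 input tokens (assumed)
    price_per_1k_output: float,  # USD per 1,000 output tokens (assumed)
    days_per_month: int = 30,
) -> float:
    """Return the estimated monthly token cost in USD."""
    input_cost = requests_per_day * avg_input_tokens / 1000 * price_per_1k_input
    output_cost = requests_per_day * avg_output_tokens / 1000 * price_per_1k_output
    return (input_cost + output_cost) * days_per_month


if __name__ == "__main__":
    # Example: 50,000 requests/day, ~500 input and ~300 output tokens each,
    # at assumed rates of $0.01 per 1K input and $0.03 per 1K output tokens.
    estimate = monthly_llm_cost(50_000, 500, 300, 0.01, 0.03)
    print(f"Estimated monthly token cost: ${estimate:,.0f}")
```

Under those assumed numbers the estimate comes to roughly $21,000 a month for token charges alone, before adding networking, storage, or any fine-tuning and hosting fees.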

