April 25, 2024, 7:43 p.m. | Darui Lu, Yang Deng, Jordan M. Malof, Willie J. Padilla

cs.LG updates on arXiv.org

arXiv:2404.15458v1 Announce Type: cross
Abstract: Large language models (LLMs) such as ChatGPT, Gemini, Llama, and Claude are trained on massive quantities of text parsed from the internet and have shown a remarkable ability to respond to complex prompts in a manner often indistinguishable from humans. We present an LLM fine-tuned on up to 40,000 data samples that can predict electromagnetic spectra over a range of frequencies given a text prompt that only specifies the metasurface geometry. Results are compared to conventional …
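The approach described in the abstract (fine-tuning an LLM on geometry-description prompts paired with spectral outputs) implies training data in a prompt/completion format. Below is a minimal sketch of how one such record might be constructed; the field names, prompt wording, and geometry parameters are illustrative assumptions, not the paper's actual data format.

```python
import json

def make_record(geometry_desc, spectrum):
    """Format one (geometry description -> spectrum) pair as a
    prompt/completion record, the common shape for LLM fine-tuning
    datasets. All wording here is a hypothetical illustration."""
    prompt = (
        "Predict the absorptivity spectrum for a metasurface with "
        f"{geometry_desc}."
    )
    # Serialize the spectrum as space-separated text so the LLM can
    # emit it token by token.
    completion = " ".join(f"{v:.3f}" for v in spectrum)
    return {"prompt": prompt, "completion": completion}

# Hypothetical example: a square-resonator unit cell and a short
# made-up spectrum (real records would hold many frequency points).
record = make_record(
    "square resonator, period 2.5 um, side 1.8 um, thickness 0.2 um",
    [0.12, 0.35, 0.88, 0.41],
)
print(json.dumps(record))
```

One record per geometry, repeated for tens of thousands of simulated samples, would yield a fine-tuning corpus of the scale the abstract mentions.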

