Can Large Language Models Learn the Physics of Metamaterials? An Empirical Study with ChatGPT
April 25, 2024, 7:43 p.m. | Darui Lu, Yang Deng, Jordan M. Malof, Willie J. Padilla
cs.LG updates on arXiv.org
Abstract: Large language models (LLMs) such as ChatGPT, Gemini, LLaMA, and Claude are trained on massive quantities of text parsed from the internet and have shown a remarkable ability to respond to complex prompts in a manner often indistinguishable from humans. We present an LLM fine-tuned on up to 40,000 data samples that can predict electromagnetic spectra over a range of frequencies given a text prompt that specifies only the metasurface geometry. Results are compared to conventional …
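The setup the abstract describes, fine-tuning an LLM to map a text description of a metasurface geometry to a spectrum, implies serializing each training pair into text. A minimal sketch of one such record is below; the field names, geometry parameters, and prompt wording are illustrative assumptions, not taken from the paper.

```python
import json

def make_example(geometry, spectrum):
    """Format one (geometry -> spectrum) pair as a prompt/completion
    record, a common shape for LLM fine-tuning data.
    Geometry keys and prompt phrasing are hypothetical."""
    prompt = (
        "Predict the absorptivity spectrum for a metasurface with "
        f"resonator width {geometry['width_um']} um, "
        f"period {geometry['period_um']} um, "
        f"dielectric thickness {geometry['thickness_um']} um."
    )
    # Serialize the spectrum as space-separated values so the LLM
    # can emit it as plain text.
    completion = " ".join(f"{a:.3f}" for a in spectrum)
    return {"prompt": prompt, "completion": completion}

record = make_example(
    {"width_um": 1.8, "period_um": 3.0, "thickness_um": 0.4},
    [0.12, 0.35, 0.91, 0.47],  # toy absorptivity values at sampled frequencies
)
print(json.dumps(record))
```

Tens of thousands of such records (the paper reports up to 40,000) would then be fed to a standard fine-tuning pipeline; the numeric-to-text serialization choice is one of the design decisions such a study has to make.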