[P] MazeGPT - Transformer based maze generator
Oct. 5, 2023, 9:06 p.m. | /u/noah-hein
Machine Learning www.reddit.com
I recently did a summer research project implementing GPT-2 to generate mazes.
The core idea of the model is to combine several popular maze generation algorithms into one. The goal was for the transformer to identify key structural components via self-attention and piece together the different algorithms. Most maze generation algorithms leave an almost unique fingerprint (much like attractors in chaos theory). The end goal was to mimic a higher degree of randomness / make the …
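The post does not include the training-data pipeline, but the setup it describes (classical algorithms producing maze strings that a GPT-2-style model can learn from) can be sketched. Below is a minimal, hypothetical example: a recursive-backtracker generator (one of the popular algorithms alluded to) plus a serializer that flattens each maze into a token sequence suitable for language-model training. All function names and the token scheme (`R`/`D`/`.` for right/down passages) are assumptions for illustration, not the project's actual code.

```python
import random

def generate_maze(width, height, seed=None):
    """Recursive-backtracker (depth-first) maze on a width x height cell grid.
    Returns the set of carved passages, each a frozenset of two adjacent cells."""
    rng = random.Random(seed)
    visited = {(0, 0)}
    stack = [(0, 0)]
    passages = set()
    while stack:
        x, y = stack[-1]
        # Unvisited orthogonal neighbors of the current cell.
        neighbors = [(x + dx, y + dy)
                     for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= x + dx < width and 0 <= y + dy < height
                     and (x + dx, y + dy) not in visited]
        if neighbors:
            nxt = rng.choice(neighbors)          # carve toward a random neighbor
            passages.add(frozenset(((x, y), nxt)))
            visited.add(nxt)
            stack.append(nxt)
        else:
            stack.pop()                          # dead end: backtrack
    return passages

def maze_to_tokens(passages, width, height):
    """Serialize the maze row-major: one token per cell encoding whether a
    passage opens to the right (R) and/or downward (D); '.' means wall."""
    tokens = []
    for y in range(height):
        for x in range(width):
            right = frozenset(((x, y), (x + 1, y))) in passages
            down = frozenset(((x, y), (x, y + 1))) in passages
            tokens.append(f"{'R' if right else '.'}{'D' if down else '.'}")
    return " ".join(tokens)

maze = generate_maze(5, 4, seed=0)
print(maze_to_tokens(maze, 5, 4))
```

Because the backtracker carves a spanning tree, every maze has exactly `width * height - 1` passages; swapping in other algorithms (Prim's, Kruskal's, Wilson's) with the same serializer would give the mixed-algorithm corpus the post describes, each contributing its own statistical "fingerprint" for the transformer to pick up.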