all AI news
Dataset too big
April 8, 2024, 9:10 a.m. | /u/MrCatberry
Deep Learning www.reddit.com
I need some input on how to work around this:
I'm trying to train a Real-ESRGAN Compact upscaling model with NeoSR, but my dataset is quite big:
~28 million 256x256 images as GT, plus the same number of 128x128 LQ images, for a 2x upscaling model.
The dataset is about 600 GB. I tried both I/O options, disk and lmdb,
but I always get a memory error; not VRAM, but regular RAM. It seems to try to load …
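A common workaround for this kind of out-of-memory error is to keep only file paths in RAM and decode each GT/LQ pair lazily on access, rather than loading the whole dataset up front. Here is a minimal sketch of that idea in plain Python; the class and method names (`LazyPairedDataset`) are illustrative, not part of the NeoSR API, and the assumption is that images sit in parallel `gt/` and `lq/` directories with matching filenames:

```python
# Lazy paired dataset sketch: hold only paths in memory, read pixels on demand.
# ~28M path strings cost a few GB at most, versus ~600 GB of decoded images.
from pathlib import Path


class LazyPairedDataset:
    """Stores GT/LQ file paths; loads each pair only when indexed."""

    def __init__(self, gt_dir, lq_dir):
        self.gt_paths = sorted(Path(gt_dir).glob("*.png"))
        self.lq_paths = sorted(Path(lq_dir).glob("*.png"))
        if len(self.gt_paths) != len(self.lq_paths):
            raise ValueError("gt/lq image counts must match")

    def __len__(self):
        return len(self.gt_paths)

    def __getitem__(self, idx):
        # Decoding happens here, per item, so peak RAM stays at roughly
        # one pair (times any loader's worker/prefetch count).
        gt = self.gt_paths[idx].read_bytes()
        lq = self.lq_paths[idx].read_bytes()
        return gt, lq
```

Wrapped in a `torch.utils.data.DataLoader` (this class already has the `__len__`/`__getitem__` interface a map-style PyTorch dataset needs), memory use is then bounded by `num_workers * prefetch_factor * batch_size` pairs instead of the full 600 GB.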