April 8, 2024, 9:10 a.m. | /u/MrCatberry

Deep Learning | www.reddit.com

Hi guys!

I need some input on how to work around this:

I'm trying to create a Real-ESRGAN Compact upscaling model with NeoSR, but my dataset is quite big:

~28 million 256x256 images as GT, and the same number of 128x128 LQ images, for a 2x upscaling model.

The dataset is about 600GB. I tried both I/O options, disk and lmdb, but I always get a memory error: not VRAM, but normal RAM. It seems to try to load …
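For reference, the direction I'd expect to work is reading each GT/LQ pair from disk lazily, per item, instead of preloading everything into RAM. Here's a minimal PyTorch sketch of that idea; the `gt/` and `lq/` folder layout and same-filename pairing are my assumptions, and this is not neosr's actual loader:

```python
# Minimal sketch: decode each GT/LQ pair on demand instead of preloading
# the whole 600GB dataset. Folder layout (gt/, lq/) and filename pairing
# are assumptions, not neosr's real dataset code.
import os

from PIL import Image
from torch.utils.data import DataLoader, Dataset
from torchvision.transforms.functional import to_tensor


class PairedImageDataset(Dataset):
    def __init__(self, gt_dir: str, lq_dir: str):
        # Only the file name list lives in memory (~28M strings),
        # never the pixel data itself.
        self.gt_dir = gt_dir
        self.lq_dir = lq_dir
        self.names = sorted(os.listdir(gt_dir))

    def __len__(self) -> int:
        return len(self.names)

    def __getitem__(self, idx: int):
        name = self.names[idx]
        # Each image is opened, decoded, and released per access.
        gt = to_tensor(Image.open(os.path.join(self.gt_dir, name)).convert("RGB"))
        lq = to_tensor(Image.open(os.path.join(self.lq_dir, name)).convert("RGB"))
        return {"gt": gt, "lq": lq}


# Worker processes stream batches from disk as training consumes them.
loader = DataLoader(
    PairedImageDataset("gt", "lq"),
    batch_size=16,
    shuffle=True,
    num_workers=8,
    pin_memory=True,
)
```

With this pattern, peak RAM is bounded by roughly batch_size × num_workers × prefetch, independent of how many images sit on disk.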

