Jan. 12, 2022, 2:10 a.m. | Mohammadreza Momenifar, Enmao Diao, Vahid Tarokh, Andrew D. Bragg

cs.LG updates on arXiv.org

Analyzing large-scale data from simulations of turbulent flows is memory
intensive and requires significant computational resources. This challenge
highlights the need for data compression techniques. In this study, we apply a
physics-informed deep learning technique based on vector quantization to
generate a discrete, low-dimensional representation of data from simulations of
three-dimensional turbulent flows. The deep learning framework is composed of
convolutional layers and incorporates physical constraints on the flow, such as
preserving incompressibility and the global statistical characteristics of the
velocity gradients. …
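
As a rough illustration of the approach described in the abstract, the sketch
below shows a small 3D convolutional autoencoder with a vector-quantization
bottleneck and a finite-difference divergence penalty that encourages
incompressibility in the reconstructed velocity field. It is a minimal PyTorch
sketch, not the authors' implementation: the layer sizes, codebook size, and
loss weights are illustrative assumptions.

# Minimal sketch (PyTorch) of a vector-quantized 3D convolutional autoencoder
# with an incompressibility penalty. Layer sizes, codebook size, and loss
# weights are illustrative assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    """Maps encoder outputs to the nearest codebook vector (straight-through gradient)."""
    def __init__(self, num_codes=512, code_dim=64, beta=0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, code_dim)
        self.codebook.weight.data.uniform_(-1.0 / num_codes, 1.0 / num_codes)
        self.beta = beta

    def forward(self, z):                                   # z: (B, C, D, H, W)
        B, C, D, H, W = z.shape
        flat = z.permute(0, 2, 3, 4, 1).reshape(-1, C)      # (N, C)
        dist = torch.cdist(flat, self.codebook.weight)      # (N, num_codes)
        idx = dist.argmin(dim=1)
        zq = self.codebook(idx).view(B, D, H, W, C).permute(0, 4, 1, 2, 3)
        # Codebook and commitment losses; straight-through estimator for gradients.
        vq_loss = F.mse_loss(zq, z.detach()) + self.beta * F.mse_loss(z, zq.detach())
        zq = z + (zq - z).detach()
        return zq, vq_loss

class VQAutoencoder3D(nn.Module):
    def __init__(self, code_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv3d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(32, code_dim, 4, stride=2, padding=1),
        )
        self.quantizer = VectorQuantizer(code_dim=code_dim)
        self.decoder = nn.Sequential(
            nn.ConvTranspose3d(code_dim, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose3d(32, 3, 4, stride=2, padding=1),
        )

    def forward(self, u):                      # u: (B, 3, D, H, W) velocity field
        zq, vq_loss = self.quantizer(self.encoder(u))
        return self.decoder(zq), vq_loss

def divergence_penalty(u, dx=1.0):
    """Finite-difference estimate of div(u); penalizes departures from incompressibility."""
    dudx = torch.gradient(u[:, 0], spacing=dx, dim=1)[0]
    dvdy = torch.gradient(u[:, 1], spacing=dx, dim=2)[0]
    dwdz = torch.gradient(u[:, 2], spacing=dx, dim=3)[0]
    return ((dudx + dvdy + dwdz) ** 2).mean()

# Training step: reconstruction + VQ + physics terms (weights are assumptions).
model = VQAutoencoder3D()
u = torch.randn(1, 3, 32, 32, 32)              # toy velocity snapshot
u_hat, vq_loss = model(u)
loss = F.mse_loss(u_hat, u) + vq_loss + 0.1 * divergence_penalty(u_hat)
loss.backward()

In a setup like this, the discrete code indices produced by the quantizer are
what would be stored, which is where the compression comes from.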

arxiv autoencoder compression data physics
