Dec. 6, 2023, 6:22 p.m. | /u/SmeatSmeamen

Machine Learning www.reddit.com

So I have been reimplementing a VAE that has a recurrent latent distribution. After each new input, the latent distribution is updated by multiplying the encoder's output Gaussian with the current latent distribution (a precision-weighted Gaussian product).
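A minimal sketch of that precision-weighted update, assuming PyTorch and diagonal Gaussians (the tensor names and the toy loop are illustrative, not from my actual model):

```python
import torch

def gaussian_product(mu_a, var_a, mu_b, var_b, eps=1e-8):
    # The renormalized product of N(mu_a, var_a) and N(mu_b, var_b) is
    # again Gaussian: its precision is the sum of the two precisions and
    # its mean is the precision-weighted average of the two means.
    prec_a = 1.0 / (var_a + eps)
    prec_b = 1.0 / (var_b + eps)
    var = 1.0 / (prec_a + prec_b)
    mu = var * (prec_a * mu_a + prec_b * mu_b)
    return mu, var

# Toy recurrent update: start from a broad prior and fuse each new
# "encoder output" into the running latent belief (placeholder tensors,
# just to show the shape of the update).
mu_state, var_state = torch.zeros(4), 10.0 * torch.ones(4)
for mu_enc, var_enc in [(torch.randn(4), torch.ones(4)),
                        (torch.randn(4), 0.5 * torch.ones(4))]:
    mu_state, var_state = gaussian_product(mu_state, var_state, mu_enc, var_enc)
print(mu_state, var_state)
```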

The model doesn't learn a good representation, and I know that recent Dreamer models switched from continuous to categorical latent states, which improved performance, so I want to give this a shot.
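For reference, the usual way to make such a categorical latent trainable is to sample a one-hot vector and pass gradients "straight through" the softmax probabilities, roughly in the spirit of DreamerV2's discrete latents; the snippet below is only a sketch of that idea, not the paper's exact code:

```python
import torch
import torch.nn.functional as F

def sample_categorical_st(logits):
    # Hard one-hot sample on the forward pass, soft gradients on the
    # backward pass (straight-through estimator).
    probs = F.softmax(logits, dim=-1)
    idx = torch.multinomial(probs, num_samples=1).squeeze(-1)
    one_hot = F.one_hot(idx, logits.shape[-1]).float()
    return one_hot + probs - probs.detach()

logits = torch.randn(2, 32, requires_grad=True)  # batch of 2, 32 classes
z = sample_categorical_st(logits)                # one-hot latent samples
z.sum().backward()                               # gradients reach logits
```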

But there's no Gaussian multiplication equivalent for categorical distributions. I have decided to take …
