Oblivious subspace embeddings for compressed Tucker decompositions
June 14, 2024, 4:50 a.m. | Matthew Pietrosanu, Bei Jiang, Linglong Kong
Source: stat.ML updates on arXiv.org
Abstract: Emphasis in the tensor literature on random embeddings (tools for low-distortion dimension reduction) for the canonical polyadic (CP) tensor decomposition has left analogous results for the more expressive Tucker decomposition comparatively lacking. This work establishes general Johnson-Lindenstrauss (JL) type guarantees for the estimation of Tucker decompositions when an oblivious random embedding is applied along each mode. When these embeddings are drawn from a JL-optimal family, the decomposition can be estimated within $\varepsilon$ relative error under …
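The core operation the abstract describes, applying an oblivious (data-independent) random embedding along each mode of a tensor before estimating a Tucker decomposition, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian sketching matrices stand in for a JL-optimal family, and the tensor sizes and sketch dimensions are arbitrary choices for the example.

```python
import numpy as np

def mode_n_product(T, M, n):
    """Multiply tensor T by matrix M along mode n.

    Moves mode n to the front, applies M to the unfolded tensor,
    then moves the (now compressed) mode back into place.
    """
    T = np.moveaxis(T, n, 0)
    shape = T.shape
    out = M @ T.reshape(shape[0], -1)
    out = out.reshape((M.shape[0],) + shape[1:])
    return np.moveaxis(out, 0, n)

rng = np.random.default_rng(0)
T = rng.standard_normal((30, 40, 50))  # hypothetical data tensor

# One oblivious Gaussian JL embedding per mode, drawn independently
# of the data; rows are scaled so the embedding preserves norms in
# expectation. Sketch dimensions here are illustrative.
sketch_dims = (10, 12, 14)
S = [rng.standard_normal((m, d)) / np.sqrt(m)
     for m, d in zip(sketch_dims, T.shape)]

# Compress the tensor by sketching along each mode in turn.
Tc = T
for n, Sn in enumerate(S):
    Tc = mode_n_product(Tc, Sn, n)

print(Tc.shape)  # (10, 12, 14)
```

A Tucker decomposition fitted to the much smaller `Tc` is the kind of compressed estimate whose relative-error guarantees the paper establishes; with Gaussian embeddings (a JL-optimal family), those guarantees hold at the stated $\varepsilon$ level.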