June 28, 2022, 7:30 p.m. | Peter Barrett Bryan

Towards Data Science - Medium | towardsdatascience.com

An intuitive basis for understanding all things “eigen”

Motivation

We often want to transform our data to reduce the number of features while preserving as much variance (i.e., the differences among our samples) as we can. Often, you’ll hear folks refer to principal component analysis (PCA) and singular value decomposition (SVD), but we can’t appreciate how these methods work without first understanding what eigenvectors and eigenvalues are.
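To make the connection concrete before diving in, here is a minimal NumPy sketch (not from the original article) of the computation PCA relies on: eigen-decomposing the covariance matrix of some synthetic, made-up data. The eigenvectors are the directions of maximal variance and the eigenvalues measure how much variance lies along each direction.

```python
import numpy as np

# Toy data: 200 samples, 3 features (synthetic, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) @ np.array([[3.0, 0.0, 0.0],
                                          [1.0, 1.0, 0.0],
                                          [0.5, 0.2, 0.1]])

# Center the data and form the covariance matrix of the features.
X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)

# Eigen-decomposition of the (symmetric) covariance matrix:
# eigenvectors are the principal directions, eigenvalues the variance along each.
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Sort from largest to smallest eigenvalue.
order = np.argsort(eigenvalues)[::-1]
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

print("fraction of variance along each direction:",
      eigenvalues / eigenvalues.sum())
```

Keeping only the eigenvectors with the largest eigenvalues, and projecting the centered data onto them, is exactly the "fewer features, most of the variance" trade-off described above.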

Etymology

“Eigenvector” is a pretty weird word. As with many weird words (think …

Tags: computer science, data science, eigenvectors, machine learning, mathematics, python, understanding
