Eigenvalues and eigenvectors facts for kids
Linear algebra talks about types of functions called transformations. In that context, an eigenvector is a vector—different from the null vector—which does not change direction under the transformation (except that the transformation may flip the vector to the exact opposite direction). The vector may change its length, or shrink to zero ("null"). The eigenvalue is the factor by which the vector's length changes, and is typically denoted by the Greek letter λ (lambda). The word "eigen" is German and means "own" or "typical".
Basics
If A is a square matrix, λ a scalar, and v a nonzero vector, then λ is an eigenvalue and v an eigenvector of A if the following equation is satisfied:

A v = λ v

In other words, if multiplying the vector v by the matrix A gives the same result as multiplying v by the number λ, then λ is the eigenvalue belonging to the eigenvector v.
An eigenspace of A is the set of all eigenvectors that share the same eigenvalue, together with the zero vector. (The zero vector itself, however, is not an eigenvector.)
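The defining equation above can be checked directly on small matrices. Below is a minimal sketch in plain Python (no libraries); the 2 × 2 matrix and vectors are illustrative choices, not taken from this article:

```python
# Check whether a vector v is an eigenvector of a 2x2 matrix A
# with eigenvalue lam, i.e. whether A v = lam * v and v is nonzero.

def mat_vec(A, v):
    """Multiply a 2x2 matrix A by a 2-component vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

def is_eigenvector(A, v, lam, tol=1e-9):
    """Return True if A v equals lam * v (within tolerance) and v != 0."""
    if v == [0, 0]:
        return False  # the zero vector is excluded by definition
    Av = mat_vec(A, v)
    return all(abs(Av[i] - lam * v[i]) < tol for i in range(2))

A = [[2, 0],
     [0, 3]]  # diagonal matrix: its eigenvalues are 2 and 3

print(is_eigenvector(A, [1, 0], 2))  # True:  A*(1,0) = (2,0) = 2*(1,0)
print(is_eigenvector(A, [1, 1], 2))  # False: A*(1,1) = (2,3), not a multiple of (1,1)
```

For a diagonal matrix like this one, the eigenvectors are simply the coordinate axes, which makes the check easy to follow by hand.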
These ideas often are extended to more general situations, where scalars are elements of any field, vectors are elements of any vector space, and linear transformations may or may not be represented by matrix multiplication. For example, instead of real numbers, scalars may be complex numbers; instead of arrows, vectors may be functions or frequencies; instead of matrix multiplication, linear transformations may be operators such as the derivative from calculus. These are only a few of countless examples where eigenvectors and eigenvalues are important.
In cases like these, the idea of direction loses its ordinary meaning, and has a more abstract definition instead. But even in this case, if that abstract direction is unchanged by a given linear transformation, the prefix "eigen" is used, as in eigenfunction, eigenmode, eigenface, eigenstate, and eigenfrequency.
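As a concrete instance of the derivative example above, the exponential functions are eigenfunctions of the derivative operator (a standard calculus fact):

```latex
\frac{d}{dx}\, e^{\lambda x} = \lambda \, e^{\lambda x}
```

Here the function e^{λx} plays the role of the eigenvector (an "eigenfunction"), and λ is the corresponding eigenvalue.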
Eigenvalues and eigenvectors have many applications in both pure and applied mathematics. They are used in matrix factorization, quantum mechanics, facial recognition systems, and many other areas.
Example
For the 2 × 2 matrix

A = [ 1  2 ]
    [ 0  1 ]

the vector

v = [ 1 ]
    [ 0 ]

is an eigenvector with eigenvalue 1. Indeed, one can verify that

A v = [ 1·1 + 2·0 ] = [ 1 ] = 1 · v.
      [ 0·1 + 1·0 ]   [ 0 ]

On the other hand, the vector

x = [ 0 ]
    [ 1 ]

is not an eigenvector, since

A x = [ 2 ]
      [ 1 ]

and this vector is not a multiple of the original vector x.
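This kind of computation is easy to check numerically. Here is a short plain-Python sketch, assuming the shear matrix A = [[1, 2], [0, 1]] as an illustration:

```python
# Multiply the shear matrix A by two test vectors and compare the results.
A = [[1, 2],
     [0, 1]]

def mat_vec(A, v):
    """Multiply a 2x2 matrix A by a 2-component vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

v = [1, 0]
print(mat_vec(A, v))  # [1, 0] -> equals 1 * v, so v is an eigenvector with eigenvalue 1

x = [0, 1]
print(mat_vec(A, x))  # [2, 1] -> not a multiple of x, so x is not an eigenvector
```

A shear matrix like this one has only the single eigenvalue 1, and every eigenvector lies along the horizontal axis.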
Images for kids

The wavefunctions associated with the bound states of an electron in a hydrogen atom can be seen as the eigenvectors of the hydrogen atom Hamiltonian as well as of the angular momentum operator. They are associated with eigenvalues interpreted as their energies (increasing downward: n = 1, 2, 3, …) and angular momentum (increasing across: s, p, d, …). The illustration shows the square of the absolute value of the wavefunctions. Brighter areas correspond to higher probability density for a position measurement. The center of each figure is the atomic nucleus, a proton.

PCA of the multivariate Gaussian distribution centered at (1, 3) with a standard deviation of 3 in roughly the (0.878, 0.478) direction and of 1 in the orthogonal direction. The vectors shown are unit eigenvectors of the (symmetric, positive-semidefinite) covariance matrix scaled by the square root of the corresponding eigenvalue. (Just as in the one-dimensional case, the square root is taken because the standard deviation is more readily visualized than the variance.)