Introduction to Eigenvalues and Eigenvectors

GigaBrain scanned 164 comments to find you 74 relevant comments from 9 relevant discussions.

Sources

In a practical sense, what are eigenvalues/eigenvectors?
r/math • 1
Eigenvalues and Eigenvectors in Linear Algebra
r/learnmath • 2
What are the applications of eigenvectors and eigenvalues?
r/ElectricalEngineering • 3


What Redditors are Saying

TL;DR Eigenvalues and eigenvectors are fundamental concepts in linear algebra used to understand linear transformations. An eigenvector keeps its direction when the transformation is applied; its eigenvalue is the factor by which it is stretched or compressed.

Understanding Eigenvalues and Eigenvectors

Eigenvectors are vectors that only change in magnitude (not direction) when a linear transformation is applied. The scalar value by which they are stretched or compressed is called an eigenvalue [1:1]. For example, if a matrix A transforms a vector v into λv, then v is an eigenvector of A with eigenvalue λ [5:9].
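
To make this concrete, here is a minimal sketch assuming Python with NumPy; the matrix is an arbitrary illustration rather than one taken from the discussions below. It computes the eigenpairs of A and checks that Av = λv:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # arbitrary example matrix

eigenvalues, eigenvectors = np.linalg.eig(A)

v = eigenvectors[:, 0]              # columns of `eigenvectors` are eigenvectors of A
lam = eigenvalues[0]

print(np.allclose(A @ v, lam * v))  # True: A only rescales v by the factor lam
```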

Applications in Diagonalization and Simplification

One of the key applications of eigenvalues and eigenvectors is diagonalization. This process involves finding a basis of eigenvectors for a matrix and re-expressing it as a diagonal matrix, in which computations become much simpler [1:1]. The technique is widely used in physics, for example to simplify inertia-tensor problems in mechanics and many calculations in quantum mechanics [1:4][1:6].
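
A small sketch of what diagonalization buys computationally, again assuming NumPy and an arbitrary diagonalizable example matrix: once A = PDP^-1, powers of A only require powering the diagonal entries.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)       # columns of P are eigenvectors of A
D = np.diag(eigvals)                # diagonal matrix of eigenvalues

# A = P D P^-1, so A^k = P D^k P^-1, and D^k only needs the diagonal entries powered.
A5_direct = np.linalg.matrix_power(A, 5)
A5_eigen = P @ np.diag(eigvals**5) @ np.linalg.inv(P)
print(np.allclose(A5_direct, A5_eigen))   # True
```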

Data Analysis and Principal Component Analysis (PCA)

In data science, eigenvalues and eigenvectors play a crucial role in techniques like Principal Component Analysis (PCA). PCA uses eigenvectors to identify directions of maximum variance in high-dimensional data, effectively reducing dimensionality while preserving important information [1:10][1:11]. This application highlights how eigenvectors form a new orthogonal basis for data analysis [2:2].
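
A rough sketch of the PCA idea, assuming NumPy and synthetic two-dimensional data (the numbers are illustrative, not taken from any source thread): the eigenvectors of the covariance matrix are the principal directions, and the eigenvalues are the variances along them.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0],
                                          [0.0, 0.5]])   # data stretched along one axis

X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)        # 2x2 covariance matrix

eigvals, eigvecs = np.linalg.eigh(cov)        # eigh: the covariance matrix is symmetric
order = np.argsort(eigvals)[::-1]             # sort directions by decreasing variance
principal_axes = eigvecs[:, order]
explained_variance = eigvals[order]

X_reduced = X_centered @ principal_axes[:, :1]   # keep only the leading direction
print(explained_variance, X_reduced.shape)       # variances, and the (500, 1) projection
```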

Dynamical Systems and Control Theory

Eigenvalues are essential in analyzing dynamical systems, where they determine modes of response such as resonant frequencies and rates of expansion or decay [3:3]. In control systems, understanding eigenvalues helps in identifying stable points and system behavior under various conditions [5:10].
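
A tiny sketch of the stability idea, assuming NumPy and a made-up discrete-time system x_{k+1} = A x_k: the origin is stable when every eigenvalue of A lies strictly inside the unit circle.

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.0, 0.5]])              # hypothetical system matrix

eigvals = np.linalg.eigvals(A)
is_stable = bool(np.all(np.abs(eigvals) < 1))
print(eigvals, is_stable)               # all |eigenvalue| < 1, so states decay toward 0
```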

Visualizing and Conceptualizing Eigenvectors

Visualizing eigenvectors can be challenging, but they can be thought of as axes that remain fixed during transformations, akin to rotating a book around a pencil placed on its cover [3:2]. Videos and resources like 3Blue1Brown provide intuitive explanations and visualizations that can aid in understanding these concepts [2:1][5:5].

Eigenvalues and eigenvectors are powerful tools across mathematics and applied sciences, offering insights into system stability, data structure, and more. Understanding their properties and applications can greatly enhance problem-solving skills in various domains.


Source Threads

r/math • [1]


In a practical sense, what are eigenvalues/eigenvectors?

Posted by mk7driver · in r/math · 3 years ago
72 upvotes on reddit
12 replies
ORIGINAL POST

I'm fairly new to linear algebra, and I recently learned about eigenvectors and eigenvalues.

From my basic understanding, eigenvectors are vectors which change only by a scalar value (λ) after a linear transformation is applied to them. So if a linear transformation is applied to an entire subspace, there will be certain vectors in that subspace which do not move, but stay in place and only stretch by some scalar (λ), which could even be a scalar value of 1.

Firstly, I would like to ask, is my understanding of eigenvectors and eigenvalues correct?

And, if my understanding is correct, my follow up question would be: what are some practical scenarios where eigenvectors show up?

I've heard that eigenvectors show up often in areas such as quantum mechanics and electrical engineering, and I am curious in which instances would they show up. Like, what is the practical significance of a vector staying in place after a linear transformation is applied to it?

Thanks in advance to anyone who helps clear up my confusion.

12 replies
P
phdstruggs · 3 years ago

Maybe this is less practical but still an application of eigenvalues/vectors. If you have a data cloud ie multivariate data, you can compute a covariance matrix of the data.

That covariance matrix is positive definite so it will only have positive eigenvalues. If it’s full rank (which PD matrices are), you will have as many eigenvectors as your rank.

Since your eigenvectors are all orthogonal to each other, and you now have a new basis for your data cloud. Namely the one in which each basis vector is rotated in a way to maximise the variance along that basis vector. The variance along that one basis vector is your eigenvalue associated with that that particular basis eigenvector.

20 upvotes on reddit
R
Rioghasarig · 3 years ago

> That covariance matrix is positive definite so it will only have positive eigenvalues. If it’s full rank (which PD matrices are), you will have as many eigenvectors as your rank.

Just wanted to point out that the covariance matrix doesn't necessarily have to be full rank, technically. If the columns of your data are linearly dependent then the covariance matrix will not be full rank.

It doesn't have to be positive definite, it can be positive semi-definite.

6 upvotes on reddit
G
Geschichtsklitterung · 3 years ago

Quite practical, on the contrary: Principal Component Analysis, Karhunen-Loève Decomposition and all these dimensionality-reducing techniques so useful in data science, image processing, &c.

I use Mathematica's KL decomposition to convert my color pictures to B&W. 😎

3 upvotes on reddit
N
NewbornMuse · 3 years ago

For completeness' sake and anyone wishing to learn more: This technique is called Principal Component Analysis or PCA.

12 upvotes on reddit
[deleted] · 3 years ago

Let's see if I learned this correctly in my one semester of quantum mechanics [EDIT: I did not! See replies for the accurate answer]:

In the context of quantum mechanics, eigenvectors are the allowed solutions to Shroedinger's Equation for a given system. Shroedinger's Equation is a partial differential equation, so the eigenvectors are actually functions (and are thus called eigenfunctions). This is incredibly useful because it means we can use linear algebra techniques to attack problems in quantum mechanics.

5 upvotes on reddit
C
csappenf · 3 years ago

No. Schrodinger's equation is an equation, not an operator. It doesn't have eigenvalues/eigenvectors. We'll come back to this.

Quantum states themselves are (roughly speaking) vectors. Associated with each observable (something we can measure- position, momentum, energy, etc) is a Hermitian operator that we can apply to those vectors describing states. The states are vectors, so we can describe them using different bases. It turns out we can always write the state in an eigenbasis of, say, our position operator- all of the elements of our basis are eigenvectors of the position operator. When we do that, we say the state is in a superposition of position states. By the Born Rule, the coefficients of each basis vector are related to the probability of "observing" the thing who's state we are describing, in that position. We always "observe" eigenvalues (that's why our operators are Hermitian- so we know our eigenvalues are real.)

Back to where Schrodinger's equation comes in. Before we "observe" our thing, its state is evolving according to the SE. When we "observe" it, the state flips to the eigenvector associated with the eigenvalue we measured, rather than continuing to evolve according to SE. Why this happens is a mystery.

4 upvotes on reddit
A
Arcticcu · 3 years ago

One should be careful here, because in the Schrödinger equation the Hamiltonian can be infinite dimensional ("infinite matrix") and that complicates the meaning of the word "eigenvalue", making the whole business considerably more difficult mathematically. The idea still "just works", but a lot of assumptions are packed in to this line of thinking. See e.g. Quantum Measurement by Busch et al. to get a taste for the mathematical details if you're interested.

14 upvotes on reddit
B
BlueJaek · 3 years ago

As you probably know, one the best ways to understand a (finite dimensional) vector space is to view it as a collection of basis vectors. Now when we want to study the effects of a linear map from that vector space onto itself, it follows that the best way is to understand the linear map is to understand the effect of the linear map on the set of basis vectors. But, if you just have an arbitrary set of basis vectors, then the linear map will map each basis into a linear combination of all the basis vectors, i.e. Ae_j = a_1j*e_1 + a_2j*e_2 + ... +a_nj*e_n (p.s. you can follow this line of thinking to show all finite dimensional linear maps correspond to a matrix). While this can be sufficient for some purpose, the introduction of arbitrary coefficients can make it too difficult to understand certain behaviors. However, if you have a basis consisting of all genuine eigenvectors, then the linear map takes each basis vector to itself, i.e. Av_j = r_j v_j (where r_j is the eigenvalue). Now, if we take v to be some arbitrary vector, we have that v = c_1*v_ + ... + c_n*v_n and so Av = r_1c_1*v_1 + .... + r_nc_n*v_n (this is the same thing as diagonalization). One common application of this is that now we can clearly see that if each |r_j| < 1, then repeatedly application of A to any vector will tend towards zero, i.e. A^n*v tends to 0 as n tend to infinity.

24 upvotes on reddit
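
A quick numerical check of the decay claim in the comment above, assuming NumPy; the matrix is an arbitrary example whose eigenvalues (0.6 and 0.3) both satisfy |r_j| < 1:

```python
import numpy as np

A = np.array([[0.5, 0.1],
              [0.2, 0.4]])              # eigenvalues 0.6 and 0.3, both inside the unit circle
v = np.array([3.0, -7.0])

for n in (1, 5, 20, 50):
    print(n, np.linalg.norm(np.linalg.matrix_power(A, n) @ v))
# The norms shrink toward zero as n grows, as the comment predicts.
```
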
0
0_69314718056 · 3 years ago

I completely skipped over all of the notation and this was still the best explained comment I’ve seen. Thanks

Note: my skipping notation is in no way a reflection of anything other than how tired I am

6 upvotes on reddit
K
kieransquared1 · 3 years ago

Your understanding of eigenvectors seems solid to me. Probably the most concrete and useful application of them is in diagonalization: this is where you come up with a basis consisting of eigenvectors such that your linear transformation is a diagonal matrix with respect to that basis. Many problems in the sciences become a lot easier and/or clearer with this technique. For example, the inertial tensor is a 3x3 matrix describing a rigid body’s resistance to rotation about each axis. You’ll typically have off-diagonal terms which complicate things, but if you diagonalize it, the eigenvectors will be the principal axes - which often correspond to symmetries of the object - and the eigenvalues will be the rotational inertia about each of the principal axes.

47 upvotes on reddit
S
SpaceSpheres108 · 3 years ago

To add to this, diagonalization makes it far easier to compute powers of a matrix quickly. This allows you to (for example) easily find the nth term of some recursively defined sequences, like the Fibonacci sequence.

9 upvotes on reddit
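
A short sketch of that Fibonacci example, assuming NumPy; the matrix [[1, 1], [1, 0]] is the standard matrix form of the Fibonacci recurrence, and its diagonalization makes the n-th power cheap:

```python
import numpy as np

F = np.array([[1.0, 1.0],
              [1.0, 0.0]])                          # powers of F contain Fibonacci numbers

eigvals, P = np.linalg.eig(F)
n = 10
Fn = P @ np.diag(eigvals**n) @ np.linalg.inv(P)     # F^n via the eigendecomposition
print(round(Fn[0, 1]))                              # 55, the 10th Fibonacci number
```
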
G
gigadude · 3 years ago

Check out 3blue1brown. Finding the axis of rotation for a matrix is an example use-case.

10 upvotes on reddit
r/learnmath • [2]


Eigenvalues and Eigenvectors in Linear Algebra

Posted by MathNerd93 · in r/learnmath · 5 years ago

I know how to find them, but I have no idea what information they give, if that makes sense. I know that if A is any nxn matrix, then Ax=λx but for example if I know that some matrix has λ=2 , what does that tell me and why is it important?

2 upvotes on reddit
4 replies
Rotsike6 · 5 years ago

If you have an isomorphism of vector spaces expressed in matrix form, eigenvectors form a basis for both spaces, which is extremely useful.

1 upvotes on reddit
MathNerd93 · OP · 5 years ago

What is it useful for?

1 upvotes on reddit
Rotsike6 · 5 years ago

Take a differential equation: D(f)=0

For some operator D, e.g. D = d^2/dt^2

Now take a basis for your domain, e.g. sine functions (the basis depends on the boundary conditions of your differential equation). Notice that the sine functions are eigenvectors of d^2/dt^2. Now, if you have some initial conditions, you can expand them in this eigenbasis and transform each component accordingly. d^2/dt^2 is a rather easy operator, but google "Sturm-Liouville problem" and you will see it can be done for different operators as well.

If you go into quantum mechanics, operators working on wave functions will collapse the wave function into an eigenvector. This means that, when doing quantum mechanics, everything is described as superpositions of eigenvectors of operators.

These are just two examples that come to mind for me, there are a lot more. If you go deeper into mathematics/physics you will notice their full power.

1 upvotes on reddit
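
A numerical illustration of the point above, under the assumption that d^2/dt^2 is discretized with finite differences and zero boundary conditions (NumPy): the eigenvectors of the resulting matrix are sampled sine functions.

```python
import numpy as np

N = 100
h = 1.0 / (N + 1)
# Discrete second derivative with zero (Dirichlet) boundary conditions
D2 = (np.diag(-2.0 * np.ones(N)) +
      np.diag(np.ones(N - 1), 1) +
      np.diag(np.ones(N - 1), -1)) / h**2

eigvals, eigvecs = np.linalg.eigh(D2)      # eigenvalues ascend; all are negative here

t = np.linspace(h, 1 - h, N)
mode = eigvecs[:, -1]                      # eigenvector of the least-negative eigenvalue
mode = mode * np.sign(mode[N // 2])        # fix the arbitrary overall sign
sine = np.sin(np.pi * t)                   # the fundamental sine mode on (0, 1)

print(np.allclose(mode / np.linalg.norm(mode), sine / np.linalg.norm(sine)))  # True
```
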
_messyminded · 5 years ago

So I haven't watched his videos in ages, but 3Blue1Brown has made some amazingly helpful videos in my opinion.

Try this one if you have the time https://www.youtube.com/watch?v=PFDu9oVAE-g

If not, off the top of my head, the eigenvectors of a linear transformation are precisely the vectors that when transformed just get scaled (by a scale factor of the corresponding eigenvalue). They are of interest because of this reason... But they turn out to be important in a lot of ways. Diagonalisation jumps to the front of my mind, which can be used to find powers of a linear transformation easily, for instance. Apologies for not having anything more to give at this time.

2 upvotes on reddit
r/ElectricalEngineering • [3]


What are the applications of eigenvectors and eigenvalues?

Posted by bossdaddo · in r/ElectricalEngineering · 2 years ago

While doing the linear algebra portion of my math classes this year, the concept that stuck out to me the most were eigenvalues and eigenvectors. A big reason as to why it stuck out to me was because I can't really think of any application for it. Like for instance, I can see how matrices and their operations are useful to set up simultaneous equations for circuit analysis, but what are eigenvectors and eigenvalues useful for?

Thanks in advance.

12 upvotes on reddit
8 replies
L
LiveAndDirwrecked · 2 years ago

The way I thought about it was: picture the cover of a book lying flat. Now put a pencil on the book cover anywhere. Now rotate that book about the axis that is the pencil's orientation.

The cover is a matrix with coordinates. As you rotate the book, all the coordinates of the matrix change except for the axis that your pencil was on. Those coordinates stay the same. So you can think of that vector that is the axis of your book rotating as the eigenvector.

When you perform an operation on your matrix (a rotation), some values are not affected.

27 upvotes on reddit
person930 · 2 years ago

Do matrix A and the transpose of matrix A have the same eigenvectors?

2 upvotes on reddit
I
InstAndControl · 2 years ago

Probably only if the eigen-vector passes through the origin or something like that?

1 upvotes on reddit
bossdaddo · OP · 2 years ago

Wow that's a really good explanation, thanks

1 upvotes on reddit
blakehannaford · 2 years ago

Another application is in dynamical systems. Eigenvalues of the system matrix determine modes of response like resonant frequencies (complex eigenvalues), and exponential rates of expansion or decay (real eigenvalues). The eigenvectors determine which state variables (e.g velocities of individual masses) are involved in the mode associated with each eigenvalue.

5 upvotes on reddit
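
A minimal sketch of the "complex eigenvalues are resonant frequencies" point, using an assumed undamped oscillator x'' = -omega^2 x written in state-space form (NumPy):

```python
import numpy as np

omega = 2.0 * np.pi * 1.5              # assumed natural frequency of 1.5 Hz
A = np.array([[0.0, 1.0],
              [-omega**2, 0.0]])       # state vector is [position, velocity]

eigvals = np.linalg.eigvals(A)
print(eigvals)                             # purely imaginary pair +/- i*omega: no decay
print(np.abs(eigvals.imag) / (2 * np.pi))  # recovers the 1.5 Hz resonant frequency
```
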
Q
Quatro_Leches · 2 years ago

Really useful for control systems, but also a lot of other things. One application there is the state transition matrix for dynamical systems.

12 upvotes on reddit
N0RMAL_WITH_A_JOB · 2 years ago

These are the solutions of numerous problems. Very important and fundamental. Control systems, communications, information theory…

They are answers.

1 upvotes on reddit
douggery · 2 years ago

Eigenvectors are the 'coordinate system' of a given matrix and the eigenvalues are the 'magnitudes' of each of the components of the coordinate system.

This helps me think about how the system has a natural decomposition into orthogonal parts so that you can change one of the parts without changing the others. Each eigenvalue is associated with an eigenvector and the largest eigenvalue-eigenvector pair has the maximum system output for a given input.

In a more abstract sense: the decomposition of the system into its base parts allows one to keep track of the minimum information of the system to understand its deterministic performance. This is generally useful in higher dimensions such as in principal component analysis, which is able to remove low-eigenvalue components while retaining the majority of the model accuracy.

6 upvotes on reddit
r/AskPhysics • [4]


Eigenvectors and the essence of "eigen-things"

Posted by freefall_ragdoll · in r/AskPhysics · 2 years ago

I'd like to get an intuitive understanding of the real-world use of eigenvalues and eigenvectors and I also want to "spot" them in real life problems. Despite having read a lot of articles, reddit posts and having seen a lot of videos, I still have questions, because the usefulness of eigenstuff still does not viscerally "click" for me.

I understand how to blindly compute the eigenvectors of a matrix A and that they are the set of vectors which do not change direction when a linear transform (represented by a matrix A) is applied on them, but I have not internalized the usefulness of this property in real world. I want to be able to understand what makes this property so useful. I can see that they are a special property of linear transformation, but I'm unable to appreciate it.

What I don't understand is:

  1. Why is it even natural for linear transformations to have eigenvectors? What property of linear transformations do eigenvectors stem from? Are they some emergent (or latent, hidden) property of the constraints that define the whole class of linear transforms (and what is even a formal, algebraic definition the class of linear transforms?)? I guess I am looking for a similar derivation to how you can discover the structure of rotation matrices only from the definition of the class of rotation matrices (det(R) = 1 and RR^T = I) by taking the time derivative of those constraints.
  2. Can a NxN matrix have more than N eigenvectors with different directions and eigenvalues not equal to each other? Uniform-scaling matrices seem to have infinite number of eigenvectors in different directions, but the same value, which is equal to the scaling factor. But what about other matrices? Eigenvectors are described as the directions along which space stretches, so why a 2x2 matrix cannot have more than 2 unique eigenvectors? Is it because there is not enough degrees of freedom in a 2x2 matrix to encode more than 2 unique directions of stretching? Would we need an MxM matrix of rank N if we wanted to encode M unique eigenvectors into a linear transformation acting in an N-dimensional (sub)space? I can compose K 2x2 matrices, each of which stretches the space in a different direction, so why doesn't the final composed matrix have K unique eigenvalues?
  3. So far, I only saw eigenvectors and eigenvalues being talked about in the context of linear transformations, but are there any "eigen-things" related to non-linear transformations? Does it even make sense to talk about eigen-things outside of linear transforms?
  4. Complex eigenvectors and eigenvalues for rotation matrices (or at least for matrices containing "a bit" of rotation). What is the rationale? How can we visualize the complex eigenvectors of a 2x2 rotation matrix? Can I draw a diagram where I could see those complex eigenvectors? How did complex eigenvectors come to be? Are they just the result of algebraic symbol manipulation or do they have any tangible (visualizable) meaning?
  5. Are there any other representations of linear transformations, besides 2D matrices, in which eigenvectors are immediately visible? One can see the columns of a matrix as the basis vectors of a different coordinate system. Is there any structure other than a matrix (or a different interpretation of matrices) in which eigenvectors can be observed naturally?
  6. [Slightly off-topic question] What made you intuitively understand the usefulness of eigenvectors and their occurrence in everyday situations?


Can you please help me find eigenvectors in these problems (if they are even there any)?

  1. Tearing a piece of paper: does the path along which a paper gets torn have something to do with eigenvectors?
  2. Breaking a piece of glass: do eigenvectors have something to do with the lines along which the glass broken into pieces?
  3. Pushing toothpaste out of its casing: are eigenvectors somehow related to the direction along which the toothpaste gets pushed out of the opening?


I'll be happy for any pointers or hints. Thank you!

39 upvotes on reddit
12 replies
D
drzowie · 2 years ago
  1. Eigenvectors/eigenvalues are a property of linear transformations (aka matrices), with application to the real world.

  2. No.

  3. Yes, there are generalizations of eigenthings to other kinds of transformation. But physics is all about linear things wherever possible, so understanding linear eigenthings is very important — nonlinear extensions less so.

  4. Others talked about this — but complex eigenvectors/eigenvalues exist for the same reason complex numbers do — to create a complete algebra (with solutions to all polynomials) you have to have complex numbers. Intuitively, complex numbers are all about rotations, so they “should” be connected to things that are rotation-like.

  5. Yes, see my example below.

  6. Normal mode theory, which is important to acoustics and mechanical engineering but has sadly fallen out of the standard undergrad physics curriculum.

Eigenvectors are easiest (for me) to understand in terms of resonant modes of physical objects. If you thunk a solid object (such as, say, a chunk of metal), it will respond with a very complex motion that is due to each part of the object following “f=ma=-kDx” (where “D” means “delta from current equilibrium position given all the other locations of all the other bits of the solid). That motion is complex and weird (in general) - think of how jello responds to being poked. If you take the vectors a and x, index them for all the pieces of solid, and make new vectors A and X that describe the complete state of the entire solid (and that have 3N dimensions), then you’re beginning to cook with gas. You can notice that K is a 3N by 3N dimensional matrix, and also notice that it has eigenvalues and Eigenvectors — so there are particular deformations of the solid (any solid, mind you) that act as oscillators — and by the completeness theorem, any deformation can be written as a sum of those special oscillating deformations!

That is gold, because it lets you solve for all sorts of motion of the solid without actually tracking how the jello-wobbles move around through it. You can separate the 3Nx3N problem into 3N separate 1-D linear problems that are separable one from the other. Those Eigenvectors are called “normal modes”.

The most obvious everyday experience of that is the bell-like sounds you can get from thunking many everyday objects. Bells, in particular, work by having a few Eigenvectors with similar eigenvalues, all widely separated in frequency from other Eigenvectors.

The air inside a flute or an ocarina works the same way: the instrument itself provides a boundary condition, and particular shapes of air deformation inside the instrument act as independent oscillators, which are well separated in frequency from other normal modes of the instrument.

So every time you hear music you are hearing an immediate practical application of Eigenvector theory.

Apologies for any typos — am in a phone at the moment.

11 upvotes on reddit
F
freefall_ragdoll · OP · 2 years ago

Those are really beautiful examples, I wish I could upvote more than once!

Do I understand correctly that it is enough to model the result of thunking of a piece of metal (or any other object) just by describing the local motion of the metal's atoms via linear transformations on micro-scale, combine all those linear equations (or displacements, accelerations, work, etc.?) that describe local motion, into a matrix and find the macro-scale effect by finding the eigenvectors and eigenvalues?

So the eigenvectors (together with eigenvalues) of a system of linear equations (?) describe the macro-scale ("big picture", resultant) effect of the combination (superposition) of each small force, that would otherwise not be obvious by simply looking at the big 3Nx3N matrix?

I feel like I'm close to an intuitive understanding, but I'm only missing a few concrete details (e.g. the exact physical quantities/units of the vectors/matrices you described): can I ask for more details please?

  1. "f=ma=-kDx": the "-kDx" term seems similar to Hooke's law F=-kx, where k is stiffness of the spring and x is the displacement vector. Why is there the "D" in the expression "f=ma=-kDx"? It feels redundant, because I would think that the vector x already means "displacement from equilibrium position", but I guess the D has something to do with the statement "…given all the other locations of all the other bits of the solid". What is D? Is it a matrix or a scalar?
  2. "You can notice that K is a 3N by 3N dimensional matrix,…": what exactly does the K matrix hold and how is it constructed (is it a well-known matrix in normal mode theory?)? My first guess would be K = A^(t)X, but I'm not sure about units (the units seem to be related to the work, but not quite?). For example, what does the element of K at row 3 and column 5 represent?


…But physics is all about linear things wherever possible…

Does this have something to do with local linearization of problems? Where can I learn more about this? It's fascinating, thanks for the answer, I really appreciate it!

1 upvotes on reddit
D
drzowie · 2 years ago

Hi, sorry I didn't see this right away.

Yes, you are exactly right about "it is enough to model the result of thunking of a piece of metal (or any other object) just by describing the local motion of the metal's atoms via linear transformations on microscale...".

The idea is that, near the static/equilibrium case, the object acts like a large collection of coupled, weighted springs. So in your (1) you noticed that I wrote down a vectorized version of Hooke's Law. The D part was because I was on my phone and couldn't write a Δ. I glossed over a step there, which turned out to be ill-advised since it confused you: if you're considering the motion of one little piece of stuff in the metal, then the "k" is really "k", a 3x3 matrix describing the resultant springlike force from displacement in a particular direction. k is normally diagonal, and often scalar -- but it can in principle be off-diagonal. Generalizing that to describing the whole system (your (2)), K is a 3N x 3N matrix, describing spring constants between every possible pair of little pieces of your overall solid. Each of the 9N^2 elements of K describes a notional spring connecting two parts of the solid. K is generally "block-diagonal" in the sense that distant pieces don't couple to each other directly but nearby ones do -- so most of the elements are 0. But it's not fully diagonal, in the sense that displacing a piece of the object applies force to other pieces of the object, as well as to itself. K has the same units as any other spring constant -- force per unit length.

So understanding the motion of a block of Jello or steel or anything else can be reduced to a single spring problem, with a humongous matrix in it instead of a scalar spring constant.

The eigenvectors of that humongous matrix are the "normal modes" of the system, and (to come back to the top), Yes, you can treat the system as a collection of completely independent oscillators, each of which describes displacements in the particular shape that is described by a single eigenvector of the K matrix. There are some neat demos of drumhead modes on Wikipedia, but every solid object can be treated the same way. For example, if you thunk one end of a tubular-bell wind chime you can imagine a displacement wave moving down the length of the chime and bouncing off the other end; or you can find the mix of standing-wave eigenmodes you excited when you hit the chime, and consider those modes interfering with each other as they each oscillate independently. Those two descriptions yield exactly the same motion of the wind chime.

2 upvotes on reddit
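
A very small sketch of this normal-mode picture, assuming NumPy and a toy chain of unit masses joined by unit springs between fixed walls (so K is a small tridiagonal stiffness matrix rather than the full 3N x 3N one described above):

```python
import numpy as np

N = 5
K = 2.0 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)   # toy stiffness matrix

eigvals, modes = np.linalg.eigh(K)      # K is symmetric, so eigh applies
frequencies = np.sqrt(eigvals)          # x'' = -K x: each mode oscillates at sqrt(eigenvalue)

for f, shape in zip(frequencies, modes.T):
    print(f"frequency {f:.3f}  mode shape {np.round(shape, 2)}")
```
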
J
joshsoup · 2 years ago

These are all great questions. Have you seen 3b1b's video series on linear algebra? Many of these questions are answered there while accompanied by visualizations. It's a great supplement to classes and fills in a lot of conceptual holes that might be missed when first learning a subject. https://youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab

12 upvotes on reddit
F
freefall_ragdoll · OP · 2 years ago

Thank you, I have seen Grant's excellent tutorials on linear algebra (and almost all of his other great videos too). All of them were enlightening, except after watching the video about eigenvectors, I couldn't confidently say I really understood the topic to the point where I could see its usefulness in practice.

To put it in other words, I understood the contents of his eigenvectors video, but I did not internalize the concept of eigenvectors to the point where I could come up with them by myself and derive all of their properties (which has never happened to me after watching any other of Grant's videos on other topics).

Even after rewatching the whole 3b1b eigenvectors video, I could not find answer to any of my 5 questions mentioned above.

2 upvotes on reddit
J
joshsoup · 2 years ago

Sounds like you're on the right track.

I'll discuss linear operators and see if that helps make things click more. For the purposes here, we'll have our linear operator take in a vector and spit out a vector in the same space. This doesn't necessarily have to be the case, you can generalize the concept of linear operators, but let's just keep it simple. Let's look at a 2d vector space.

We can visualize an operator as how points move when put under this operation. So with every point you can either imagine it physically moving to its new location, or you can imagine an arrow starting from its point and stopping on the location where the operator maps it to. So you can imagine an infinite number of arrows, each starting at every possible point in the plane, all pointing to some other (or possibly the same) point.

However, for a linear operator, you can't just have this operation map points arbitrarily. A linear operator, L, follows two rules:

  1. L(av) = aL(v)

  2. L(v+u) = L(v) + L(u)

where v and u are vector and a is a scalar and L of course is our linear operator.

Now the first rule already helps to reduce all the possible functions. Every point in a straight line that passes through the origin will map to another (possibly the same) straight line that passes through the origin (unless it happens to map to the zero vector, in which case each point will map to the zero vector). Moreover this mapping has to be smooth - i.e. as you vary the input vector, the output vector can't make any big jumps. It's also trivial to prove that every point on the line we are mapping to is visited by some point in the input line.

With this observation that a line has to map to a line, we can simplify our visualisation if we decide we don't care about the particulars of where on that line we are mapping to. Instead we can imagine every possible angle from 0 to 180 degrees defines a unique line. Each line will map to a line with a different (or the same) degree.

I assert now, that there are 4 possibilities.

  1. Every line maps to itself. This corresponds to any multiple of the identity operator. There are infinite eigenvectors that share one eigenvalue.

  2. No line maps to itself. This corresponds to a rotation. In this case eigenvalues are complex.

  3. One line maps to itself, the others do not. This corresponds to a repeated eigenvalue, but in this case it applies to one eigenvector, not all.

  4. Two different lines map to themselves, this corresponds to two different eigenvalues.

You might wonder why you can't have more eigenvectors than two. Here is a simple proof.

By way of contradiction assume there are at least 3 unique eigenvectors but not infinitely many. Now you can always write one eigenvector as the linear combination of the other two. Let's choose these vectors A, B, and C with corresponding eigenvalues a, b, and c such that C = A + B. Therefore we have:

L(C) = L(A + B)

cC = L(A) + L(B)

cC = aA + bB

Now since we already have C = A + B, the only way that last equation above is true is if a=b=c. You might want to visualize that geometrically. But in that case, we will then have infinitely many eigenvectors, since any arbitrary vector V can be written as a linear combination of A and B:

v = xA + yB

L(v) = L(xA + yB)

L(v) = axA + ayB

L(v) = a(xA + yB)

L(v) = aV.

This can be generalized to multiple dimensions.

As an interesting aside, when you have two coupled harmonic oscillators with the same resonance you can view their phases as a 2d vector space. You can then create a linear operator acting on the phases using the equations of motion. This system has two eigenvectors: one where the two oscillators are in phase, and one where the oscillators are perfectly out of phase. Therefore, oscillators in these initial conditions will remain in those conditions forever. But if they are in a different initial condition (such as one oscillator unexcited while the other is) then they will change over time. Look up coupled oscillators with linear algebra if interested. Steve Mould has a couple videos where he shows these, but doesn't get into the math. https://youtu.be/MUJmKl7QfDU

2 upvotes on reddit
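
A short check of that coupled-oscillator aside, with assumed spring constants (NumPy): the stiffness matrix of two identical oscillators joined by a coupling spring has eigenvectors proportional to [1, 1] (in phase) and [1, -1] (out of phase).

```python
import numpy as np

k, kc = 1.0, 0.3                        # assumed spring constant and coupling constant
K = np.array([[k + kc, -kc],
              [-kc, k + kc]])

eigvals, eigvecs = np.linalg.eigh(K)
print(eigvecs)                          # columns proportional to [1, 1] and [1, -1]
print(np.sqrt(eigvals))                 # the two normal-mode frequencies
```
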
R
RicciBoson · 2 years ago

For 4, in terms of real spaces the answer might help you understand the point of an eigen vector.

An eigen vector is a vector that stays "in place" during a transform. Only stretching or shrinking, but never being mapped onto another line. When you have complex eigen values, hence complex eigen vectors, there is NO vector which behaves this way under the transformation. Exactly analogous to how having complex solutions to x^2 + 1 =0 means there is NO real number that satisfies the equation. This makes sense intuitively for rotation matrices, because their job is to rotate the entire space around the origin by some angle. This means no vector can lay along its same original path, and so there had better not be any real eigen values.

18 upvotes on reddit
F
freefall_ragdoll · OP · 2 years ago

I am comfortable with visualizing complex numbers in the 2D plane and I accept the fact that we invented the imaginary unit i as part of algebra, with the visual meaning of adding a second numerical axis. I am comfortable with multiplication of a complex number by the imaginary unit being equal to rotation by 90°, because I can visually verify that the algebraic operation of multiplication equals multiplication of scaled sine and cosine values. However, the problem for me is that I am unable to similarly visualize what a complex-valued matrix or vector means visually. And when I cannot derive the algebra from geometry, it means I do not understand the topic.

I also understand that it is intuitive that there are no real eigenvectors for 2D rotation matrices (per your reasoning). However, what does not make sense for me visually, is where would you draw the complex-valued eigenvector in a 2D diagram? Would the complex value (the imaginary unit) protrude into the 3rd dimension, being orthogonal to the 2D plane (this would make sense, because the 2D plane does rotate along this orthogonal axis)?

I understand that complex numbers in eigenvectors and eigenvalues arise from finding the root of the characteristic polynomial, but this is just unintuitive algebra with no visual intuition. How can I visualize, draw and point my finger to the complex-valued vectors and complex-valued eigenvalues?

3 upvotes on reddit
L
LemonLimeNinja · 2 years ago

The complex '2D' plane is really a 4 dimensional space. We need real and imaginary numbers for both inputs and outputs. When you draw the complex 2D plane you're implicitly mapping the complex inputs to outputs on the same plane.

How can I visualize, draw and point my finger to the complex-valued vectors and complex-valued eigenvalues?

You can think of it as a vector embedded in a 4 dimensional space

4 upvotes on reddit
F
FoolishChemist · 2 years ago

I'd suggest the 3blue1brown video

https://www.youtube.com/watch?v=PFDu9oVAE-g

Also there is the whole Linear Algebra Essentials

https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab

7 upvotes on reddit
F
freefall_ragdoll · OP · 2 years ago

Thanks, I have seen all of Grant's insightful videos on linear algebra, but eigenvectors were the only single one that I could not fully and confidently grasp. I have rewatched the video on eigenvectors before posting this question, just to make sure I didn't miss a potential explanation of my questions, but I could not find answer any of my 5 questions.

For example, he does not explain the visual meaning of complex eigenvectors (he only very briefly brushes on complex eigenvalues and presents them as solutions of the characteristic polynomial): https://youtu.be/PFDu9oVAE-g?t=650

5 upvotes on reddit
I
inventiveEngineering · 2 years ago

3blue1brown's videos are mentioned here in some comments. Yet they still do not answer OP's basic question about the real-life implications of eigen-things. 3b1b's videos are great, but there is still an explanatory gap in regard to solving everyday problems.

For me as a structural engineer, I can say that eigenvalues are used to determine so-called mode shapes of structures under dynamic loading. Applying a mathematical analysis to a structure under dynamic loading, e.g. horizontal soil movements (earthquake), allows you to determine the weak points of your structure and thus reinforce those critical points with additional reinforcement or structural steel.

Another application of eigenvalues in structural engineering is buckling and stability calculations.

13 upvotes on reddit
r/explainlikeimfive • [5]


ELI5: What are Eigenvalues and Eigenvectors? What are they useful for?

Posted by [deleted] · in r/explainlikeimfive · 5 years ago

My linear algebra professor did a horrible job of explaining this in class and I’m very curious. Please help.

70 upvotes on reddit
12 replies
UntangledQubit · 5 years ago

Depends on what your vectors are, and what your matrix is doing to them.

In quantum mechanics, vectors are possible states of a particle. The matrices represent observation of a classical value (e.g. energy). Eigenvectors are thus the special states where the particle has a single well-defined value, while eigenvalues are the corresponding values.

In population dynamics, vectors are the population of various species, and matrices represent how those populations change over time. Eigenvectors are thus special mixtures of populations - either ones that the environment tends to over time, or tends away from over time. Eigenvalues tell you which it is, and how quickly it happens.

In the context of a covariance matrix, the eigenvectors give you the principal directions in which your dataset varies, and the eigenvalues tell you the degree of that variance. This is called PCA.

There are more, but hopefully this gives you an idea of how varied it is. A lot of things are linear, and when they're not we figure out how to linearize them, because the tools of linear algebra are just so powerful.

12 upvotes on reddit
L
LionSuneater · 5 years ago

Eigenvalues often tell you some characteristic properties of the system you've mathematically described. The associated eigenvector is a typical mode of that system. That is, the eigenvector is a fundamental way that system can behave. The true behavior of a system is then a combination of modes. That's really abstract, but that's why it pops up everywhere.

I'll give some physics examples, but linear algebra is applicable in many other domains. For a classical physics example, consider hitting a drum head. It'll vibrate at different frequencies (described by eigenvalues) in different vibrational patterns (described by eigenvectors). Most likely, though, the drum will be vibrating in some combination of these patterns. For a quantum physics example, an electron can be in one of many energy levels (described by eigenvectors) each corresponding to different energies (described by eigenvalues).

3 upvotes on reddit
snkn179 · 5 years ago

Matrices are like functions, you input a vector and you get out another vector. When you look at all possible vectors in space, you can think of a matrix as just something which is able to stretch and skew space, like in this diagram. Eigenvectors are just vectors that point along the same direction after being applied to the matrix. If you have a vector like [1, 2] and after you apply the matrix, it becomes [2, 4], then [1, 2] is an eigenvector since [2, 4] points in the same direction as [1, 2] (it's just double as long). If you find one vector that is an eigenvector, that means that all vectors along that same line are also eigenvectors as they're just scaled copies of the original eigenvector. This means that [2, 4] would also be an eigenvector, as would [5, 10] or [-3, -6], etc.

Eigenvalues just tell you how much an eigenvector is scaled when applied to a matrix. In our previous example, the eigenvector [1, 2] becomes stretched to [2, 4] which is a doubling in size, so the eigenvalue would be 2.

147 upvotes on reddit
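
A quick check of the [1, 2] -> [2, 4] example, assuming NumPy and an illustrative matrix (not taken from the comment) that happens to have [1, 2] as an eigenvector with eigenvalue 2:

```python
import numpy as np

M = np.array([[0.0, 1.0],
              [-2.0, 3.0]])     # illustrative matrix with eigenvalues 1 and 2
v = np.array([1.0, 2.0])

print(M @ v)                    # [2. 4.]: same direction as v, doubled in length
print(M @ (5 * v))              # [10. 20.]: scaled copies of v are eigenvectors too
```
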
harryham1 · 5 years ago

Is there any case scenario where, say [1, 2] returns [2, 4], but [2, 4] returns (as an example) [3, 3]?

Either way, what determines [1, 2] to be the relative vector for the eigenvalue? Is it like fractions, where you simplify to the smallest integers?

3 upvotes on reddit
[deleted] · 5 years ago

Usually one picks a convenient eigenvector, where convenient depends on what you want to do with it. Saying that there is an eigenvector for each eigenvalue is somewhat simplified: every multiple of an eigenvector is also an eigenvector. That means every eigenvalue has infinitely many eigenvectors. In more advanced mathematics we say that eigenvalues have associated eigenspaces.

The two standard ways are choosing the one with smallest positive coordinates that are still integers (in this example [1,2]) or choosing an eigenvector with length 1 (in this case [1/√5,2/√5]). You would also typically prefer an eigenvector with a positive first coordinate (so [1,2] is preferred over [-1,-2]).

1 upvotes on reddit
H
hippomancy · 5 years ago

So all of this math is about linear functions. That means f(ax)=af(x) for each vector x and scalar a. In English, if you stretch the input, you have to stretch the output by the same amount. You can make a function which works like that, it just wouldn’t be linear, and wouldn’t have eigenvectors in this sense.

1 upvotes on reddit
snkn179 · 5 years ago

All vectors along that line would have to be eigenvectors with the same eigenvalue. Let's use the previous example with [1, 2] as an eigenvector of a matrix (let's call M) with an eigenvalue of 2. This means that M × [1, 2] = 2 × [1, 2].

What happens when the vector we apply to M is a multiple of [1, 2]? (i.e. of the form k[1, 2])

M × (k[1, 2]) = k × (M × [1, 2])

= k × 2 × [1, 2]

= 2 × k[1, 2]

As can be seen, all vectors of the form k[1, 2] are scaled up by the eigenvalue two when the matrix is applied to them.

3 upvotes on reddit
Y
yammeringfistsofham · 5 years ago

I never really understood it at University either - my maths lecturer was terrible and I never understood a lot of things.

Now, at nearly 40 I'm picking up a lot of the pieces because of the 3blue1brown YouTube channel and Khan Academy.

This video really explains eigenvalues and eigenvectors in a way that got through to me:

https://youtu.be/PFDu9oVAE-g

I hope it works for you too...

17 upvotes on reddit
snkn179 · 5 years ago

Matrices and eigenvectors are actually fairly powerful tools in various areas of applied maths. Check out this video for some examples.

3 upvotes on reddit
1983amc · 5 years ago

In an even more general sense, an eigenvalue is the "stretch" that an operator applies to a vector. So say you have some vector of norm 1, and after you apply an operator it has norm 4. The eigenvalue associated with it would be 4.

1 upvotes on reddit
usernumber36 · 5 years ago

If you have a matrix M, it's possible to multiply it by a vector v.

Lets say that

M x v is just some multiple of v. That means v is an eigenvector of the matrix M and the multiple is the eigenvalue.

For example, if

M x [1, 2] = [3, 6] = 3 x [1, 2]

that means [1, 2] is an eigenvector of the matrix M and 3 is the corresponding eigenvalue

2 upvotes on reddit
B
black-gold-black · 5 years ago

As to usefulness: many systems can be represented by matrices. Mechanical systems and models of their dynamics, computer algorithms for finding solutions to equations; they are super important to machine learning. And the list goes on.

Because eigenvectors don't change direction they are somewhat "stable", and if the eigenvalue is less than one there is also a stable point.

Understanding these eigenvectors can enable you to find "stable" points of systems, such as the minimums of complex functions or balance points of mechanical systems.

13 upvotes on reddit
r/learnmath • [6]


Eigenvalues and Eigenvectors before Linear Algebra?

Posted by ObjectivePerceptor · in r/learnmath · 5 years ago

I’m taking an ODE class in the summer, and one of the prerequisites is linear algebra, and I have not taken that yet. However, I reached out to the professor, and he told me to teach myself eigenvalues and eigenvectors and I should be good. How much linear algebra exposure do I need in order to understand eigenvalues and eigenvectors? What resources would you guys recommend?

2 upvotes on reddit
1 reply
[deleted] · 5 years ago

I think you need everything to properly understand eigenvectors and eigenvalues. (By everything I mean every chapter before eigenvalues and eigenvectors in a linear algebra book).

But you can easily learn the algorithms for finding eigenvalues and eigenvectors and get some kind of intuition. For the algorithms you can just google "how to find eigenvalues/eigenvectors". And for intuition you can watch this series on linear algebra.

3 upvotes on reddit
r/LinearAlgebra • [7]


Diagonalizing matrices

Posted by JustiniR · in r/LinearAlgebra · 7 months ago

I’ve been searching for hours online and I still can’t find a digestible answer nor does my professor care to explain it simply enough so I’m hoping someone can help me here. To diagonalize a matrix, do you not just take the matrix, find its eigenvalues, and then put one eigenvalue in each column of the matrix?

12 upvotes on reddit
10 replies
R
Ron-Erez · 7 months ago

Not exactly. Not all matrices are diagonalizable. Yes, find all eigenvalues and their algebraic multiplicity. Next find a basis for each eigenspace of each of your eigenvalues. If the union of the bases you obtained has n vectors, where n is the order of A, then A is diagonalizable. One can rephrase this as follows. A matrix is diagonalizable if and only if the characteristic polynomial is a product of linear factors and for every eigenvalue the algebraic multiplicity equals the geometric multiplicity. I know this is overwhelming but I hope it helps at least a little.

1 upvotes on reddit
R
Ron-Erez · 7 months ago

By the way have a look at Section 9: Eigenvalues, Eigenvectors and Diagonalization the first seven lectures. I made it FREE to watch and it covers all of the concepts I mentioned. (It's part of a larger paid course but no need to pay to watch the videos I mentioned.)

Happy Linear Algebra!

1 upvotes on reddit
Accurate_Meringue514 · 7 months ago

Just to add, if you allow complex numbers, then you only need to worry about the dim of each eigenspace being the same as the multiplicity. Only over the reals you might run into that issue

2 upvotes on reddit
R
Ron-Erez · 7 months ago

Yes, that's absolutely correct. The complex numbers is the good life.

2 upvotes on reddit
InsensitiveClown · 7 months ago

A=PDP^-1, where D is a diagonal matrix with the eigenvalues on the diagonal and P is a matrix of column eigenvectors. If you have multiplicities in the eigenvalues, you may use generalized eigenvectors and the Jordan form, where the size of each Jordan block is the multiplicity. Not all matrices are diagonalizable, but all can be put into Jordan form A=PJP^-1 if my memory serves me well. You want to read about similarity, similarity transformations, Jordan form, Jordan blocks, and generalized eigenvectors.

1 upvotes on reddit
TheDuckGod01 · 7 months ago

To diagonalize a matrix A you need to first compute the eigenvalues and their associated eigenvectors.

Next, you take your eigenvalues and put them in a diagonal matrix D. That is, the diagonal entries of the matrix are exactly the eigenvalues.

After that you construct a matrix P whose column vectors are the eigenvectors to your eigenvalues, make sure they are aligned in the same order you aligned your eigenvalues.

Lastly you compute the inverse of P.

You then get D,P,P^-1 such that P^-1 AP = D or A = PDP^-1.

Something to note is you can arrange the eigenvalues however you like on the diagonal matrix D, just make sure your P matrix matches whatever order you choose.

Hope this helps!

3 upvotes on reddit
JustiniR · OP · 7 months ago

I’m slightly confused by the concept of the P matrix, if we have the diagonalized matrix once we get the eigenvalues why do we need to use the eigenvectors? Does it have anything to do with eigenbasis?

1 upvotes on reddit
TheDuckGod01 · 7 months ago

The eigenbasis is more to do with if it is possible to diagonalize a matrix. As Ron-Erez and Accurate_Meringue514 talk about in their comments, the eigenbasis and especially the dimensionality of it play a big part in determining if it is possible to perform diagonalization.

Once you determine it is possible to diagonalize the matrix A, D is in fact the result you want. However, you need a proper linear transformation to get there. That's where the P matrix comes in. It allows you to perform the transformation you need to get from matrix A to matrix D.

Without that transformation matrix P, A and D would be two fundamentally different matrices with no connection between them. P and D come as a package deal to make the diagonalization process on A.

Hope this helps!

1 upvotes on reddit
skay949 · 7 months ago

Not quite... you do not just put eigenvalues in a matrix - you need the eigenvectors too.

1 upvotes on reddit
finball07 · 7 months ago

Let's say your matrix represents a linear transformation T:V-->V, where V is an n-dimensional vector space. If you can find a basis of V whose elements are eigenvectors of T, then T is diagonalizable. In other words, the minimal polynomial of T splits, and each root of m_T has multiplicity 1, so T is diagonalizable.

Related: Look at this question and solution I proposed on mathstack exchange: https://math.stackexchange.com/questions/4902747/if-b3-b-is-b-diagonalizable

1 upvotes on reddit
r/LinearAlgebra • [8]


Eigenvector Basis - MIT OCW Help

Posted by Existing_Impress230 · in r/LinearAlgebra · 7 months ago

Hi all. Could someone help me understand what is happening from 46:55 of this video to the end of the lecture? Honestly, I just don't get it, and it doesn't seem that the textbook goes into too much depth on the subject either.

I understand how eigenvectors work in that A(x_n) = (λ_n)(x_n). I also know how to find change of basis matrices, with the columns of the matrix being the coordinates of the old basis vectors in the new basis. Additionally, I understand that for a particular transformation, the transformation matrices are similar and share eigenvalues.

But what is Prof. Strang saying here? In order to have a basis of eigenvectors, we need to have a matrix that those eigenvectors come from. Is he saying that for a particular transformation T(x) = Ax, we can change x to a basis of the eigenvectors of A, and then write the transformation as T(x') = Λx'?

I guess it's nice that the transformation matrix is diagonal in this case, but it seems like a lot more work to find the eigenvectors of A and do matrix multiplication than to just do the matrix multiplication in the first place. Perhaps he's just mentioning this to bolster the previously mentioned idea that transformation matrices in different bases are similar, and that the Λ is the most "perfect" similar matrix?

If anyone has guidance on this, I would appreciate it. Looking forward to closing out this course, and moving on to diffeq.

3 upvotes on reddit
5 replies
Accurate_Meringue514 · 7 months ago

He’s talking about what is the best basis to represent a linear transformation. So say you have some operator T, and you want to get the matrix representation of T with respect to some basis. The best basis to choose is the eigenvectors of T because the matrix representation is diagonal. So in that basis, A would be diagonal. He’s just saying that suppose there was some matrix A and those vectors happened to be the eigenvectors. Then performing the similarity transformation diagonalizes A.

2 upvotes on reddit
Existing_Impress230 · OP · 7 months ago

This makes sense.

If we had a transformation T = Ax, we could theoretically change everything to an eigenvector basis and in this case A would be diagonal. Or better yet, we’re already working in an eigenvector basis by either design or chance, and the calculations are easy.

I guess a potential application of this is if someone found themselves needing a transformation where the basis is arbitrary, and they have to communicate the “essence” of this transformation effectively. Perhaps it would be best to put everything in an eigenvector basis because the diagonalization kind of cleans up some of the matrix multiplication.

Not really at a point in my math career where non-standard bases clear things up, but I can see there being some utility under the right circumstances.

2 upvotes on reddit
Accurate_Meringue514 · 7 months ago

This has so many applications it’s not even funny. A couple are in differential equations where you have a system of coupled equations that you want to decouple. In quantum mechanics, you're trying to diagonalize the Hamiltonian to find the states. Now be careful, you can’t always make a matrix diagonal. Sometimes you just don’t have enough eigenvectors. Then you would need the notion of generalized eigenspaces and the Jordan form. Also, this is why symmetric matrices are important in practice, because they can always be diagonalized.

2 upvotes on reddit
ken-v · 7 months ago

I’d say he’s pointing back to lecture 22 “diagonal action” and saying that an even better way to do image compression is to factor the image into S Lambda S-transpose from that chapter. Then you can use only the largest few eigenvalues and v_i to produce a compressed image, which will be S’ Lambda’ S’-transpose where S’ contains just those few v_i and Lambda’ contains the few largest eigenvalues. Though, as he says, that isn’t practical in terms of compute-time. Does that make sense? What doesn’t make sense? Yes, this approach is not practical.

2 upvotes on reddit
Existing_Impress230 · OP · 7 months ago

This makes sense. Honestly, it’s basically what I thought.

He said it wasn’t practical for compression purposes, but I wasn’t sure if that meant it was supposed to be practical for other purposes, and I thought I might not be fully understanding since these other purposes weren’t obvious to me.

But now I see that this is just a convenient scenario, not something we’d generally strive to achieve when doing transformations.

2 upvotes on reddit
r/math • [9]


Eigenvalues of a random (standard normal) matrix

Posted by node-342 · in r/math · 5 months ago

I am working slowly through a Udacity course on scientific programming in Python (instructed by Mike X Cohen). Slowly, because I keep getting sidetracked & digging deeper. Case in point:

The latest project is visualizing the eigenvalues of an m x m matrix with elements drawn from the standard normal distribution. They are mostly complex, and mostly fall within the unit circle in the complex plane. Mostly:

https://preview.redd.it/ie0rmy5hbyye1.png?width=567&format=png&auto=webp&s=0a0f9647de2e84ccf29382fe2f0cab090a802fe9

The image is a plot of the eigenvalues of 1000 15 x 15 such matrices. The eigenvalues are mostly complex, but there is a very obvious line of pure real eigenvalues, which seem to follow a different, wider distribution than the rest. There is no such line of pure imaginary eigenvalues.

What's going on here? For background, I did physical sciences in college, not math, & have taken & used linear algebra, but not so much that I could deduce much beyond that the expected value of every matrix element is zero - and so, presumably, is the expected trace of these matrices.

...I just noticed the symmetry across the real axis, >!which I'd guess is from polynomials' complex roots coming in conjugate pairs. Since m is odd here, that means 7 conjugate pairs of eigenvalues and one pure real in each matrix.!< I guess I answered my question, but I post this anyway in case others find it interesting.
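For anyone who wants to reproduce the picture, here is a minimal sketch. It assumes the circular-law normalization A/sqrt(m), which is what keeps the bulk of the spectrum inside the unit disk; the seed and trial count are arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
m, trials = 15, 1000

# Circular-law normalization: divide by sqrt(m) so the bulk sits in the unit disk.
eigs = np.concatenate([
    np.linalg.eigvals(rng.standard_normal((m, m)) / np.sqrt(m))
    for _ in range(trials)
])

# NumPy's real eigensolver returns exactly-zero imaginary parts for real
# eigenvalues, but a small tolerance is safer.
real = np.abs(eigs.imag) < 1e-12

plt.scatter(eigs.real[~real], eigs.imag[~real], s=2, alpha=0.3, label="complex")
plt.scatter(eigs.real[real], eigs.imag[real], s=4, color="red", label="real")
plt.gca().set_aspect("equal")
plt.xlabel("Re"); plt.ylabel("Im"); plt.legend(); plt.show()
```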

76 upvotes on reddit
10 replies
MOSFETBJT · 5 months ago

As you add dimensions, the spread of the eigenvalues will approach a multivariate normal.

A Rayleigh distribution might give you more intuition by the way

-9 upvotes on reddit
greangrip · 5 months ago

What do you mean by the spread of the eigenvalues will approach a normal distribution?

Just to clarify, I don't know how to interpret "spread" in a way that makes this correct. The eigenvalues will approach a uniform distribution on a disk. Not a normal distribution.

2 upvotes on reddit
MOSFETBJT · 5 months ago

TIL. I didn’t expect it to be circular uniform; I expected circular normal.

1 upvotes on reddit
DeGraaff · 5 months ago

What you see here is the circular law for matrices with i.i.d. real Gaussian entries (the real Ginibre ensemble), and both the circular law and the real eigenvalues are heavily studied! Even the joint law of all the eigenvalues is known: they form a Pfaffian point process, and the picture you see can be understood analytically. As has been said already, because the entries are real you get about sqrt(n) real eigenvalues, and they behave quite differently from the complex ones. If you want to learn more about the circular law, I recommend https://arxiv.org/abs/1109.3343 (quite an interesting history, for instance!); see also here for the same picture: https://mathworld.wolfram.com/GirkosCircularLaw.html

26 upvotes on reddit
node-342 · OP · 5 months ago

Thank you, I will check those refs out.

5 upvotes on reddit
greangrip · 5 months ago

Your explanation is not quite correct. It's not just a symmetry thing; this is a phenomenon of random matrices with independent normally distributed entries. If you run it with an even dimension you'll still see this. It's known that the number of real eigenvalues is close to sqrt(2n/pi), with higher and higher probability, as the dimension n increases. So what you're seeing is roughly 3 real eigenvalues per realization.

Edited because I forgot a constant.
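A quick numerical check of that count (a sketch: the tolerance used to call an eigenvalue "real" is an implementation choice, and sqrt(2n/pi) is the asymptotic prediction, not the exact finite-n expectation):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 15, 2000

counts = []
for _ in range(trials):
    eigs = np.linalg.eigvals(rng.standard_normal((n, n)))
    # Real eigenvalues of a real matrix come back with (essentially) zero imaginary part.
    counts.append(np.sum(np.abs(eigs.imag) < 1e-9))

print(np.mean(counts))             # empirical average number of real eigenvalues
print(np.sqrt(2 * n / np.pi))      # asymptotic prediction, about 3.09 for n = 15
```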

66 upvotes on reddit
node-342 · OP · 5 months ago

Thank you - that is wild! I'll maybe be back with more questions after checking the refs the second commenter gave.

5 upvotes on reddit
greangrip · 5 months ago

"Around the circular law" is essential reading for anyone interested in non-Hermitian random matrices, but don't get too discouraged if you don't get much right away. It's not an intro to random matrices.

One wild thing is that while we expect the same number of real eigenvalues for other entry distributions, this is basically completely open. We can't even prove there are more than the trivial one with high probability.

6 upvotes on reddit
BorelMeasure · 5 months ago

This is a consequence of the circular law for the Gaussian ensemble.

Basically, consider the "spectral measure" of a scaled version of your matrices, i.e. a suitable normalization of the sum of the Dirac measures at the eigenvalues of the scaled matrices. The circular law says that, almost surely, this converges in the vague topology to the uniform distribution on the unit disk.
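A rough numerical illustration of that convergence (a sketch: the 1/sqrt(m) scaling is the normalization under which the limit is uniform on the unit disk):

```python
import numpy as np

rng = np.random.default_rng(1)

for m in (15, 100, 500):
    A = rng.standard_normal((m, m)) / np.sqrt(m)   # circular-law scaling
    eigs = np.linalg.eigvals(A)
    inside = np.mean(np.abs(eigs) <= 1.0)
    print(f"m = {m:3d}: fraction of eigenvalues inside the unit disk = {inside:.2f}")
```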

0 upvotes on reddit
[deleted] · 5 months ago

Check out Trefethen and Bau's Numerical Linear Algebra: https://people.maths.ox.ac.uk/trefethen/text.html There's a lecture and set of exercises that deal with this directly!

2 upvotes on reddit



AI Answer

introduction to eigenvalues and eigenvectors

Key Considerations for Eigenvalues and Eigenvectors:

  1. Definitions:

    • Eigenvalue (λ): A scalar that indicates how much an eigenvector is stretched or compressed during a linear transformation.
    • Eigenvector (v): A non-zero vector that changes only in scale (not direction) when a linear transformation is applied.
  2. Mathematical Representation:

    • The relationship is defined by the equation: [ A\mathbf{v} = \lambda \mathbf{v} ] where ( A ) is a square matrix, ( \mathbf{v} ) is the eigenvector, and ( \lambda ) is the eigenvalue.
  3. Finding Eigenvalues:

    • To find eigenvalues, solve the characteristic equation: [ \text{det}(A - \lambda I) = 0 ] where ( I ) is the identity matrix of the same size as ( A ).
  4. Finding Eigenvectors:

    • Once eigenvalues are found, substitute each eigenvalue back into the equation ( (A - \lambda I)\mathbf{v} = 0 ) to solve for the corresponding eigenvectors (a small worked sketch follows this list).
  5. Applications:

    • Eigenvalues and eigenvectors are used in various fields, including:
      • Physics: Stability analysis, quantum mechanics.
      • Computer Science: Principal Component Analysis (PCA), machine learning.
      • Engineering: Vibration analysis, control systems.
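
A small symbolic sketch of steps 3 and 4, using SymPy on an illustrative 2x2 matrix (the matrix is made up; any small example works the same way):

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1],
               [1, 2]])

# Step 3: characteristic equation det(A - lambda*I) = 0.
char_poly = (A - lam * sp.eye(2)).det()   # lambda**2 - 4*lambda + 3
eigenvalues = sp.solve(char_poly, lam)    # [1, 3]

# Step 4: for each eigenvalue, solve (A - lambda*I) v = 0.
# eigenvects() returns (eigenvalue, multiplicity, eigenvector basis) triples.
print(eigenvalues)
print(A.eigenvects())
```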

Takeaways:

  • Understanding eigenvalues and eigenvectors is crucial for linear algebra applications.
  • They provide insight into the properties of linear transformations and matrices.
  • Mastering the computation of eigenvalues and eigenvectors is essential for advanced studies in mathematics and applied fields.

Recommendation: Start with small matrices to practice finding eigenvalues and eigenvectors manually. This will help solidify your understanding before moving on to larger matrices or applications in software tools like MATLAB or Python's NumPy library.
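Along those lines, a minimal NumPy sketch that finds the eigenpairs of a small illustrative matrix and checks the defining equation ( A\mathbf{v} = \lambda \mathbf{v} ):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)   # columns of `eigenvectors` are the v's

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # Check the defining equation A v = lambda v for each eigenpair.
    print(lam, np.allclose(A @ v, lam * v))    # each line ends in True
```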
