(Spectral; Fall 23) 29 - A quick comment about interlacing and chromatic numbers
Course site: https://www.stevebutler.org/spectral2023
Presenter: https://www.stevebutler.org/
In our last lecture we mention some interlacing results and talk about chromatic numbers.
Video Transcript
I do want to mention one thing that we haven't talked about, which is interlacing, because
if I don't say it, I feel like I will have done a disservice to the field.
We're not going to go through and do proofs.
Most of them rely on Rayleigh-Ritz type arguments and mins of maxes and all these weird things.
So we'll skip that and just sort of state some results and we'll try to do an application.
And I hope we'll succeed in the application.
But what's the idea of interlacing?
Well, the idea is we have a graph G of some sort.
Here's our graph G. And then we do something to it.
So maybe we just take a little piece of it out.
There's this vertex that used to be here,
and we've deleted it as an example of something
that we could do.
So we might have G minus a vertex x. Or maybe we could talk about G minus just a single edge. Now, once you do something
like this, you've changed your graph. And if you change your graph, you may, and usually
do, change your eigenvalues. And so there's the question, how are these related?
Intuition says small changes should lead to small changes.
You know, it's kind of like calculus, right?
Make a small change, you're not going to go wildly different
unless you have some sort of discontinuity.
And so the question is, well, does that apply here?
And the answer is yes, and that's what interlacing is about.
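A minimal numerical sketch of that intuition (the 5-cycle used here is my own illustrative choice, not a graph from the lecture): delete a vertex or an edge and see how little the adjacency spectrum moves.

import numpy as np

# Adjacency matrix of a small illustrative graph: the 5-cycle C5.
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)

# G - v: delete vertex 0 by removing its row and column.
A_minus_v = np.delete(np.delete(A, 0, axis=0), 0, axis=1)

# G - e: delete the edge {0, 1} by zeroing both symmetric entries.
A_minus_e = A.copy()
A_minus_e[0, 1] = A_minus_e[1, 0] = 0

print(np.round(np.linalg.eigvalsh(A), 3))          # spectrum of G
print(np.round(np.linalg.eigvalsh(A_minus_v), 3))  # spectrum of G - v
print(np.round(np.linalg.eigvalsh(A_minus_e), 3))  # spectrum of G - e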
And there are different matrices, and different matrices have different properties.
And so let's talk about some of our favorites.
And so we'll start with the adjacency matrix.
And for the adjacency matrix, we'll compare our graph G, which will have eigenvalues lambda
1 up to lambda n.
And I should actually put these in order. So lambda 1 less than or equal to, all the way up to lambda n.
And then we'll compare it with G minus a vertex v.
So this is mu 1 all the way up to mu n minus 1.
And what happens? Well, if you think about the actual matrices of these,
how does the adjacency matrix for our graph G
relate to the adjacency matrix for G minus v?
So size-wise it's smaller.
What else happens though?
So you have your adjacency matrix for G, and
the vertex v is indexing a row and a column.
What do you do? Yeah, you delete that row
and delete that column, and then life is good.
And so in particular, what happens is we end up with a principal submatrix.
Now, this is because the adjacency matrix doesn't have to worry about degrees.
Most of our other matrices do.
And so, because you have this very nice principal submatrix,
you get the following very nice result which says,
hey, your eigenvalues are going to interlace.
So lambda 1 less than or equal to mu 1 less than or equal to lambda 2 less than or equal
to mu 2 and so forth all the way down till you get to the end and mu n minus 1 is less
than or equal to lambda n.
All right.
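A quick sanity check of that statement, again on a small graph of my own choosing rather than one from the lecture: deleting vertex v's row and column from A(G) gives the principal submatrix, which is exactly A(G - v), and the sorted eigenvalues should satisfy lambda_i <= mu_i <= lambda_{i+1}.

import numpy as np

# Adjacency matrix of an illustrative graph: the path on 5 vertices, P5.
A = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

v = 2  # delete the middle vertex
B = np.delete(np.delete(A, v, axis=0), v, axis=1)  # principal submatrix = A(G - v)

lam = np.linalg.eigvalsh(A)  # lambda_1 <= ... <= lambda_n
mu = np.linalg.eigvalsh(B)   # mu_1 <= ... <= mu_{n-1}

# Check lambda_i <= mu_i <= lambda_{i+1} for every i (small tolerance for rounding).
assert all(lam[i] <= mu[i] + 1e-9 and mu[i] <= lam[i + 1] + 1e-9 for i in range(len(mu)))
print("interlacing holds:", lam.round(3), mu.round(3))

This example is tight in places (some of the inequalities hold with equality), which is a useful reminder that interlacing allows eigenvalues of G - v to coincide with eigenvalues of G.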