11 - Bases of new vector spaces, Rank one matrices
All examples below are for the 3x3 case
- Basis of all 3x3 matrices - 9 dimensional (one basis matrix with a 1 in a single cell, 0 elsewhere)
- Basis of symmetric matrices - 6 dimensional
- Basis of upper triangular matrices - 6 dimensional
- Basis of (Symmetric ∩ Upper triangular) - 3 dimensional (these are exactly the diagonal matrices)
- Basis of (Symmetric + Upper triangular) - 9 dimensional (their sum is all of 3x3; the union itself is not a subspace)
- Every rank 1 matrix can be expressed as A = u v^T
- where u and v are column vectors
- Let M be all 5x17 matrices with rank 4. Is M a subspace?
- No. The zero matrix (rank 0) is not in M, and the sum of two rank-4 matrices need not have rank 4
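A small numpy sketch of the two points above (the vectors and matrices are made-up examples, not from the lecture): a rank-1 matrix built as the outer product u v^T, and two rank-4 5x17 matrices whose sum has rank 5, which is why M is not a subspace.

```python
import numpy as np

# Rank-1 matrix as an outer product u v^T (arbitrary example vectors)
u = np.array([[1.0], [2.0], [3.0]])   # 3x1 column vector
v = np.array([[4.0], [5.0], [6.0]])   # 3x1 column vector
A = u @ v.T                           # 3x3 matrix, every row a multiple of v^T
print(np.linalg.matrix_rank(A))       # 1

# Rank-4 5x17 matrices are not a subspace: the sum of two of them can have rank 5,
# and the zero matrix (rank 0) is missing entirely
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 4)) @ rng.standard_normal((4, 17))   # rank 4
C = rng.standard_normal((5, 4)) @ rng.standard_normal((4, 17))   # rank 4
print(np.linalg.matrix_rank(B), np.linalg.matrix_rank(C))        # 4 4
print(np.linalg.matrix_rank(B + C))                              # 5
```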
12 - Graphs and Networks, Incidence matrices, Kirchhoff's Law
Consider this directed graph - [Diagram]
- Here m = 5 edges, n = 4 nodes
- An incidence matrix A can be used to describe this graph - one row per edge, with -1 at the from-node and +1 at the to-node
- Let x be the vector of node potentials - the null space of A (Ax = 0) consists of the constant potentials x = c (1, 1, 1, 1)
- Ax gives the potential differences across the edges - current flows in the circuit only when there is a potential difference
- Let C be the matrix that takes us from potential differences to currents through the edges (Ohm's law) - C is the conductance matrix
- Let the vector of currents through the edges be y
- A^T y = 0 - solving the left null space (Kirchhoff's current law)
- Solving the left null space just means the current coming IN to a node equals the current going OUT of it - a basis can be found by assuming a current y through one edge and solving so that no charge accumulates at any node
- Dimension of the left null space = number of independent loops
- Rank = Number of nodes - 1
- dim(N(A^T)) = m - r
- #loops = #edges - (#nodes - 1)
- #nodes - #edges + #loops = 1 -> Euler's formula
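Since the original diagram is not reproduced here, the sketch below assumes one particular directed graph with n = 4 nodes and m = 5 edges (1->2, 2->3, 1->3, 1->4, 3->4). It builds the incidence matrix A and checks the counts above: rank = #nodes - 1, dim N(A^T) = m - r = #independent loops, Ax as potential differences, and A^T y = 0 for a current circulating around one loop.

```python
import numpy as np

# Assumed graph: 4 nodes, 5 directed edges (1->2, 2->3, 1->3, 1->4, 3->4)
# Incidence matrix A: one row per edge, -1 at the from-node, +1 at the to-node
A = np.array([
    [-1,  1,  0,  0],   # edge 1: 1 -> 2
    [ 0, -1,  1,  0],   # edge 2: 2 -> 3
    [-1,  0,  1,  0],   # edge 3: 1 -> 3
    [-1,  0,  0,  1],   # edge 4: 1 -> 4
    [ 0,  0, -1,  1],   # edge 5: 3 -> 4
])
m, n = A.shape
r = np.linalg.matrix_rank(A)
print(r, n - 1)          # rank = #nodes - 1 = 3
print(m - r)             # dim N(A^T) = #independent loops = 2

# Node potentials x  ->  potential differences Ax across the edges
x = np.array([1.0, 2.0, 4.0, 7.0])
print(A @ x)             # e.g. edge 1 carries x2 - x1 = 1

# A current circulating around the loop 1 -> 2 -> 3 -> 1 satisfies A^T y = 0
y_loop = np.array([1.0, 1.0, -1.0, 0.0, 0.0])
print(A.T @ y_loop)      # [0 0 0 0]  (Kirchhoff's current law at every node)
```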
13 - Quiz Review
14 - Orthogonal vectors and subspaces
- x and y are orthogonal if x^T y = 0
- Two subspaces are orthogonal if every vector in subspace 1 is perpendicular to every vector in subspace 2
- row space is perpendicular to null space
- column space is perpendicular to left null space
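A quick numpy check of the last two bullets on an arbitrary example matrix: vectors in the null space are perpendicular to every row of A (hence to the whole row space), and vectors in the left null space are perpendicular to every column.

```python
import numpy as np

# Arbitrary rank-2 example, so both null spaces are nontrivial
A = np.array([
    [1.0, 2.0, 3.0],
    [2.0, 4.0, 6.0],
    [1.0, 0.0, 1.0],
])

def null_basis(M, tol=1e-10):
    """Columns span N(M); the basis comes from the small-singular-value rows of V^T."""
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T

N  = null_basis(A)       # basis of the null space N(A)
LN = null_basis(A.T)     # basis of the left null space N(A^T)

print(A @ N)             # ~0 : every row of A is ⊥ to every null space vector
print(A.T @ LN)          # ~0 : every column of A is ⊥ to every left null space vector
```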
15 - Projections, Least squares, Projection matrix
- Why projections?
- Because Ax = b may not have a solution - projection means replacing b with the closest vector p in the column space of A
- A x̂ = p
Projection onto a line
[Diagram]
- a ⊥ e
- a^T e = 0
- a^T (b - p) = 0
- a^T (b - x a) = 0
- x a^T a = a^T b
- x = (a^T b) / (a^T a) - scalar
- p = a x = a ((a^T b) / (a^T a)) - a vector (the closest point to b on the line)
- Since p is the projection of b, p is governed by b, i.e. scaling b also scales p
- So we can write p as: p = (a a^T / (a^T a)) b
- Here (a a^T / (a^T a)) is known as the projection matrix P
- Column space of P - the line through a
- rank(P) = 1
- P = P^T .... symmetric
- P^2 = P .... projecting onto the same line twice or thrice does not change the projection
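A numpy sketch of the line-projection formulas just derived, using made-up vectors a and b: compute the scalar x, the projection p, the error e, and the projection matrix P = a a^T / (a^T a), then check the properties listed above.

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])     # direction of the line
b = np.array([1.0, 1.0, 1.0])     # vector being projected

x = (a @ b) / (a @ a)             # scalar x = (a^T b) / (a^T a)
p = x * a                         # projection p = a x (closest point to b on the line)
e = b - p                         # error e = b - p
print(a @ e)                      # ~0 : a ⊥ e

P = np.outer(a, a) / (a @ a)      # projection matrix P = a a^T / (a^T a)
print(np.allclose(P, P.T))        # True : P = P^T
print(np.allclose(P @ P, P))      # True : P^2 = P
print(np.linalg.matrix_rank(P))   # 1
print(np.allclose(P @ b, p))      # True : p = P b
```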
Projection onto the column space of a matrix
[Diagram]
- a1^T e = 0 = a2^T e
- e = b - p = b - A x̂
- a1^T (b - A x̂) = 0 = a2^T (b - A x̂)
- Combining both equations (writing them in matrix form)
- A^T (b - A x̂) = 0
- A^T A x̂ = A^T b
- x̂ = (A^T A)^-1 A^T b
- p = A x̂ = A (A^T A)^-1 A^T b
- Here the projection matrix P is A (A^T A)^-1 A^T
- If A is square and invertible, P = I .... since the column space is the whole space
- P = P^T .... symmetric
- P^2 = P .... projecting onto the same subspace twice or thrice does not change the projection
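The same check for projection onto a column space, with example numbers chosen for illustration (a 3x2 matrix A whose column space is a plane in R^3, and a vector b off that plane): solve A^T A x̂ = A^T b, form P = A (A^T A)^-1 A^T, and verify the properties above.

```python
import numpy as np

A = np.array([
    [1.0, 0.0],
    [1.0, 1.0],
    [1.0, 2.0],
])
b = np.array([6.0, 0.0, 0.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations A^T A x̂ = A^T b
p = A @ x_hat                               # projection of b onto C(A); here x̂ = [5, -3], p = [5, 2, -1]
e = b - p
print(A.T @ e)                              # ~[0 0] : the error is ⊥ to every column of A

P = A @ np.linalg.inv(A.T @ A) @ A.T        # projection matrix P = A (A^T A)^-1 A^T
print(np.allclose(P, P.T))                  # True : P = P^T
print(np.allclose(P @ P, P))                # True : P^2 = P
print(np.allclose(P @ b, p))                # True : p = P b
```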
Application - least squares fitting by a line
- Fit a straight line through a set of points by solving the normal equations A^T A x̂ = A^T b
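A minimal least-squares sketch with made-up data points (t_i, y_i): the best line y = c + d t comes from building A with a column of ones and a column of t values, then solving the normal equations A^T A x̂ = A^T y.

```python
import numpy as np

# Example data points (t_i, y_i)
t = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 2.0])

A = np.column_stack([np.ones_like(t), t])   # each row is [1, t_i]
c, d = np.linalg.solve(A.T @ A, A.T @ y)    # normal equations A^T A x̂ = A^T y
print(c, d)                                 # intercept 2/3, slope 1/2 for these points

# np.linalg.lstsq solves the same least-squares problem directly
print(np.linalg.lstsq(A, y, rcond=None)[0])
```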