Linear Algebra

1) The identity matrix I has 1s on its main diagonal and 0s elsewhere. It is the neutral element of matrix multiplication. 2) If you transpose a matrix product, you multiply the transposed matrices in reverse order. 3) Two vectors are linearly independent if the only way they can have a linear relationship is with trivial weights of 0. There must not be a way to express one vector as a scaled version of the other.


LINEAR ALGEBRA (REVIEW)

Andrea Tarelli - SBFA, UCSC 8


I IS THE NEUTRAL ELEMENT OF THE MULTIPLICATION

- The $n \times n$ identity matrix $I$ has 1s down the main diagonal and 0s elsewhere. The $2 \times 2$ identity matrix, for example, is

$$ I = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}_{2 \times 2} . $$

- $I$ is the neutral element of matrix multiplication:

$$ \underbrace{\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}}_{2 \times 2} \underbrace{\begin{bmatrix} 3 & 4 & 0 \\ 5 & 2 & 2 \end{bmatrix}}_{2 \times 3} = \underbrace{\begin{bmatrix} 3 & 4 & 0 \\ 5 & 2 & 2 \end{bmatrix}}_{2 \times 3} \, , $$

$$ \underbrace{\begin{bmatrix} 5 & 1 \\ 8 & 3 \\ 1 & 0 \end{bmatrix}}_{3 \times 2} \underbrace{\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}}_{2 \times 2} = \underbrace{\begin{bmatrix} 5 & 1 \\ 8 & 3 \\ 1 & 0 \end{bmatrix}}_{3 \times 2} \, . $$
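The two products above can be reproduced numerically. This is a quick sketch, not part of the original slides; NumPy is assumed to be available:

```python
import numpy as np

# I (the identity) leaves any conformable matrix unchanged under multiplication.
I2 = np.eye(2)

A = np.array([[3., 4., 0.],
              [5., 2., 2.]])   # 2x3
B = np.array([[5., 1.],
              [8., 3.],
              [1., 0.]])       # 3x2

print(np.array_equal(I2 @ A, A))   # premultiplication: I A = A
print(np.array_equal(B @ I2, B))   # postmultiplication: B I = B
```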



TRANSPOSITION AND MULTIPLICATION

- If you transpose a matrix product, you multiply the transposed matrices in reverse order:

$$ \Biggl( \underbrace{\begin{bmatrix} 1 & 2 & 0 \\ 1 & 1 & 1 \end{bmatrix}}_{A} \underbrace{\begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix}}_{b} \Biggr)^{\!T} = \left( \begin{bmatrix} 5 \\ 4 \end{bmatrix} \right)^{\!T} \, , $$

$$ \underbrace{\begin{bmatrix} 1 & 2 & 1 \end{bmatrix}}_{b^T} \underbrace{\begin{bmatrix} 1 & 1 \\ 2 & 1 \\ 0 & 1 \end{bmatrix}}_{A^T} = \begin{bmatrix} 5 & 4 \end{bmatrix} \, . $$
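The reverse-order rule $(Ab)^T = b^T A^T$ can be checked on the example above; a minimal NumPy sketch (not part of the original slides):

```python
import numpy as np

# Check (A b)^T = b^T A^T on the slide's example.
A = np.array([[1., 2., 0.],
              [1., 1., 1.]])      # 2x3
b = np.array([[1.], [2.], [1.]])  # 3x1 column vector

lhs = (A @ b).T   # transpose of the product
rhs = b.T @ A.T   # transposed factors, multiplied in reverse order

print(lhs)  # [[5. 4.]]
print(np.array_equal(lhs, rhs))
```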
0 1 3 2



LINEAR INDEPENDENCE 1

- The vectors $\begin{bmatrix} 1 \\ 2 \end{bmatrix}$ and $\begin{bmatrix} 2 \\ 3 \end{bmatrix}$ are linearly independent if and only if (iff) the linear relationship

$$ x_1 \begin{bmatrix} 1 \\ 2 \end{bmatrix} + x_2 \begin{bmatrix} 2 \\ 3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \iff \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} $$

is true only for the trivial weights

$$ x_1 = x_2 = 0 \, . $$



LINEAR INDEPENDENCE 2

- Stating that one of the vectors is a scaled version of the other is impossible:

$$ -\frac{x_1}{x_2} \begin{bmatrix} 1 \\ 2 \end{bmatrix} = \begin{bmatrix} 2 \\ 3 \end{bmatrix} \quad \text{(impossible)} \, . $$



LINEAR INDEPENDENCE 3

- There is linear independence because there exists the inverse matrix

$$ \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}^{-1} = \begin{bmatrix} -3 & 2 \\ 2 & -1 \end{bmatrix} \quad \text{(see later in the slides)} $$

such that

$$ \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}^{-1} \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \, . $$



LINEAR INDEPENDENCE 4

- If you premultiply both sides of $\begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$ with $\begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}^{-1}$, you get

$$ \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}^{-1} \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}^{-1} \begin{bmatrix} 0 \\ 0 \end{bmatrix} $$

$$ \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \iff \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \, . $$



CONSTRUCTING THE INVERSE MATRIX 1

$$ \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}^{-1} = \frac{1}{\underbrace{\det \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}}_{= \, -1}} \left( \text{matrix of cofactors of } \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} \right)^{\!T} $$



CONSTRUCTING THE INVERSE MATRIX 2

- Notice that $\begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}$ is invertible if and only if (iff) it is non-singular, that is, its determinant is non-zero. Expanding along the first column:

$$ \det \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} = 1 \cdot \underbrace{(-1)^{1+1} \det(3)}_{\text{the 1-1 cofactor}} + 2 \cdot \underbrace{(-1)^{2+1} \det(2)}_{\text{the 2-1 cofactor}} = 1 \cdot 3 + 2 \cdot (-2) = -1 \, . $$



CONSTRUCTING THE INVERSE MATRIX 3

$$ \text{matrix of cofactors of } \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} = \begin{bmatrix} \underbrace{(-1)^{1+1} \det(3)}_{\text{the 1-1 cofactor}} & \underbrace{(-1)^{1+2} \det(2)}_{\text{the 1-2 cofactor}} \\[2ex] \underbrace{(-1)^{2+1} \det(2)}_{\text{the 2-1 cofactor}} & \underbrace{(-1)^{2+2} \det(1)}_{\text{the 2-2 cofactor}} \end{bmatrix} = \begin{bmatrix} 3 & -2 \\ -2 & 1 \end{bmatrix} $$



CONSTRUCTING THE INVERSE MATRIX 4

$$ \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}^{-1} = \frac{1}{-1} \begin{bmatrix} 3 & -2 \\ -2 & 1 \end{bmatrix}^{T} = \begin{bmatrix} -3 & 2 \\ 2 & -1 \end{bmatrix} $$

- Do not forget the transposition of the matrix of cofactors!



A FORMULA FOR INVERTING 2×2 MATRICES

- Given $ad - cb \neq 0$, consider the square matrix

$$ \begin{bmatrix} a & b \\ c & d \end{bmatrix} \, . $$

- Its inverse is

$$ \frac{1}{ad - cb} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix} \, . $$
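The closed-form formula translates directly into code. A sketch (not from the original slides) using a hypothetical helper `inv2`, with NumPy assumed:

```python
import numpy as np

def inv2(a, b, c, d):
    """Inverse of [[a, b], [c, d]] via the closed-form 2x2 formula."""
    det = a * d - c * b
    if det == 0:
        raise ValueError("singular matrix: ad - cb = 0")
    return (1 / det) * np.array([[d, -b],
                                 [-c, a]])

# Applied to the running example [[1, 2], [2, 3]] (det = -1):
print(inv2(1, 2, 2, 3))  # [[-3, 2], [2, -1]]
```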



INVERSION AND TRANSPOSITION ARE INTERCHANGEABLE

- An example is

$$ \left( \begin{bmatrix} 2 & 0 \\ 1 & 3 \end{bmatrix}^{-1} \right)^{\!T} = \left( \begin{bmatrix} 2 & 0 \\ 1 & 3 \end{bmatrix}^{T} \right)^{-1} = \begin{bmatrix} \tfrac{1}{2} & -\tfrac{1}{6} \\[0.5ex] 0 & \tfrac{1}{3} \end{bmatrix} $$
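A minimal NumPy check (not part of the original slides) that inversion and transposition commute on this example:

```python
import numpy as np

# (A^-1)^T and (A^T)^-1 coincide.
A = np.array([[2., 0.],
              [1., 3.]])

lhs = np.linalg.inv(A).T   # invert, then transpose
rhs = np.linalg.inv(A.T)   # transpose, then invert

print(np.allclose(lhs, rhs))
print(np.allclose(lhs, np.array([[1/2, -1/6],
                                 [0.,   1/3]])))
```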



LINEAR INDEPENDENCE, INVERTIBILITY, AND FULL RANK

The n columns of an $n \times n$ matrix are linearly independent

$\iff$ the $n \times n$ matrix is invertible

$\iff$ the $n \times n$ matrix has full rank $(n)$.

- The rank of a matrix is the highest order of its non-singular square submatrices.



INVERTING A 3×3 MATRIX

- Show that

$$ \begin{bmatrix} 1 & 2 & 3 \\ 2 & 3 & 2 \\ 3 & 3 & 2 \end{bmatrix}^{-1} = -\frac{1}{5} \begin{bmatrix} 0 & 5 & -5 \\ 2 & -7 & 4 \\ -3 & 3 & -1 \end{bmatrix} \, . $$
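A quick NumPy verification of the claimed inverse (a sketch, not from the original slides): the product with the original matrix must give the identity.

```python
import numpy as np

# The claimed inverse times the original matrix must give the identity.
A = np.array([[1., 2., 3.],
              [2., 3., 2.],
              [3., 3., 2.]])
A_inv = -(1 / 5) * np.array([[ 0.,  5., -5.],
                             [ 2., -7.,  4.],
                             [-3.,  3., -1.]])

print(np.allclose(A @ A_inv, np.eye(3)))
print(np.allclose(A_inv @ A, np.eye(3)))
```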



DETERMINANT OF A 3×3 MATRIX

- To compute the determinant of a 3×3 matrix, you can use Sarrus' rule:

$$ \det \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = a_{11}a_{22}a_{33} + a_{12}a_{23}a_{31} + a_{13}a_{21}a_{32} - a_{31}a_{22}a_{13} - a_{32}a_{23}a_{11} - a_{33}a_{21}a_{12} $$
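Sarrus' rule can be sketched as a short function and compared against NumPy's determinant; `sarrus` is a hypothetical helper name, not part of the original slides:

```python
import numpy as np

def sarrus(A):
    """Determinant of a 3x3 matrix by Sarrus' rule (three 'down' minus three 'up' diagonals)."""
    return (A[0, 0] * A[1, 1] * A[2, 2] + A[0, 1] * A[1, 2] * A[2, 0] + A[0, 2] * A[1, 0] * A[2, 1]
            - A[2, 0] * A[1, 1] * A[0, 2] - A[2, 1] * A[1, 2] * A[0, 0] - A[2, 2] * A[1, 0] * A[0, 1])

A = np.array([[1., 2., 3.],
              [2., 3., 2.],
              [3., 3., 2.]])
print(sarrus(A))                    # -5.0
print(round(np.linalg.det(A), 10))  # agrees with NumPy's determinant
```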



LINEAR INDEPENDENCE 5

- The vectors $\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$, $\begin{bmatrix} 2 \\ 3 \\ 3 \end{bmatrix}$, and $\begin{bmatrix} 3 \\ 2 \\ 2 \end{bmatrix}$ are linearly independent:

$$ x_1 \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} + x_2 \begin{bmatrix} 2 \\ 3 \\ 3 \end{bmatrix} + x_3 \begin{bmatrix} 3 \\ 2 \\ 2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} $$

is true only for the trivial weights

$$ x_1 = x_2 = x_3 = 0 \, . $$



LINEAR INDEPENDENCE 6

- Stating that one of the vectors is a linear combination of the others is impossible:

$$ -\frac{x_1}{x_3} \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} - \frac{x_2}{x_3} \begin{bmatrix} 2 \\ 3 \\ 3 \end{bmatrix} = \begin{bmatrix} 3 \\ 2 \\ 2 \end{bmatrix} \quad \text{(impossible)} \, . $$



SYSTEMS OF LINEAR EQUATIONS (REVIEW)



A FIRST EXAMPLE 1

- Consider the following linear system in the unknowns $x_1$, $x_2$, and $x_3$:

$$ \begin{cases} x_1 + 2x_2 + 3x_3 = 6 \\ 2x_1 + 3x_2 + 2x_3 = -1 \\ 3x_1 + 3x_2 + 2x_3 = 0 \end{cases} $$

- We can rewrite it as follows:

$$ \begin{bmatrix} 1 & 2 & 3 \\ 2 & 3 & 2 \\ 3 & 3 & 2 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 6 \\ -1 \\ 0 \end{bmatrix} $$



A FIRST EXAMPLE 2

- We can also rewrite the system as a 'constrained' linear combination:

$$ x_1 \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} + x_2 \begin{bmatrix} 2 \\ 3 \\ 3 \end{bmatrix} + x_3 \begin{bmatrix} 3 \\ 2 \\ 2 \end{bmatrix} = \begin{bmatrix} 6 \\ -1 \\ 0 \end{bmatrix} $$



A FIRST EXAMPLE 3

- The system admits a solution if and only if the vector

$$ \begin{bmatrix} 6 \\ -1 \\ 0 \end{bmatrix} $$

can be seen as a linear combination of the vectors

$$ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} , \quad \begin{bmatrix} 2 \\ 3 \\ 3 \end{bmatrix} , \quad \text{and} \quad \begin{bmatrix} 3 \\ 2 \\ 2 \end{bmatrix} $$

with proper weights $x_1$, $x_2$, and $x_3$.



A FIRST EXAMPLE 4

- This can happen if and only if (Rouché-Capelli Theorem)

$$ \text{the number of independent columns of } \begin{bmatrix} 1 & 2 & 3 \\ 2 & 3 & 2 \\ 3 & 3 & 2 \end{bmatrix} $$

equals

$$ \text{the number of independent columns of } \underbrace{\begin{bmatrix} 1 & 2 & 3 & 6 \\ 2 & 3 & 2 & -1 \\ 3 & 3 & 2 & 0 \end{bmatrix}}_{\text{complete matrix}} \, . $$



A FIRST EXAMPLE 5

- The rank of a matrix measures how many independent columns the matrix has.

- The rank of a matrix is the highest order of its non-singular square submatrices.

$$ \begin{bmatrix} 1 & 2 & 3 \\ 2 & 3 & 2 \\ 3 & 3 & 2 \end{bmatrix} \text{ is non-singular, so that } \underbrace{\begin{bmatrix} 1 & 2 & 3 & 6 \\ 2 & 3 & 2 & -1 \\ 3 & 3 & 2 & 0 \end{bmatrix}}_{\text{complete matrix}} \text{ has maximum rank } (3) \, . $$
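The two ranks can be computed directly; a minimal NumPy sketch (not part of the original slides) for the coefficient and complete matrices above:

```python
import numpy as np

# Rouché-Capelli in practice: compare rank(A) with rank of the complete matrix [A | b].
A = np.array([[1., 2., 3.],
              [2., 3., 2.],
              [3., 3., 2.]])
b = np.array([[6.], [-1.], [0.]])

rank_A = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.hstack([A, b]))

print(rank_A, rank_Ab)  # equal ranks -> consistent system
```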



A FIRST EXAMPLE 6

- We can work out the solution $x_1$, $x_2$, and $x_3$ by means of Cramer's Theorem:

$$ x_1 = \det \begin{bmatrix} 6 & 2 & 3 \\ -1 & 3 & 2 \\ 0 & 3 & 2 \end{bmatrix} \Big/ (-5) = \frac{-5}{-5} = 1 \, ; $$

$$ x_2 = \det \begin{bmatrix} 1 & 6 & 3 \\ 2 & -1 & 2 \\ 3 & 0 & 2 \end{bmatrix} \Big/ (-5) = \frac{19}{-5} = -3.8 \, ; $$

$$ x_3 = \det \begin{bmatrix} 1 & 2 & 6 \\ 2 & 3 & -1 \\ 3 & 3 & 0 \end{bmatrix} \Big/ (-5) = \frac{-21}{-5} = 4.2 \, . $$
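Cramer's Theorem can be sketched as a small routine; `cramer` is a hypothetical helper, assuming NumPy (not part of the original slides):

```python
import numpy as np

def cramer(A, b):
    """Solve A x = b (A square and non-singular) by Cramer's rule."""
    det_A = np.linalg.det(A)
    x = np.empty(A.shape[1])
    for i in range(A.shape[1]):
        Ai = A.copy()
        Ai[:, i] = b              # replace column i with the right-hand side
        x[i] = np.linalg.det(Ai) / det_A
    return x

A = np.array([[1., 2., 3.],
              [2., 3., 2.],
              [3., 3., 2.]])
b = np.array([6., -1., 0.])
print(cramer(A, b))  # [1, -3.8, 4.2] up to floating-point noise
```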



A FIRST EXAMPLE 7

- We can work out $x_1$, $x_2$, and $x_3$ also by means of inversion (given non-singularity):

$$ \begin{bmatrix} 1 & 2 & 3 \\ 2 & 3 & 2 \\ 3 & 3 & 2 \end{bmatrix}^{-1} \begin{bmatrix} 1 & 2 & 3 \\ 2 & 3 & 2 \\ 3 & 3 & 2 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 3 & 2 \\ 3 & 3 & 2 \end{bmatrix}^{-1} \begin{bmatrix} 6 \\ -1 \\ 0 \end{bmatrix} $$

$$ \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 3 & 2 \\ 3 & 3 & 2 \end{bmatrix}^{-1} \begin{bmatrix} 6 \\ -1 \\ 0 \end{bmatrix} \, . $$



A FIRST EXAMPLE 8

$$ \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = -\frac{1}{5} \begin{bmatrix} 0 & 5 & -5 \\ 2 & -7 & 4 \\ -3 & 3 & -1 \end{bmatrix} \begin{bmatrix} 6 \\ -1 \\ 0 \end{bmatrix} = \begin{bmatrix} 1 \\[0.5ex] -\tfrac{19}{5} \\[0.5ex] \tfrac{21}{5} \end{bmatrix} $$
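The same solution via inversion, as a minimal NumPy check (not part of the original slides):

```python
import numpy as np

# Solve the system by inversion: x = A^-1 b.
A = np.array([[1., 2., 3.],
              [2., 3., 2.],
              [3., 3., 2.]])
b = np.array([6., -1., 0.])

x = np.linalg.inv(A) @ b
print(x)  # [1, -19/5, 21/5] up to floating-point noise
```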



RESOLUTION METHOD FOR LINEAR SYSTEMS 1

Consider the following linear system of m equations and n unknowns:

$$ \underbrace{A}_{m \times n} \; \underbrace{x}_{n \times 1} = \underbrace{b}_{m \times 1} $$

Establish first whether there are solution(s):

- Evaluate $\operatorname{rank}(A)$ and $\operatorname{rank}(A \mid b)$:

- If $\operatorname{rank}(A) = \operatorname{rank}(A \mid b) = r$, the system admits solutions:

  - if $r = n$, the solution is unique;

  - if $r < n$, there are $\infty^{\,n-r}$ solutions.



RESOLUTION METHOD FOR LINEAR SYSTEMS 2

Resolution scheme (particularly useful when r < n and/or r < m):

- Within the matrix $A$, identify a NON-SINGULAR square submatrix $\tilde{A}$ of dimension $r \times r$.

- Discard the $m - r$ rows of $A$ not belonging to $\tilde{A}$. Doing so, you discard $m - r$ redundant equations.

- Keep on the left-hand side of the remaining equations of the system all terms belonging to $\tilde{A}$, while bringing to the right-hand side all terms not belonging to $\tilde{A}$. The $n - r$ unknowns brought to the right-hand side become free parameters.

- Solve the resulting $r \times r$ system. When $r < n$, the $\infty^{\,n-r}$ solutions will be a function of the $n - r$ unknowns that have been brought to the right-hand side.
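The rank test above can be sketched as a small classifier; `classify_system` is a hypothetical helper (NumPy assumed, not part of the original slides):

```python
import numpy as np

def classify_system(A, b):
    """Rank test for A x = b: returns 'none', 'unique', or 'infinite'."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    r = np.linalg.matrix_rank(A)
    r_complete = np.linalg.matrix_rank(np.hstack([A, b]))
    if r != r_complete:
        return "none"                                        # rank(A) < rank(A|b): inconsistent
    return "unique" if r == A.shape[1] else "infinite"       # r = n vs r < n

print(classify_system([[1., 2.], [2., 4.]], [1., 3.]))  # none (inconsistent)
print(classify_system([[1., 2.], [2., 3.]], [0., 0.]))  # unique
print(classify_system([[1., 2.], [2., 4.]], [1., 2.]))  # infinite (one free parameter)
```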



A SECOND EXAMPLE 1

- Consider the following linear system in the unknowns $x_1$, $x_2$, $x_3$, and $x_4$:

$$ \begin{cases} x_1 - x_2 + 2x_3 - x_4 = 1 \\ 2x_1 - x_2 - x_3 + 2x_4 = 0 \\ -x_1 + 2x_2 + 2x_3 + x_4 = 1 \end{cases} $$

- Let's express it as follows:

$$ x_1 \begin{bmatrix} 1 \\ 2 \\ -1 \end{bmatrix} + x_2 \begin{bmatrix} -1 \\ -1 \\ 2 \end{bmatrix} + x_3 \begin{bmatrix} 2 \\ -1 \\ 2 \end{bmatrix} + x_4 \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} \, . $$



A SECOND EXAMPLE 2

- The first three vectors are independent:

$$ \det \begin{bmatrix} 1 & -1 & 2 \\ 2 & -1 & -1 \\ -1 & 2 & 2 \end{bmatrix} = 9 $$



A SECOND EXAMPLE 3

- If the fourth vector is brought on the right-hand side,

$$ x_1 \begin{bmatrix} 1 \\ 2 \\ -1 \end{bmatrix} + x_2 \begin{bmatrix} -1 \\ -1 \\ 2 \end{bmatrix} + x_3 \begin{bmatrix} 2 \\ -1 \\ 2 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} - x_4 \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix} \, , $$

the solution (with one degree of freedom) can be written as

$$ \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 1 & -1 & 2 \\ 2 & -1 & -1 \\ -1 & 2 & 2 \end{bmatrix}^{-1} \left( \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} - x_4 \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix} \right) = \begin{bmatrix} \tfrac{1}{3} - \tfrac{5}{3} x_4 \\[1ex] \tfrac{2}{9} - \tfrac{16}{9} x_4 \\[1ex] \tfrac{4}{9} x_4 + \tfrac{4}{9} \end{bmatrix} \, . $$
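A quick NumPy check (not part of the original slides) that the parametric solution satisfies the system for arbitrary values of the free parameter $x_4$:

```python
import numpy as np

# Every value of the free parameter x4 yields a valid solution of the full system.
M = np.array([[ 1., -1.,  2.],
              [ 2., -1., -1.],
              [-1.,  2.,  2.]])    # first three columns (non-singular, det = 9)
b = np.array([1., 0., 1.])
v4 = np.array([-1., 2., 1.])       # fourth column, moved to the right-hand side
A = np.column_stack([M, v4])       # full 3x4 coefficient matrix

for x4 in (0.0, 1.0, -2.5):
    x123 = np.linalg.inv(M) @ (b - x4 * v4)
    x = np.append(x123, x4)
    print(x4, np.allclose(A @ x, b))   # True for each x4
```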



A THIRD EXAMPLE 1

- Consider the following linear system in the unknowns $x_1$, $x_2$, and $x_3$:

$$ \begin{cases} 2x_1 + 7x_2 + x_3 = 15 \\ 5x_1 + 6x_2 + 8x_3 = 3 \\ 9x_2 = 27 \\ 7x_1 + 6x_2 + 7x_3 = -3 \end{cases} $$

- Let's state it as follows:

$$ x_1 \begin{bmatrix} 2 \\ 5 \\ 0 \\ 7 \end{bmatrix} + x_2 \begin{bmatrix} 7 \\ 6 \\ 9 \\ 6 \end{bmatrix} + x_3 \begin{bmatrix} 1 \\ 8 \\ 0 \\ 7 \end{bmatrix} = \begin{bmatrix} 15 \\ 3 \\ 27 \\ -3 \end{bmatrix} \, . $$



A THIRD EXAMPLE 2

- The first three vectors are independent because

$$ \det \begin{bmatrix} 2 & 7 & 1 \\ 5 & 6 & 8 \\ 0 & 9 & 0 \end{bmatrix} = -99 \, . $$

- The last vector linearly depends on the first three vectors, as

$$ \det \begin{bmatrix} 2 & 7 & 1 & 15 \\ 5 & 6 & 8 & 3 \\ 0 & 9 & 0 & 27 \\ 7 & 6 & 7 & -3 \end{bmatrix} = 0 \, . $$

- Hence, the system admits a solution.



A THIRD EXAMPLE 3

- We can work out $x_1$, $x_2$, and $x_3$ by focusing on the first three equations (given non-singularity):

$$ x_1 \begin{bmatrix} 2 \\ 5 \\ 0 \end{bmatrix} + x_2 \begin{bmatrix} 7 \\ 6 \\ 9 \end{bmatrix} + x_3 \begin{bmatrix} 1 \\ 8 \\ 0 \end{bmatrix} = \begin{bmatrix} 15 \\ 3 \\ 27 \end{bmatrix} $$

$$ \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 2 & 7 & 1 \\ 5 & 6 & 8 \\ 0 & 9 & 0 \end{bmatrix}^{-1} \begin{bmatrix} 15 \\ 3 \\ 27 \end{bmatrix} = \begin{bmatrix} -3 \\ 3 \\ 0 \end{bmatrix} \, . $$

- The last equation is met by construction:

$$ x_1 \cdot 7 + x_2 \cdot 6 + x_3 \cdot 7 = -21 + 18 + 0 = -3 \, . $$
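A minimal NumPy sketch of this example (not part of the original slides): drop the redundant fourth equation, solve the 3×3 subsystem, and confirm all four equations hold.

```python
import numpy as np

# Drop the redundant fourth equation and solve the 3x3 subsystem.
A = np.array([[2., 7., 1.],
              [5., 6., 8.],
              [0., 9., 0.],
              [7., 6., 7.]])
b = np.array([15., 3., 27., -3.])

# Consistency check: rank(A) == rank([A | b]) == 3 (unique solution).
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.column_stack([A, b])) == 3

x = np.linalg.solve(A[:3], b[:3])   # first three equations suffice
print(x)                            # [-3, 3, 0]
print(np.allclose(A @ x, b))        # the fourth equation holds by construction
```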

