
MATH 425-Spring 2013

HOMEWORK ASSIGNMENTS
MATH 425 Linear Algebra II, Spring 2013
LCD-undergrad 24908; LCD-grad 24909,
MWF 10:00-10:50, Taft Hall 313
Instructor: Shmuel Friedland
Office: 715 SEO, phone: 413-2176, e-mail: [email protected],
web: http://www.math.uic.edu/friedlan
Last update April 23, 2013

HOMEWORK ASSIGNMENT 1
Assigned 1-9-13 Due 1-23-13

Do the following problems from [1]: 1.1.1 p3: 1, 2; 1.2.3 p6: 15; 1.3.6 p10-11: 1, 2, 3.

HOMEWORK ASSIGNMENT 2
Assigned 1-24-13 Due 2-1-13

a. 3 problems from 1.4.1 in [1].


b. 4 problems from 1.4.2 in [1].
c. Let $z_1, z_2, \ldots, z_n \in \mathbb{C}$. The Vandermonde matrix is given as
$$V(z_1, \ldots, z_n) := \begin{pmatrix} 1 & z_1 & z_1^2 & \ldots & z_1^{n-1}\\ 1 & z_2 & z_2^2 & \ldots & z_2^{n-1}\\ \vdots & \vdots & \vdots & & \vdots\\ 1 & z_n & z_n^2 & \ldots & z_n^{n-1} \end{pmatrix} \in \mathbb{C}^{n\times n}.$$
Show that $\det V(z_1, \ldots, z_n)$, called the Vandermonde determinant, is equal to $\prod_{1\le i<j\le n}(z_j - z_i)$. (A numerical check appears after this assignment.)
d. Let $\sigma \in S_5$ be defined as $\sigma(1) = 3$, $\sigma(2) = 5$, $\sigma(3) = 1$, $\sigma(4) = 4$, $\sigma(5) = 2$. Find $\mathrm{sign}(\sigma)$.
e. $A, B \in \mathbb{F}^{n\times n}$ are called congruent if $A = TBT^\top$ for some $T \in \mathrm{GL}(n, \mathbb{F})$. Show
1. Congruence in $\mathbb{F}^{n\times n}$ is an equivalence relation.
2. Any two congruent matrices have the same rank.
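A quick numerical sanity check of the Vandermonde determinant formula in part c, assuming numpy is available (a spot check on random data, not a proof):

    import numpy as np

    rng = np.random.default_rng(0)
    z = rng.standard_normal(5) + 1j * rng.standard_normal(5)

    # Row i of V is (1, z_i, z_i^2, ..., z_i^{n-1})
    V = np.vander(z, increasing=True)

    # prod_{1 <= i < j <= n} (z_j - z_i)
    prod = np.prod([z[j] - z[i] for i in range(len(z)) for j in range(i + 1, len(z))])
    print(np.allclose(np.linalg.det(V), prod))   # expected: True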

HOMEWORK ASSIGNMENT 3
Assigned 1-30-13 Due 2-8-13

A. Assume that $A \in \mathbb{F}^{n\times n}$ is a skew symmetric matrix.


1. Show that if $n$ is odd and $\mathbb{F}$ has characteristic not equal to 2, i.e. $2 \ne 0$ in $\mathbb{F}$, then $\det A = 0$.
2. Show that if $\mathbb{F}$ has characteristic not equal to 2, then $A$ is congruent to a block diagonal matrix $B = \mathrm{diag}(B_1, \ldots, B_k)$, where each block is either $\begin{pmatrix} 0 & 1\\ -1 & 0 \end{pmatrix}$ or the $1\times 1$ zero matrix. Hint: Use a sequence of elementary conjugations given by $EAE^\top$, where $E$ is an elementary matrix.
3. Show that if $\mathbb{F}$ has characteristic 2, then $A$ is congruent to a block diagonal matrix $B = \mathrm{diag}(B_1, \ldots, B_k)$, where each block is either the $1\times 1$ zero matrix, the $1\times 1$ identity, or $\begin{pmatrix} 0 & 1\\ 1 & 0 \end{pmatrix}$. (Note that $-1 = 1$ in $\mathbb{F}$.)
4. Give an example of an $n\times n$ skew symmetric matrix, over a field with characteristic 2, whose determinant is nonzero, for each $n \in \mathbb{N}$.
5. Assume that $\mathbb{F} = \mathbb{R}$. Then $\det A \ge 0$. (A numerical check appears after this assignment.)
B. Problems 2, 4 - 7 on page 20 [1].
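A small numerical illustration of item A.5, assuming numpy (a spot check, not a proof):

    import numpy as np

    rng = np.random.default_rng(1)
    for n in range(2, 8):
        M = rng.standard_normal((n, n))
        A = M - M.T                          # real skew symmetric: A.T == -A
        # det A = 0 for odd n, and det A is a square (the Pfaffian squared) for even n
        print(n, np.linalg.det(A) >= -1e-9)  # expected: True for every n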

HOMEWORK ASSIGNMENT 4
Assigned 2-8-13 Due 2-15-13

A. Consider the equation $z^m - 1 = 0$.


1. Find all the roots for $m = 3, 4$ explicitly, i.e. using roots if necessary. List the primitive roots in each case.
2. Let $\omega = e^{\frac{2\pi i k}{m}}$, where $k$ is a positive integer. Show that $\omega$ is a primitive $m$-th root of unity if and only if $k$ and $m$ are coprime.
3. Suppose that $m = p_1^{a_1} \cdots p_l^{a_l}$ is the decomposition of $m$ into a product of primes $1 < p_1 < p_2 < \ldots < p_l$, where each $a_i$ is a positive integer. Show that the number of primitive $m$-th roots of unity is the Euler function
$\phi(m) = p_1^{a_1-1}(p_1 - 1) \cdots p_l^{a_l-1}(p_l - 1)$.
(First try to prove the case $l = 1$; a brute-force check appears after this assignment.)
Problem 8 on p20 in [1].
Problems 1-8 p22-23 in [1].
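A brute-force check of part A.3 for small m, using only the standard library (the helper names are my own):

    from math import gcd

    def primitive_root_count(m):
        # k with gcd(k, m) = 1 indexes the primitive roots e^{2 pi i k / m}
        return sum(1 for k in range(1, m + 1) if gcd(k, m) == 1)

    def euler_phi(m):
        # Euler's product formula over the distinct prime divisors of m
        result, n, p = m, m, 2
        while p * p <= n:
            if n % p == 0:
                while n % p == 0:
                    n //= p
                result -= result // p
            p += 1
        if n > 1:
            result -= result // n
        return result

    print(all(primitive_root_count(m) == euler_phi(m) for m in range(1, 200)))  # True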

HOMEWORK ASSIGNMENT 5
Assigned 2-13-13 Due 2-22-13

1. Problems on p25-26 in [1] that were not done in the class by me.
2. Problems on page 28 in [1] that were not done in the class by me.

HOMEWORK ASSIGNMENT 6
Assigned 2-20-13 Due 3-1-13

Do the following problems


1. Let $u = (1, 1, 1, 1)^\top$, $v = (2, 0, 2, 1)^\top$. Find
(a) The cosine of the angle between $u$ and $v$.
(b) The scalar and the vector projection of $v$ on $u$.
(c) A basis of the orthogonal complement of $U := \mathrm{span}(u, v)$.
(d) The projection of the vector $(1, 1, 0, 0)^\top$ on $U$ and $U^\perp$.
2. Let $A \in \mathbb{R}^{4\times 3}$. Assume that the vector $(1, 1, 1, 1)^\top$ is a vector in the column space of $A$. Is it possible that the vector $(2, 0, 2, 1)^\top$ is in the null space of $A^\top$? If yes, give an example of such a matrix. If not, justify why.
3. Consider the overdetermined system
$$\begin{aligned} x_1 + x_2 + x_3 &= 4\\ x_1 + x_2 + x_3 &= 0\\ x_2 + x_3 &= 1\\ x_1 + x_3 &= 2 \end{aligned}$$
(a) Is this system solvable?
(b) Find the least squares solution of this system.
(c) Find the projection of $(4, 0, 1, 2)^\top$ on the column space of the coefficient matrix $A \in \mathbb{R}^{4\times 3}$ of this system.
4. See pages 128-130 in my notes of Math 320, 2012:
http://homepages.math.uic.edu/friedlan/math320lecS12.pdf
Let $(-1, 0), (0, 1), (1, 3), (2, 9)$ be four points in the $(x, y)$ plane. Find
(a) The best least squares fit by a linear function $y = ax + b$.
(b) The best least squares fit by a quadratic polynomial $y = ax^2 + bx + c$.
(c) Explain briefly why there exists a unique cubic polynomial $y = ax^3 + bx^2 + cx + d$ passing through these four points.
(A numerical sketch for this problem appears after this assignment.)
5. Let $a \le t_1 < t_2 < \ldots < t_n \le b$ be $n$ points in the interval $[a, b]$. For any two continuous functions $f, g \in C[a, b]$ define $\langle f, g\rangle := \sum_{i=1}^n f(t_i)g(t_i)$. Let $P_m$ be the vector space of all polynomials of degree at most $m - 1$.
(a) Show that for $m \le n$, $\langle\,,\,\rangle$ is an inner product on $P_m$.
(b) Is $\langle\,,\,\rangle$ an inner product on $P_{n+1}$? Justify!
6. For the inner product $\langle f, g\rangle := \int_{-1}^{1} f(x)g(x)\,dx$ on $C[-1, 1]$, find the cosine of the angle between $f(x) = 1$ and $g(x) = e^x$.
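A sketch for problem 4, assuming numpy and the data points as listed above; np.polyfit returns the least squares coefficients for each degree:

    import numpy as np

    x = np.array([-1.0, 0.0, 1.0, 2.0])
    y = np.array([0.0, 1.0, 3.0, 9.0])

    for deg in (1, 2, 3):
        # minimize ||V c - y||_2 over polynomials of the given degree
        print(deg, np.polyfit(x, y, deg))

With four points having distinct x-coordinates the degree 3 fit has zero residual, which is the interpolation statement in part (c).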

HOMEWORK ASSIGNMENT 7
Assigned 3-5-13 Due 3-15-13

[1]: 2.3 page 32-33, Problems: 9(a,b,c) (special orthogonal means determinant one), 10a, 12.
[4]: 6.4 p363-365, Problems: 4(a-f); 5(a,b,c,f),6,10,12,14.

HOMEWORK ASSIGNMENT 8
Assigned 3-5-13 Due 3-22-13

I. Assume that $A$ is a real symmetric matrix. Denote by $i_+(A)$ the number of positive eigenvalues, by $i_0(A)$ the number of zero eigenvalues, and by $i_-(A)$ the number of negative eigenvalues. Denote $i(A) := (i_+(A), i_0(A), i_-(A))$. Show (a computational sketch appears after this assignment):
1. Show that $i_+(A)$ is the maximal dimension of a subspace $U \subseteq \mathbb{R}^n$ such that $x^\top A x > 0$ for each nonzero $x \in U$. (Hint: Use the convoy principle.)
2. Show that $i_-(A)$ is the maximal dimension of a subspace $U \subseteq \mathbb{R}^n$ such that $x^\top A x < 0$ for each nonzero $x \in U$.
3. $\mathrm{rank}\, A = i_+(A) + i_-(A)$.
4. A symmetric $B \in \mathbb{R}^{n\times n}$ is called congruent to $A$ if $B = QAQ^\top$ for some invertible matrix $Q$. Show that two symmetric matrices are congruent if and only if $i(A) = i(B)$. (This result is called the Sylvester law of inertia. This is the content of Problems 6 and 7 in [1, p40] for the hermitian case.)
II. Let $A = [a_{pq}] \in \mathbb{C}^{n\times n}$ be a hermitian matrix. Rearrange the diagonal entries $a_{11}, a_{22}, \ldots, a_{nn}$ of $A$ in nonincreasing order $d_1 \ge d_2 \ge \ldots \ge d_n$. Show
1. $\lambda_1(A) \ge d_1$ and $\lambda_n(A) \le d_n$. Hint: Use the maximum and minimum characterizations of $\lambda_1(A)$ and $\lambda_n(A)$.
2. Show that $\sum_{i=1}^k d_i \le \sum_{i=1}^k \lambda_i(A)$ for $k = 1, \ldots, n$. What happens for $k = n$? Hint: Use the convoy principle.
3. Show that $|\lambda_j(A)| \le \sqrt{\sum_{p,q=1}^n |a_{pq}|^2}$ for each $j = 1, \ldots, n$. For which kind of matrices and for which $j$ do we have equality in this inequality?
[1]: 2.5 page 39-40, Problems: 1, 3.
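A minimal sketch of the inertia computation in part I, assuming numpy (the function name and the tolerance are my own choices):

    import numpy as np

    def inertia(A, tol=1e-10):
        # eigenvalues of a real symmetric matrix are real; count their signs
        evals = np.linalg.eigvalsh(A)
        return (int((evals > tol).sum()),            # i_+(A)
                int((np.abs(evals) <= tol).sum()),   # i_0(A)
                int((evals < -tol).sum()))           # i_-(A)

    A = np.diag([3.0, 0.0, -1.0, -2.0])
    print(inertia(A))                                # expected: (1, 1, 2)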

HOMEWORK ASSIGNMENT 9
Assigned 3-20-13 Due 4-5-13

A.
1. Let $A, B \in \mathrm{H}_n$. Assume that $B > 0$. Show that $AB$ is diagonable and has real eigenvalues. Which part of this statement remains correct if (a) $B \ge 0$, (b) $B$ is just hermitian? (A numerical illustration appears after this assignment.)

2. Let $A \in \mathbb{C}^{n\times n}$ and view $A$ as the linear transformation $T$ given by $x \mapsto Ax$ from $\mathbb{C}^n$ to itself. $A$ is called symmetrizable if there exists $B > 0$ such that $T$ is selfadjoint with respect to the inner product $\langle x, y\rangle := y^* B x$ on $\mathbb{C}^n$. Show that $A$ is symmetrizable if and only if $A$ is similar to a real diagonal matrix.
B. Problems 3, 4, 9 in [1, p51]. (In 9 you can assume that $A$ is a normal matrix.)
C. Problems 1-6,8 [4, p 380-382].
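A numerical illustration of part A.1, assuming numpy: $AB$ is similar to the hermitian matrix $B^{1/2} A B^{1/2}$, so its eigenvalues should come out (numerically) real.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 5
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    A = (X + X.conj().T) / 2                      # hermitian
    Y = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    B = Y @ Y.conj().T + n * np.eye(n)            # hermitian positive definite

    evals = np.linalg.eigvals(A @ B)
    print(np.allclose(evals.imag, 0, atol=1e-8))  # expected: True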


HOMEWORK ASSIGNMENT 10
Assigned 4-5-13 Due 4-12-13

1. Let $A \in \mathbb{C}^{n\times n}$ and assume that the eigenvalues $\lambda_1, \ldots, \lambda_n$ of $A$ are arranged in the order $|\lambda_1| \ge \ldots \ge |\lambda_n|$. Show that
1. $\sum_{i=1}^k |\lambda_i| \le \sum_{i=1}^k \sigma_i(A)$ for $k = 1, \ldots, n$.  (10.1)
Hint: First use Schur's theorem to assume that $A$ is upper triangular. Then use Theorem 2.70 and the Ky Fan maximal characterization.
2. Show that $|\lambda_i| = \sigma_i(A)$ for $i \in [n]$ if and only if $A$ is normal.
2. Let $b_1, \ldots, b_n \in \mathbb{C}^m$. For a fixed subspace $W \subseteq \mathbb{C}^m$ consider the minimal approximation problem $f(W) := \min_{w_1, \ldots, w_n \in W} \sum_{j=1}^n \|b_j - w_j\|^2$. Now take $\min_{W \in \mathrm{Gr}(k, \mathbb{C}^m)} f(W) = f(W^\star)$. Let $A = [b_1\, b_2\, \ldots\, b_n] \in \mathbb{C}^{m\times n}$. Then $f(W^\star) = \sum_{i=k+1}^{m} \sigma_i(A)^2$. Moreover $W^\star$ is the subspace spanned by the first $k$ left singular vectors of $A$.
3. Problem 3 page 60 [1] (end of Section 2.10.)
4. Let $A, B \in \mathbb{C}^{m\times n}$. Show that $\sum_{i=1}^k \sigma_i(A + B) \le \sum_{i=1}^k \sigma_i(A) + \sum_{i=1}^k \sigma_i(B)$.

5. Let $A \in \mathbb{C}^{m\times n}$ and $B \in \mathbb{C}^{n\times l}$. Show that $\sigma_1(AB) \le \sigma_1(A)\sigma_1(B)$. Give necessary and sufficient conditions for the equality $\sigma_1(AB) = \sigma_1(A)\sigma_1(B)$ when $0 < \sigma_1(A)\sigma_1(B)$.
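A spot check of problem 5, assuming numpy; the largest singular value is the spectral norm, which is submultiplicative:

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 6))
    B = rng.standard_normal((6, 3))

    sigma1 = lambda M: np.linalg.svd(M, compute_uv=False)[0]   # largest singular value
    print(sigma1(A @ B) <= sigma1(A) * sigma1(B) + 1e-12)      # expected: True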


HOMEWORK ASSIGNMENT 11
Assigned 4-12-13 Due 4-19-13

Problems 1, 5, page 64 in [1].
Problems 1c, 2, 4, page 70 in [1]. ($A$ is called nonderogatory if the minimal polynomial of $A$ is equal to the characteristic polynomial of $A$.)
Additional problems:
1. Let $A \in \mathbb{F}^{n\times n}$, $B \in \mathbb{F}^{m\times m}$. Assume that $f, g \in \mathbb{F}[t]$ are the minimal polynomials of $A, B$ respectively. Form $C = \mathrm{diag}(A, B) := \begin{pmatrix} A & 0\\ 0 & B \end{pmatrix}$. Let $h$ be the gcd, the greatest common divisor, of $f$ and $g$, which is assumed to be monic. Show that $\frac{fg}{h}$ is the minimal polynomial of $C$.

2. Find the characteristic and the minimal polynomials of the following matrices
$$\begin{pmatrix} 2 & 2 & 5\\ 3 & 7 & 15\\ 1 & 2 & 4 \end{pmatrix}, \qquad \begin{pmatrix} 2 & 5 & 0 & 0 & 0\\ 0 & 2 & 0 & 0 & 0\\ 0 & 0 & 4 & 2 & 0\\ 0 & 0 & 3 & 5 & 0\\ 0 & 0 & 0 & 0 & 7 \end{pmatrix}.$$
3. Show that two similar matrices have the same minimal polynomial.

4. Show that $\begin{pmatrix} 1 & 1 & 0\\ 0 & 2 & 0\\ 0 & 0 & 1 \end{pmatrix}$ and $\begin{pmatrix} 2 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 2 \end{pmatrix}$ have different characteristic polynomials, but the same minimal polynomial.
5. Show that the square matrices $A$ and $A^\top$ have the same minimal polynomial.
6. Let $A \in \mathbb{F}^{n\times n}$ and assume that $f(t) \in \mathbb{F}[t]$ is an irreducible monic polynomial for which $f(A) = 0$. Show that $f$ is the minimal polynomial of $A$.
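A sketch for checking answers in this assignment: the minimal polynomial of $A$ can be found at the first $k$ for which $A^k$ is a linear combination of $I, A, \ldots, A^{k-1}$. It assumes sympy for exact arithmetic; the helper name is my own.

    import sympy as sp

    def minimal_polynomial_of_matrix(A, t=sp.Symbol('t')):
        n = A.shape[0]
        powers = [sp.eye(n)]
        for k in range(1, n + 1):
            powers.append(powers[-1] * A)
            # stack I, A, ..., A^k as columns and look for a linear dependence
            M = sp.Matrix.hstack(*[P.reshape(n * n, 1) for P in powers])
            null = M.nullspace()
            if null:
                c = null[0] / null[0][k]     # normalize so the polynomial is monic
                return sp.expand(sum(c[i] * t**i for i in range(k + 1)))

    A = sp.Matrix([[2, 1, 0], [0, 2, 0], [0, 0, 2]])
    print(sp.factor(minimal_polynomial_of_matrix(A)))   # expected: (t - 2)**2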


HOMEWORK ASSIGNMENT 12
Assigned 4-16-13 Due 4-26-13

Problems 1-3, 4b, 3.4, page 75 in [1]. (The Weyr characteristic is defined in Definition 3.28 on p73 of [1].)
Problem 1. Suppose that the characteristic and the minimal polynomial of a linear operator $T$ are as below. Find all possible Jordan canonical forms of $T$.
1. $f(t) = (t-2)^4(t-5)^3$, $g(t) = (t-2)^4(t-5)^3$,
2. $f(t) = (t-2)^4(t-5)^3$, $g(t) = (t-2)^2(t-5)^3$,
3. $f(t) = (t-2)^4(t-5)^3$, $g(t) = (t-2)(t-5)$.
Problem 2. Find all possible Jordan forms for all $8\times 8$ matrices having $x^2(x-1)^3$ as a minimal polynomial. (A sketch for checking such answers appears after this assignment.)
Problem 3.
a. Show that if the characteristic polynomial of $A \in \mathbb{F}^{n\times n}$ splits to linear factors in $\mathbb{F}$, i.e. $\det(zI - A) = \prod_{j=1}^n (z - \lambda_j)$, then $A$ is similar to $A^\top$.
b. Try to prove that for any $A \in \mathbb{F}^{n\times n}$, $A$ is similar to $A^\top$. (Hint: Let $\mathbb{F}_1$ be a finite extension of $\mathbb{F}$ where $\det(zI - A)$ splits to linear factors. Then by part a, show that $A$ and $A^\top$ are similar over $\mathbb{F}_1$. So there exists a matrix $X \in \mathbb{F}_1^{n\times n}$ such that $AX - XA^\top = 0$ and $\det X \ne 0$. Deduce now that one can choose $X \in \mathbb{F}^{n\times n}$ such that $\det X \ne 0$.)
Problem 4. Recall that a matrix $A \in \mathbb{F}^{n\times n}$ is called diagonable if $A$ is similar to a diagonal matrix over $\mathbb{F}$. A linear operator $T: V \to V$ is called diagonable if there is a basis in $V$ such that $T$ is represented by a diagonal matrix. Show
1. $A$ is diagonable over $\mathbb{F}$ if and only if $\det(zI - A)$ splits to linear factors over $\mathbb{F}$ and the minimal polynomial of $A$ has simple roots.
2. $A$ is diagonable if and only if the roots of $\det(zI - A)$ are in $\mathbb{F}$ and, whenever $(T - \lambda I)^m v = 0$ for some positive integer $m$, then $(T - \lambda I)v = 0$.
3. Suppose that the linear operator $T$ is a projection, i.e. $T^2 = T$. Then $T$ is diagonable.
4. Assume that $T, Q \in \mathbb{F}^{n\times n}$ are projections. Then $T$ and $Q$ are similar if and only if $\mathrm{rank}\, T = \mathrm{rank}\, Q$.
5. Let $n > 1$ be an integer, and consider the matrices $A = \mathbf{1}\mathbf{1}^\top \in \mathbb{F}^{n\times n}$, where $\mathbf{1} = (1, \ldots, 1)^\top \in \mathbb{F}^n$, and $B = \mathrm{diag}(n, 0, \ldots, 0) \in \mathbb{F}^{n\times n}$. Then $A$ and $B$ are similar if and only if the characteristic of $\mathbb{F}$ does not divide $n$.
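For checking answers to Problems 1 and 2, sympy can compute a Jordan canonical form directly (assuming sympy is available):

    import sympy as sp

    A = sp.Matrix([[2, 1, 0],
                   [0, 2, 0],
                   [0, 0, 1]])
    P, J = A.jordan_form()    # A = P * J * P**-1
    print(J)                  # one Jordan block of order 2 for eigenvalue 2, one of order 1 for 1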


HOMEWORK ASSIGNMENT 13
Assigned 4-23-13 Due 5-3-13

A. Problem 1 on page 87 in [1]. (The system $x_l = A_l x_{l-1}$ is homogeneous.)


B. For the following matrices find the components of $A$ as defined in Theorem 4.1 on page 81 in [1], and find $A^{100}$ and $e^{At}$ using the components of $A$.

1. $\begin{pmatrix} 1 & 1\\ 1 & 3 \end{pmatrix}$,
2. $\begin{pmatrix} 0 & 2 & 1\\ 0 & 1 & 1\\ 0 & 2 & 2 \end{pmatrix}$,
3. $\begin{pmatrix} 2 & 1 & 1 & 0\\ 0 & 5 & 6 & 1\\ 0 & 3 & 4 & 1\\ 0 & 0 & 0 & 1 \end{pmatrix}$

C. $A \in \mathbb{R}^{n\times n}$ is called a stochastic matrix if all entries of $A$ are nonnegative and the sum of each row is 1. (I.e. each row of $A$ is a probability vector.) Show (a numerical check appears after this list):
1. For each positive integer $k$, $A^k$ is a stochastic matrix.
2. $A$ is power bounded. (See Definition 4.5 in [1].)
3. 1 is an eigenvalue of $A$.
4. Each Jordan block corresponding to the eigenvalue 1 is of order 1.
5. Each eigenvalue $\lambda$ of $A$ satisfies $|\lambda| \le 1$.
6. $A$ is power convergent if and only if each eigenvalue $\lambda$ of $A$ different from 1 satisfies $|\lambda| < 1$. (See Definition 4.5 in [1].)
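A quick numerical check of items 3 and 5 on a random row stochastic matrix, assuming numpy:

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.random((5, 5))
    A /= A.sum(axis=1, keepdims=True)             # normalize rows to sum to 1

    evals = np.linalg.eigvals(A)
    print(np.isclose(evals, 1.0).any())           # item 3: 1 is an eigenvalue (A @ ones = ones)
    print((np.abs(evals) <= 1 + 1e-9).all())      # item 5: all eigenvalues in the closed unit disk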

References
[1] S. Friedland, Linear Algebra II, Lecture Notes, Spring 2013, http://www2.math.uic.edu/friedlan/lectnotesM425S13.pdf

[2] G.H. Golub and C.F. Van Loan, Matrix Computations, Johns Hopkins Univ. Press, 3rd Ed., Baltimore, 1996.
[3] R.A. Horn and C.R. Johnson, Matrix Analysis, Cambridge University Press, 2nd Ed., 2013.
[4] S.J. Leon, Linear Algebra with Applications, Prentice Hall, 6th Edition,
2002.
[5] S. Lipschutz and M. Lipson, Linear Algebra, Fourth Edition, Schaum's Outlines, McGraw-Hill, 2009.
