LACM March13


Geometrical interpretation of addition and multiplication. Given any two vectors ~u, ~v in R2 or R3, locate points P and Q in a two- or three-dimensional coordinate system such that OP = ~u and OQ = ~v (as directed line segments from the origin O). From P a directed line segment PQ' is drawn along the direction of OQ such that PQ' = OQ. Then OQ' = ~u + ~v represents the sum of ~u and ~v. When ~u and ~v are not parallel to one straight line, OPQ'Q forms a parallelogram.

For a nonzero number α: when α > 0, the directed line segment α·OP is in the same direction as OP, and its length is α times the length of OP; when α < 0, the directed line segment α·OP is in the opposite direction to OP, and its length is |α| times the length of OP.
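The parallelogram rule and scalar multiplication can be checked numerically; a minimal NumPy sketch (the sample vectors ~u = (3, 1)T and ~v = (1, 2)T are illustrative assumptions, not from the notes):

```python
import numpy as np

# Sample vectors in R^2 (illustrative values, not from the notes).
u = np.array([3.0, 1.0])
v = np.array([1.0, 2.0])

# Parallelogram rule: from P (the tip of u), move along v; the tip of the
# diagonal OQ' is u + v.
s = u + v
print(s)  # [4. 3.]

# Scalar multiplication: alpha < 0 reverses direction and scales the
# length by |alpha|.
alpha = -2.0
w = alpha * u
print(np.isclose(np.linalg.norm(w), abs(alpha) * np.linalg.norm(u)))  # True
```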

Remark: 1) If ~x, ~y ∈ R3 are linearly independent, then Span(~x, ~y) represents a plane in R3, which is determined by the points (0, 0, 0), (x1, x2, x3) and (y1, y2, y3).

2) The subspace Span(~x) represents a line through the origin and the point (x1, x2, x3).

3) If ~x, ~y and ~z are linearly independent in R3, then Span(~x, ~y, ~z) forms the whole space R3.

4) A line or a plane that does not pass through the origin is not a subspace.
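These remarks can be verified numerically: vectors are linearly independent exactly when the matrix with these vectors as columns has full rank. A small sketch (the sample vectors are assumed for illustration):

```python
import numpy as np

# Sample vectors in R^3 (assumed for illustration).
x = np.array([1.0, 0.0, 0.0])
y = np.array([1.0, 1.0, 0.0])
z = np.array([1.0, 1.0, 1.0])
A = np.column_stack([x, y, z])

# Rank 3: x, y, z are linearly independent, so Span(x, y, z) is all of R^3.
print(np.linalg.matrix_rank(A))         # 3
# Rank 2: x, y span a plane through the origin.
print(np.linalg.matrix_rank(A[:, :2]))  # 2
```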

Def (basis). The vectors ~v1, ~v2, · · · , ~vn form a
basis for a vector space V if and only if
(i) ~v1, ~v2, · · · , ~vn are linearly independent;
(ii) ~v1, ~v2, · · · , ~vn span V , i.e. every element ~v ∈
V can be written as a linear combination of
~v1, ~v2, · · · , ~vn: ∃ α1, · · · , αn such that

~v = α1~v1 + · · · + αn~vn.

Example. ~e1 = (1, 0, 0)T , ~e2 = (0, 1, 0)T , ~e3 = (0, 0, 1)T form a basis of R3, called the standard basis. R3 has many bases.

Theorem. If {~v1, · · · , ~vn} and {~u1, · · · , ~um} are both bases for a vector space V , then n = m, which is called the dimension of V .

Def. (Dimension). Let V be a vector space.
If V has a basis consisting of n vectors, we
say that V has dimension n. The subspace
{~0} of V is said to have dimension 0. V is
said to be finite-dimensional if there is a finite
set of vectors that spans V ; otherwise, V is
infinite-dimensional.

Def. (coordinate, coordinate vector). Let E = [~v1, · · · , ~vn] be an ordered basis of a vector space V . For any vector ~v ∈ V , ~v can be written in the form ~v = c1~v1 + · · · + cn~vn, where the ci are scalars. The vector ~c = (c1, · · · , cn)T in Rn is called the coordinate vector of ~v with respect to the ordered basis E, denoted by [~v ]E . The ci's are called the coordinates of ~v relative to E.
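Finding [~v ]E amounts to solving a linear system: if B has the basis vectors as columns, then B~c = ~v. A sketch with an assumed basis of R3 (the basis vectors here are sample values, not from the notes):

```python
import numpy as np

# An ordered basis E = [v1, v2, v3] of R^3 (sample vectors, an assumption).
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 0.0, 1.0])
B = np.column_stack([v1, v2, v3])

# Coordinates of v relative to E: solve B c = v for c = [v]_E.
v = np.array([2.0, 3.0, 3.0])
c = np.linalg.solve(B, v)
print(c)                      # [1. 2. 1.]

# Check: c1*v1 + c2*v2 + c3*v3 reproduces v.
print(np.allclose(B @ c, v))  # True
```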

3.3 Linear transformations in R3

Def. A mapping L from a vector space V into


a vector space W is said to be a linear trans-
formation if for all ~v1, ~v2 ∈ V and for all scalars
α and β,

L(α~v1 + β~v2) = αL(~v1) + βL(~v2),


or equivalently, if for all ~v1, ~v2 ∈ V and for all
scalars α,

L(~v1 + ~v2) = L(~v1) + L(~v2), L(α~v1) = αL(~v1).


This is denoted by L : V → W . When V = W ,
L is referred to as a linear operator on V .

Ordinary linear operators: a stretching or shrinking by a factor, projections, reflections and rotations.

Example 1: L(~x) = 3~x; L(~x) = (1/2)~x.
Example 2: L(~x) = x1~e1, ~x ∈ R2, a projection onto the x1 axis.
Example 3: in R2, L(~x) = (x1, −x2)T , a reflection about the x1 axis.
Example 4: L : R2 → R2, L(~x) = (x1 cos θ − x2 sin θ, x1 sin θ + x2 cos θ)T , a rotation by an angle θ in the counterclockwise direction.
L(~x) = (x2, −x1)T is a rotation by π/2 in the clockwise direction.
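Example 4 can be written as multiplication by a rotation matrix; a minimal sketch:

```python
import numpy as np

# Rotation by theta counterclockwise in R^2 (Example 4 as a matrix).
def rotate(x, theta):
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ x

# e1 rotated by pi/2 counterclockwise lands on e2.
y = rotate(np.array([1.0, 0.0]), np.pi / 2)
print(np.allclose(y, [0.0, 1.0]))  # True

# (x2, -x1)^T is the rotation by pi/2 clockwise, i.e. theta = -pi/2.
x = np.array([1.0, 2.0])
z = rotate(x, -np.pi / 2)
print(np.allclose(z, [x[1], -x[0]]))  # True
```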

Def. A linear transformation L : V → W is called one-to-one if and only if L(~v1) = L(~v2) implies ~v1 = ~v2.
A linear transformation is said to map V onto W if and only if L(V ) = W .

Property: 1) Let L1 : U → V and L2 : V → W be linear transformations, and let L = L2 ◦ L1 be defined by L(~u) = L2(L1(~u)), ∀~u ∈ U . Then L : U → W is a linear transformation.

2) Let L : V → V be a linear operator, and define Ln (n ≥ 1) recursively by L1 = L, Lk+1(~v ) = L(Lk (~v )). Then Ln is a linear operator on V .

Composition and powers of a linear operator.

Def. Let L : V → W . The kernel of L, denoted by Ker(L), is defined by Ker(L) = {~v ∈ V | L(~v ) = ~0W }.

Def. Let L : V → W and let S ⊂ V be a subset. The image of S, denoted by L(S), is defined by L(S) = {w~ ∈ W | w~ = L(~v ) for some ~v ∈ S}. L(V ) is called the range of L.

Theorem. Let L : V → W be a linear transformation. Then Ker(L) is a subspace of V , and if S is a subspace of V , then L(S) is a subspace of W .
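For a matrix map L(~x) = A~x, Ker(L) is the null space of A and can be computed from the SVD. A sketch using a projection of R3 onto the x1-x2 plane (the matrix is an assumption for illustration):

```python
import numpy as np

# L(x) = A x, projection of R^3 onto the x1-x2 plane (sample matrix).
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

# For a square A, the right singular vectors with (numerically) zero
# singular value span Ker(L).
_, s, Vt = np.linalg.svd(A)
kernel_basis = Vt[s < 1e-10]               # here: spanned by +/- e3
print(np.allclose(A @ kernel_basis.T, 0))  # True: kernel vectors map to 0
print(kernel_basis.shape)                  # (1, 3): the kernel is a line
```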

3.4 Inner product space and orthogonality

Def. Let ~x and ~y be vectors in R3 or R2. The inner product of ~x and ~y is defined as

(~x, ~y ) = ~xT ~y = x1y1 + x2y2 + x3y3 in R3,
(~x, ~y ) = ~xT ~y = x1y1 + x2y2 in R2.

With this product, R3 or R2 forms an inner product space.

The inner product has the properties:

I. (~x, ~x) ≥ 0, with equality if and only if ~x = ~0;
II. (~x, ~y ) = (~y , ~x) for all ~x and ~y in V ;
III. (α~x + β~y , ~z ) = α(~x, ~z ) + β(~y , ~z ) for all ~x, ~y and ~z in V and all scalars α, β.

Def. In terms of the inner product, the Euclidean length (norm) of a vector ~x in R3 or R2 is defined as k~xk = √(x1^2 + x2^2 + x3^2) or k~xk = √(x1^2 + x2^2).
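The inner product and norm correspond directly to NumPy operations; a brief sketch (the sample vectors are assumptions):

```python
import numpy as np

# Sample vectors in R^3 (assumed for illustration).
x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, 0.0, 1.0])

# Inner product (x, y) = x^T y.
ip = x @ y
print(ip)                  # 4.0

# Norm ||x|| = sqrt(x1^2 + x2^2 + x3^2) = sqrt((x, x)).
print(np.linalg.norm(x))   # 3.0
print(np.isclose(np.linalg.norm(x), np.sqrt(x @ x)))  # True
```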

Def. The angle between ~x and ~y , ∠(~x, ~y ), is defined as the angle between the corresponding line segments.

Def. The distance between ~x and ~y is defined to be the number k~x − ~y k.

Theorem. If ~x and ~y are two nonzero vectors in either R2 or R3, and θ = ∠(~x, ~y ) is the angle between them, then (~x, ~y ) = k~xk · k~y k · cos θ.
If the unit vectors ~u = ~x/k~xk and ~v = ~y /k~y k are defined as the direction vectors of ~x and ~y respectively, then cos θ = (~u, ~v ).

The proof can be seen on Page 212.
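The identity cos θ = (~u, ~v ) can be checked directly; a sketch with assumed sample vectors:

```python
import numpy as np

# cos(theta) between x and y equals the inner product of their unit vectors.
x = np.array([1.0, 0.0, 0.0])   # sample vectors (assumed)
y = np.array([1.0, 1.0, 0.0])

u = x / np.linalg.norm(x)
v = y / np.linalg.norm(y)
cos_theta = u @ v
theta = np.arccos(cos_theta)
print(np.isclose(theta, np.pi / 4))  # True: 45 degrees between e1 and (1,1,0)
```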

Cauchy-Schwarz inequality. If ~x and ~y are vectors in R2 or R3, then |(~x, ~y )| ≤ k~xk k~y k.
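A numerical spot-check of the inequality on random vectors (a sanity check under assumed data, not a proof):

```python
import numpy as np

# Check |(x, y)| <= ||x|| ||y|| on 1000 random pairs in R^3.
rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    assert abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y) + 1e-12
print("ok")
```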

Def. (orthogonality). The vectors ~x and ~y in R2 or R3 are said to be orthogonal if (~x, ~y ) = 0.

For example, (a) ~0 is orthogonal to every vector in R2; (b) (2, −3, 1)T and (1, 1, 1)T are orthogonal in R3.

Def. (scalar and vector projections). For any nonzero vectors ~x and ~y in R2 or R3, the scalar projection of ~x onto ~y is

α = (~x, ~y )/k~y k = ~xT ~y /k~y k.

The vector projection of ~x onto ~y is

p~ = α~u = α (~y /k~y k) = ((~x, ~y )/k~y k^2) ~y .

By the definitions of projections we can solve problems in geometry. See Pages 214-215.
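The two projection formulas translate directly into code; a sketch (the vectors and the helper name `projections` are assumed sample choices):

```python
import numpy as np

# Scalar and vector projection of x onto a nonzero y.
def projections(x, y):
    ny = np.linalg.norm(y)
    alpha = (x @ y) / ny          # scalar projection
    p = (x @ y) / ny**2 * y       # vector projection
    return alpha, p

x = np.array([3.0, 4.0])
y = np.array([1.0, 0.0])
alpha, p = projections(x, y)
print(alpha)                      # 3.0  (component of x along y)
print(p)                          # [3. 0.]

# x - p is orthogonal to y, which is what makes projections useful
# in geometry problems.
print(np.isclose((x - p) @ y, 0.0))  # True
```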

Def. (orthogonal subspaces). Two subspaces X and Y of R3 are said to be orthogonal if (~x, ~y ) = ~xT ~y = 0 for every ~x ∈ X and every ~y ∈ Y ; this is denoted by X ⊥ Y .

Example 1. In R3, if X = Span(~e1) and Y = Span(~e2), then X ⊥ Y .

Example 2. In R3, if X = Span(~e1, ~e2) and Y = Span(~e3), then X ⊥ Y .

Def. Let Y be a subspace of R3. The set of all vectors in R3 that are orthogonal to every vector in Y will be denoted Y ⊥:

Y ⊥ = {~x ∈ R3 | (~x, ~y ) = ~xT ~y = 0, ∀~y ∈ Y }.

The set Y ⊥ is called the orthogonal complement of Y .

Example: In R3, if X = Span(~e1) and Y = Span(~e2), then X ⊥ = Span(~e2, ~e3) and Y ⊥ = Span(~e1, ~e3).
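Y ⊥ can be computed as the null space of the matrix whose rows span Y , via the SVD; a sketch (the helper name `orth_complement` is assumed here, not from the notes):

```python
import numpy as np

# Orthogonal complement of Y = Span(y1, ..., yk) in R^3: the null space
# of the matrix whose rows are the y_i, found from the SVD.
def orth_complement(Y_rows, tol=1e-10):
    A = np.atleast_2d(Y_rows)
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:]              # rows form a basis of Y-perp

Y = np.array([[0.0, 1.0, 0.0]])   # Y = Span(e2)
Yperp = orth_complement(Y)
print(Yperp.shape)                  # (2, 3): Dim Y + Dim Y-perp = 3
print(np.allclose(Yperp @ Y.T, 0))  # True: every row is orthogonal to e2
```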

Remarks: 1) If X and Y are orthogonal subspaces of R3, then X ∩ Y = {~0}.
2) If Y is a subspace of R3, then Y ⊥ is also a subspace of R3, and Dim Y + Dim Y ⊥ = 3.

Def. If U and V are subspaces of a vector space W and each w~ ∈ W can be written uniquely as a sum ~u + ~v , where ~u ∈ U and ~v ∈ V , then we say that W is the direct sum of U and V , and we write W = U ⊕ V .
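The unique decomposition w~ = ~u + ~v can be computed by solving for coefficients in a combined basis; a sketch with assumed subspaces U = Span(~e1, ~e2) and V = Span((1, 1, 1)T):

```python
import numpy as np

# R^3 as a direct sum U (+) V with U = Span(e1, e2) and V = Span((1,1,1)^T)
# (sample subspaces, an assumption).
U_basis = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]]).T     # columns span U
V_basis = np.array([[1.0, 1.0, 1.0]]).T     # column spans V
B = np.hstack([U_basis, V_basis])           # invertible since U + V = R^3

# The unique decomposition w = u + v comes from solving B c = w.
w = np.array([4.0, 5.0, 2.0])
c = np.linalg.solve(B, w)
u = U_basis @ c[:2]
v = V_basis @ c[2:]
print(u, v)                   # [2. 3. 0.] [2. 2. 2.]
print(np.allclose(u + v, w))  # True
```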
