AI-ML Class-16
Let u be a unit vector and let v be any arbitrary vector. The orthogonal projection of v along the vector u is defined as Pu(v) = ⟨u, v⟩u. If u is any non-zero vector,
Pu(v) = ⟨u/∥u∥, v⟩ (u/∥u∥) = (⟨u, v⟩/⟨u, u⟩) u.
Example. With y = (7, 6) and Pu(y) = (8, 4), we can write
(7, 6) = (8, 4) + (−1, 2),
where (−1, 2) = y − Pu(y). What is the distance from the vector y to span{u}?
Ans: ∥y − Pu(y)∥ = √((−1)² + 2²) = √5.
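A quick numeric check of this computation, as a sketch: the lecture gives y = (7, 6) but not u, so the choice u = (2, 1) below is an assumption (any vector along the projection (8, 4) works).

```python
import numpy as np

# Orthogonal projection of v onto a non-zero vector u:
# P_u(v) = (<u, v> / <u, u>) u
def proj(u, v):
    return (np.dot(u, v) / np.dot(u, u)) * u

y = np.array([7.0, 6.0])
u = np.array([2.0, 1.0])  # assumed direction, parallel to P_u(y) = (8, 4)

p = proj(u, y)                # the projection (8, 4)
dist = np.linalg.norm(y - p)  # distance from y to span{u}
print(p, dist)                # dist = sqrt(5) ≈ 2.236
```

This reproduces the decomposition y = Pu(y) + (y − Pu(y)) = (8, 4) + (−1, 2) and the distance √5.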
Claim. Every orthogonal set of non-zero vectors is linearly independent.
Proof.
Let {v1, v2, . . . , vk} be an orthogonal set of non-zero vectors. Assume that
c1 v1 + c2 v2 + . . . + ck vk = 0, ci ∈ R.
Then ⟨(c1 v1 + c2 v2 + . . . + ck vk), vj⟩ = 0 for 1 ≤ j ≤ k, and thus
c1 ⟨v1, vj⟩ + c2 ⟨v2, vj⟩ + . . . + ck ⟨vk, vj⟩ = 0.
Since ⟨vi, vj⟩ = 0 for i ≠ j, the only term not vanishing is cj ⟨vj, vj⟩. Since vj ≠ 0, we have ⟨vj, vj⟩ ≠ 0. This implies cj = 0. As j is arbitrary, we see that cj = 0 for all j.
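The key step of the proof — pairing the combination with each vj kills every cross term and isolates cj — can be checked numerically. A sketch with an assumed orthogonal set in R³:

```python
import numpy as np

# An orthogonal set of non-zero vectors in R^3 (assumed example data)
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])
vs = [v1, v2, v3]

# Form w = c1 v1 + c2 v2 + c3 v3 with known coefficients
c = [2.0, -3.0, 0.5]
w = sum(ci * vi for ci, vi in zip(c, vs))

# Pairing w with v_j leaves only c_j <v_j, v_j>, since <v_i, v_j> = 0
# for i != j; hence c_j = <w, v_j> / <v_j, v_j>.
recovered = [np.dot(w, vj) / np.dot(vj, vj) for vj in vs]
print(recovered)  # [2.0, -3.0, 0.5]
```

In particular, w = 0 would force every cj = 0, which is exactly linear independence.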
⟨u1, u2⟩ = ⟨u1, v2⟩ − ⟨u1, v2⟩ = 0.
Further, u2 ≠ 0. Suppose u2 = 0; then we get v2 = Pu1(v2) = (⟨v1, v2⟩/⟨v1, v1⟩) v1, which would make {v1, v2} linearly dependent, a contradiction.
Theorem
Let V be any finite-dimensional inner product space. Then V has an orthonormal basis.
Proof.
It is enough to produce an orthogonal basis of V. Let {v1, v2, . . . , vn} be a basis of V. Let u1 = v1. In the case of R² we have already proved that u2 = v2 − (⟨v2, u1⟩/⟨u1, u1⟩) u1 satisfies u2 ⊥ u1; thus ⟨u1, u2⟩ = 0. Also u2 ≠ 0, for if u2 = 0 then v2 = (⟨v2, u1⟩/⟨u1, u1⟩) v1, which would imply {v1, v2} is linearly dependent, a contradiction. Let
u3 = v3 − (⟨v3, u1⟩/⟨u1, u1⟩) u1 − (⟨v3, u2⟩/⟨u2, u2⟩) u2.
Then we see that ⟨u3, u1⟩ = 0 = ⟨u3, u2⟩. Also u3 ≠ 0, for otherwise v3 would be a linear combination of u1 and u2, hence of v1 and v2, contradicting the linear independence of {v1, v2, v3}.
Dr. Sabitha D’Souza September 22, 2022 9 / 13
Proceeding as above, by induction, we define
uk = vk − (⟨vk, u1⟩/⟨u1, u1⟩) u1 − . . . − (⟨vk, uk−1⟩/⟨uk−1, uk−1⟩) uk−1.
The resulting {u1, u2, . . . , un} is an orthogonal basis of V, and normalizing each uk gives an orthonormal basis.
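This inductive construction is the Gram–Schmidt process, and it translates directly into code. A minimal sketch, run on an assumed basis of R³:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a linearly independent list via
    u_k = v_k - sum_{i<k} (<v_k, u_i> / <u_i, u_i>) u_i."""
    us = []
    for v in vectors:
        # subtract the projection of v onto each earlier u_i
        u = v - sum((np.dot(v, ui) / np.dot(ui, ui)) * ui for ui in us)
        us.append(u)
    return us

# Assumed example: a (non-orthogonal) basis of R^3
basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
us = gram_schmidt(basis)

# All pairwise inner products vanish, so {u1, u2, u3} is orthogonal;
# normalizing each u_k then yields an orthonormal basis.
ons = [u / np.linalg.norm(u) for u in us]
```

Each uk is non-zero precisely because the input list is linearly independent, mirroring the contradiction argument in the proof above.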