Lecture Slides #5, Chapter 9: Random Processes


Lecture Slides #5

Chapter 9

Random Processes
Overview
1. Definition of a Random Process
2. Specifying a Random Process
3. Discrete-Time Processes: Sum Process
4. Gaussian Random Processes
Why Random Processes?
The outcome of a random experiment is a function of time or space
Time: Audio signals: voltage measured continuously by a
microphone, usually in [−5, +5] volts.
Space: Image: collection of RGB triples that correspond to pixels.
Time and Space: Digital communication: message decoded by a
sequence of packets from many stations.
The observations are viewed as numerical quantities that are
generated in time or space
We need a family of random variables indexed by the time or space
variable.
Time: Audio signal at an instant
Space: RGB values at a pixel
Time and Space: Packet received at a certain time from a specific station
9.1 Definition of a Random Process
Suppose that to every outcome ζ ∈ S, we assign a function of time:

X(t, ζ), t ∈ I (index set)

The function X(t, ζ) of t, for fixed ζ, is called a realization, sample path, or sample function of the random process.
The outcome of the random experiment produces an entire function of time, as shown in Figure 9.1.

RANDOM PROCESS refers to the family (or ensemble) indexed by the parameter t, {X(t, ζ), t ∈ I}.
STOCHASTIC PROCESS is the statistics term for a random process.
A discrete-time stochastic process has a countable index set I, so we often write Xn to denote the random process with time index n.
A continuous-time stochastic process has a continuous index set I (e.g., the real line).
Random Process
Ex 9.2 Random Sinusoids
Random Amplitudes (Scaled Sinusoids) Let ζ be selected at random from the
interval [−1, 1]. Define the continuous-time random process X (t, ζ )
by
X(t, ζ) = ζ cos(2πt), −∞ < t < ∞
As shown in Figure 9.2(a), the realizations of the random process X(t, ζ) are sinusoids with amplitude ζ.
Random Phases (Shifted Sinusoids) Let ζ be uniform in (−π, π) and let

Y(t, ζ) = cos(2πt + ζ), −∞ < t < ∞

As shown in Figure 9.2(b), the realizations of the random process Y(t, ζ) are phase-shifted versions of cos(2πt).
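A minimal simulation sketch (an illustration, not from the slides; assumes numpy) makes the two ensembles concrete:

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(-1.0, 1.0, 9)   # a few time points

# Random amplitude: zeta uniform on [-1, 1], X(t) = zeta * cos(2*pi*t)
zeta_a = rng.uniform(-1.0, 1.0)
x = zeta_a * np.cos(2 * np.pi * t)   # one sample path of X(t, zeta)

# Random phase: zeta uniform on (-pi, pi), Y(t) = cos(2*pi*t + zeta)
zeta_p = rng.uniform(-np.pi, np.pi)
y = np.cos(2 * np.pi * t + zeta_p)   # one sample path of Y(t, zeta)

print(x)   # a scaled sinusoid
print(y)   # a shifted sinusoid

Each run of the experiment (each draw of ζ) yields a different deterministic function of t, which is the ensemble view shown in Figure 9.2.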
9.2 Specifying a Random Process
There are many questions about random processes that cannot be answered from a single time instant.
We may be interested in the temperature at a given locale at two different times, which involves the joint probability

P[x1 < X(t1) ≤ x1′, x2 < X(t2) ≤ x2′]

The speech compression system in a cellular phone predicts the next sample based on the previous k samples, which involves the conditional probability

P[a < X(tk+1) ≤ b | X(t1) = x1, . . . , X(tk) = xk]

In general, a random process is described by the joint probabilities of vectors of samples taken at sequences of time instants.
9.2.1 Joint Distributions of Time Samples
For compact notation, let X 1 , . . . , X k be the RVs obtained by
sampling the random process X (t, ζ ) at the times t1 , . . . , tk :
{ X 1 , . . . , X k } = { X (t1 , ζ ), . . . , X (tk , ζ )}

The joint behaviour of the random process at the k time instants is specified by the joint cumulative distribution of the vector random variable X = (X1, . . . , Xk).
In other words, for any k and any choice of t1, . . . , tk, a stochastic process is specified by the kth-order joint cdf:

FX(x1, . . . , xk) = P[X(t1) ≤ x1, . . . , X(tk) ≤ xk]

If the stochastic process is continuous-valued or discrete-valued, use a joint pdf or a joint pmf, respectively, instead of the cdf.
Examples
9.5 Bernoulli RVs Let Xn be a sequence of iid Bernoulli RVs with p = 1/2.
The joint pmf for any k time samples is then

pX(x1, . . . , xk) = P[X1 = x1, . . . , Xk = xk] = (1/2)^k, xi ∈ {0, 1}

9.6 iid Gaussian RVs Let Xn be a sequence of iid Gaussian RVs with zero mean and variance σ². The joint pdf for any k time samples is then

fX(x1, . . . , xk) = (2πσ²)^(−k/2) exp( −(x1² + · · · + xk²) / (2σ²) )
Ex 9.8 Filtered Noisy Signal
Let Xj = µ + Nj be a sequence of iid observations of a signal voltage µ corrupted by zero-mean Gaussian noise Nj with variance σ². Consider the signal that results from averaging the sequence of observations:

Mn = (X1 + X2 + · · · + Xn) / n
It can be shown that Mn is a Gaussian RV with mean µ and variance σ²/n. As n increases, the sample mean Mn approaches µ.
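A quick simulation sketch (an illustration, not from the slides; assumes numpy) shows the averaging effect:

import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 2.0, 1.0, 1000

# iid observations X_j = mu + N_j with N_j ~ N(0, sigma^2)
x = mu + sigma * rng.standard_normal(n)

# running sample mean M_n = (X_1 + ... + X_n) / n
m_running = np.cumsum(x) / np.arange(1, n + 1)
print(m_running[9], m_running[99], m_running[999])   # M_10, M_100, M_1000 drift toward mu

Across repeated runs, the spread of Mn shrinks as σ²/n, as stated above.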
Moments of a Random Process

The moments of the time samples of a random process can be used to partially specify the random process.
All moments are functions of time.

Mean, variance: first-order statistics (use the sample at a single time instant).
Autocovariance, autocorrelation: second-order statistics (use the samples at two different time instants).
9.2.2 Mean, Autocorrelation

The mean mX(t) and the variance of X(t) are defined by

mX(t) = E[X(t)]
VAR[X(t)] = E[(X(t) − mX(t))²]

Note that these are deterministic functions of the time variable t. The variance indicates the spread of X(t) at different time instants.
The autocorrelation RX(t1, t2) is defined by the joint moment:

RX(t1, t2) = E[X(t1)X(t2)]

Note that RX(t, t) = E[X²(t)].


9.2.2 Autocovariance
The autocovariance CX(t1, t2) of X(t) is defined as the covariance of X(t1) and X(t2):

CX(t1, t2) = E[{X(t1) − mX(t1)}{X(t2) − mX(t2)}]
= RX(t1, t2) − mX(t1)mX(t2)

Note that
CX(t, t) = E[{X(t) − mX(t)}²] = VAR[X(t)]
The correlation coefficient of X(t) is defined as the correlation coefficient of the samples at two different times:

ρX(t1, t2) = CX(t1, t2) / √(CX(t1, t1) CX(t2, t2))
Discrete Indexing
For discrete-time random processes, we can take t = nT for a fixed sampling interval T > 0. Let Xn = X(nT); then

mX (n) = E[Xn ]
VAR[X(n)] = E[(Xn − mX (n))2 ]
RX (n1 , n2 ) = E[X(n1 )X(n2 )]
CX (n1 , n2 ) = E[{X(n1 ) − mX (n1 )}{X(n2 ) − mX (n2 )}]
= RX (n1 , n2 ) − mX (n1 )mX (n2 )
Ex 9.9 Sinusoid with Random Amplitude
Let X (t) = A cos 2πt, where A is a RV.
Find mean, autocorrelation, and autocovariance of X (t) in terms of the
statistical moments of A.
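A worked solution (not shown on the slide): since A is the only source of randomness and cos 2πt is deterministic,

mX(t) = E[A cos 2πt] = E[A] cos 2πt
RX(t1, t2) = E[A cos 2πt1 · A cos 2πt2] = E[A²] cos 2πt1 cos 2πt2
CX(t1, t2) = RX(t1, t2) − mX(t1)mX(t2) = VAR[A] cos 2πt1 cos 2πt2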
Ex 9.10 Sinusoid with Random Phase
Let X (t) = cos(ωt + Θ), where Θ is uniformly distributed in (−π, π).
Find mean, autocorrelation, and autocovariance of X (t) in terms of ω.
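A worked solution (not shown on the slide): averaging over Θ uniform in (−π, π),

mX(t) = (1/2π) ∫ cos(ωt + θ) dθ = 0 (integral over a full period)
RX(t1, t2) = E[cos(ωt1 + Θ) cos(ωt2 + Θ)]
= (1/2) E[cos(ω(t1 − t2)) + cos(ω(t1 + t2) + 2Θ)] = (1/2) cos(ω(t1 − t2))
CX(t1, t2) = RX(t1, t2), since the mean is zero.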
9.2.3 Multiple Random Processes
Joint CDF A joint cdf (or pdf) is useful to describe the joint behaviour of two
different random processes at different times, X (t1 ) and Y (t2 ):

FX(t1),Y(t2)(x, y) = P[X(t1) ≤ x, Y(t2) ≤ y]

Independent Random Processes Suppose that two random vectors X = (X(t1), . . . , X(tk)) and Y = (Y(t1′), . . . , Y(tj′)) are constructed from two different random processes X(t) and Y(t).
X(t) and Y(t) are said to be independent random processes if

FX,Y(x1, . . . , xk; y1, . . . , yj) = FX(x1, . . . , xk) FY(y1, . . . , yj)

is satisfied for all choices of k, j, (t1, . . . , tk), and (t1′, . . . , tj′).
9.2.3 Multiple Random Processes
The cross-correlation RX,Y(t1, t2) is defined by

RX,Y(t1, t2) = E[X(t1)Y(t2)]

The processes X(t) and Y(t) are orthogonal random processes if

RX,Y(t1, t2) = 0 for all (t1, t2)

The cross-covariance CX,Y(t1, t2) is defined by

CX,Y(t1, t2) = E[{X(t1) − mX(t1)}{Y(t2) − mY(t2)}]
= RX,Y(t1, t2) − mX(t1)mY(t2)

The processes X(t) and Y(t) are uncorrelated random processes if

CX,Y(t1, t2) = 0 for all (t1, t2)
Ex 9.11 Cross-covariance of sin and cos
Let X (t) = cos(ωt + Θ) and Y (t) = sin(ωt + Θ), where Θ is uniformly
distributed in [−π, π].
Find the cross-covariance of X (t) and Y (t).
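A worked solution (not shown on the slide): both processes are zero-mean (as in Ex 9.10), so CX,Y = RX,Y. Using cos A sin B = (1/2)[sin(A + B) − sin(A − B)],

CX,Y(t1, t2) = E[cos(ωt1 + Θ) sin(ωt2 + Θ)]
= (1/2) E[sin(ω(t1 + t2) + 2Θ) − sin(ω(t1 − t2))]
= (1/2) sin(ω(t2 − t1))

since the term containing 2Θ averages to zero over (−π, π).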
Example 9.12 Signal Plus Noise
Suppose Y (t) consists of a desired signal X (t) plus noise N (t).

Y (t) = X (t) + N (t)

Find the cross-correlation between the observed and the desired signal
assuming that X (t) and N (t) are independent random processes.
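A worked solution (not shown on the slide): substituting Y(t) = X(t) + N(t) and using independence,

RX,Y(t1, t2) = E[X(t1){X(t2) + N(t2)}]
= E[X(t1)X(t2)] + E[X(t1)]E[N(t2)]
= RX(t1, t2) + mX(t1)mN(t2)

which reduces to RX(t1, t2) when the noise is zero-mean.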
9.3.1 IID Random Process
Let Xn be a discrete-time random process consisting of a sequence of iid RVs with common cdf FX(x), mean m, and variance σ². The sequence Xn is called an iid random process, with joint cdf:

FX(x1, x2, . . . , xk) = P[X1 ≤ x1, . . . , Xk ≤ xk] = FX(x1) · · · FX(xk)

where, for simplicity, Xk denotes the sample at time instant nk.


If { X k } are discrete, then

pX (x1 , x2 , . . . , xk ) = pX (x1 )pX (x2 ) ···pX (xk )

If { X k } are continuous-valued, then

f X (x1 , x2 , . . . , xk ) = f X (x1 )f X (x2 ) ···f X (xk )


Mean and Covariance of an IID RP
Mean of an iid process is constant: mX(n) = E[Xn] = m for all n.
Autocovariance of an iid process:

CX(n1, n2) = σ² if n1 = n2, and 0 otherwise

Autocorrelation of an iid process:

RX(n1, n2) = CX(n1, n2) + m²
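A simulation sketch (an illustration, not from the slides; assumes numpy) that estimates these moments for an iid Gaussian process:

import numpy as np

rng = np.random.default_rng(0)
m, sigma, trials = 1.0, 2.0, 100_000

# many independent realizations of the process at two distinct times n1 != n2
x1 = m + sigma * rng.standard_normal(trials)   # samples of X_{n1}
x2 = m + sigma * rng.standard_normal(trials)   # samples of X_{n2}

print(np.mean(x1))                    # ~ m        (constant mean)
print(np.mean((x1 - m) * (x2 - m)))   # ~ 0        (C_X for n1 != n2)
print(np.mean((x1 - m) ** 2))         # ~ sigma^2  (C_X for n1 == n2)
print(np.mean(x1 * x2))               # ~ m^2      (R_X = C_X + m^2)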
Ex 9.13 Bernoulli Random Process
Let In be a sequence of independent Bernoulli random variables, as shown in Figure 9.4(a).
Then In is an iid random process taking values from {0, 1}. It has mean and variance

mI(n) = p, VAR[In] = p(1 − p)

The probability that the first four bits are 1001 is

P[I1 = 1, I2 = 0, I3 = 0, I4 = 1] = p²(1 − p)²

Similarly, the probability that the second bit is 0 and the seventh is 1 is p(1 − p).
Its sum process is a binomial random process, shown in Figure 9.4(b).
Example 9.14 Random Step Process
An up-down counter is driven by +1 or −1 pulses. Let the input to the counter be given by Dn = 2In − 1, where In is the Bernoulli random process; then Dn is an iid random process that takes the value +1 with probability p and −1 with probability 1 − p.
9.3.2 Independent Increments and Markov Processes
Independent increments A random process X(t) is said to have independent increments if the increments over disjoint intervals,

X(t2) − X(t1), X(t3) − X(t2), . . . , X(tk) − X(tk−1),

are independent RVs for all k and all choices of t1 < t2 < · · · < tk.


Markov property A random process X(t) is said to be Markov (or Markovian) if the future of the process given the present is independent of the past, i.e.,

P[X(tk+1) ≤ xk+1 | X(tk) = xk, . . . , X(t1) = x1] = P[X(tk+1) ≤ xk+1 | X(tk) = xk]

For continuous-valued variables, the Markov property is stated with the transition pdf:

fX(tk+1)(xk+1 | X(tk) = xk, . . . , X(t1) = x1) = fX(tk+1)(xk+1 | X(tk) = xk)
9.3.3 Sum Processes
A sum process is obtained as the sum of a sequence of iid random variables:

Sn = X1 + X2 + · · · + Xn, n = 1, 2, . . .
= Sn−1 + Xn (S0 = 0)

Conditional independence Although Sn clearly depends on S1, . . . , Sn−1, once Sn−1 is known, Sn depends on Sn−1 only.
Markov process Sn is a Markov process, since it depends only on the previous time sample.
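The first two moments of the sum process follow directly from the iid assumption (not stated on the slide, but standard):

mS(n) = E[Sn] = nm
VAR[Sn] = nσ² (variances of independent RVs add)
CS(n1, n2) = σ² min(n1, n2)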
Example 9.15 Binomial Counting Process
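The slide's figure is not reproduced here; a small simulation sketch (an illustration, assuming numpy) generates a Bernoulli process and its binomial counting sum process:

import numpy as np

rng = np.random.default_rng(0)
p, n = 0.5, 20

i = rng.binomial(1, p, size=n)   # Bernoulli process I_1, ..., I_n in {0, 1}
s = np.cumsum(i)                 # sum process S_n = I_1 + ... + I_n

print(i)
print(s)   # nondecreasing; S_n ~ binomial(n, p) at each n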
9.5.1 Gaussian Random Process
A random process X (t) is a Gaussian random process if
X = (X (t1 ), X (t2 ), . . . , X (tk )) is a Gaussian random vector for all k,
and all choices of (t1 , . . . , tk ), for both discrete-time and
continuous-time processes:

fX(x) = exp( −(1/2)(x − m)ᵀ K⁻¹ (x − m) ) / ((2π)^(k/2) |K|^(1/2))

where m is the mean vector and K is the covariance matrix of the samples.
The joint pdfs of a Gaussian random process are completely specified by the mean function mX(t) and the covariance function CX(t1, t2).
Linear operations on a Gaussian process, such as sums, derivatives, and integrals, result in another Gaussian random process.
Since many signal and noise processes can be modeled accurately as Gaussian, and linear operations preserve Gaussianity, Gaussian random processes are the most useful in signal processing.
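Because a Gaussian process is fully determined by mX(t) and CX(t1, t2), sampling one at finitely many time points reduces to drawing a Gaussian random vector. A minimal sketch (an illustration, not from the slides; sample_gp and the squared-exponential covariance are assumptions for the example):

import numpy as np

rng = np.random.default_rng(0)

def sample_gp(mean_fn, cov_fn, ts, rng):
    # build the mean vector and covariance matrix at the chosen times
    m = np.array([mean_fn(t) for t in ts])
    k = np.array([[cov_fn(t1, t2) for t2 in ts] for t1 in ts])
    return rng.multivariate_normal(m, k)   # one realization at times ts

ts = np.linspace(0.0, 1.0, 50)
x = sample_gp(lambda t: 0.0,
              lambda t1, t2: np.exp(-((t1 - t2) ** 2) / 0.1),
              ts, rng)
print(x[:5])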
Example 9.27
iid Discrete-Time Gaussian Random Process
Let the discrete-time random process Xn be a sequence of independent Gaussian RVs with mean m and variance σ².
The covariance matrix for the times n1, . . . , nk is

K = σ² I (σ² times the k × k identity matrix)

The joint pdf for the vector X = (Xn1, . . . , Xnk) is

fX(x1, . . . , xk) = (2πσ²)^(−k/2) exp( −(1/(2σ²)) Σᵢ (xi − m)² )

The value at every time instant is independent of the values at all other time instants.
Example 9.28
Continuous-Time Gaussian Random Process
Let X (t) be a continuous-time Gaussian random process with mean
function and covariance function given by:
