
EEM 306 Introduction to Communications

Lecture 9

Department of Electrical and Electronics Engineering


Anadolu University

May 20, 2014

Last Time

- Random Processes
  - Analytic Description
  - Statistical Description
  - Mean and Autocorrelation Function
- Stationary Processes
  - Strictly Stationary
  - Wide-Sense Stationary (WSS)
  - Cyclostationary
- Ergodic Processes
- Power and Energy
- Multiple Random Processes

Random Processes and Linear Systems

Let X(t) be a stationary process.


Theorem: If a stationary process X(t) with mean m_X and autocorrelation function R_X(τ) is passed through an LTI system with impulse response h(t), the input and output processes X(t) and Y(t) will be jointly stationary with

    m_Y = m_X ∫_{-∞}^{∞} h(t) dt

    R_XY(τ) = R_X(τ) ∗ h(−τ)
    R_Y(τ) = R_X(τ) ∗ h(τ) ∗ h(−τ)
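As a quick numerical sanity check of the last relation, the sketch below (my addition, not part of the original slides; it assumes NumPy and uses an AR(1) sequence as a stand-in for a stationary input) estimates R_Y(τ) directly from a filtered sample path and compares it with R_X(τ) ∗ h(τ) ∗ h(−τ):

```python
import numpy as np

rng = np.random.default_rng(0)
N, rho = 200_000, 0.9

# AR(1) input X[n] = rho*X[n-1] + W[n]: a WSS process with a known autocorrelation
x = np.zeros(N)
w = rng.standard_normal(N)
for n in range(1, N):
    x[n] = rho * x[n - 1] + w[n]

h = np.array([1.0, 0.5, 0.25])            # toy LTI impulse response
y = np.convolve(x, h, mode="full")[:N]    # output sample path

def acf(z, maxlag):
    """Biased sample autocorrelation at lags -maxlag..maxlag."""
    z = z - z.mean()
    n = len(z)
    r_pos = np.array([np.dot(z[: n - k], z[k:]) / n for k in range(maxlag + 1)])
    return np.concatenate([r_pos[:0:-1], r_pos])   # use symmetry R(-k) = R(k)

maxlag = 20
Rx = acf(x, maxlag + 2 * (len(h) - 1))    # extra lags needed for the convolutions below
Ry_est = acf(y, maxlag)                   # direct estimate of R_Y(tau)

Ry_theory = np.convolve(np.convolve(Rx, h), h[::-1])   # R_X * h * h(-tau)
mid = len(Ry_theory) // 2                 # index of lag 0
print(np.max(np.abs(Ry_est - Ry_theory[mid - maxlag: mid + maxlag + 1])))
# small relative to Ry_est[maxlag] (statistical and edge effects only)
```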

Random Processes in the Frequency Domain

How can frequency-domain analysis techniques be applied to the case of LTI systems with random inputs?

Look at the input-output power spectra!


Power Spectrum of Stochastic Processes: Let X(t) denote a random process and x(t; ω_i) denote a sample function of this process. Truncate the sample function:

    x_T(t; ω_i) = x(t; ω_i),  |t| < T/2
                  0,          otherwise

The power spectrum:

    S_X(f) = E[ lim_{T→∞} |X_T(f)|² / T ] = lim_{T→∞} E[|X_T(f)|²] / T
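This defining limit suggests a direct estimator: average |X_T(f)|²/T over many truncated sample functions. The sketch below (an added illustration, not from the slides; it assumes NumPy and uses discrete-time white Gaussian noise of variance σ² as the process) should return an approximately flat spectrum at the level σ².

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 2.0          # variance of the white discrete-time process
N = 1024              # truncation length (samples in one window of duration T)
trials = 500          # number of independent sample functions to average over

psd = np.zeros(N)
for _ in range(trials):
    x = rng.normal(scale=np.sqrt(sigma2), size=N)   # one truncated sample function x_T
    psd += np.abs(np.fft.fft(x)) ** 2 / N           # |X_T(f)|^2 / T on the DFT grid
psd /= trials

print(psd.mean(), psd.std())   # mean ≈ sigma2; fluctuations shrink as trials grows
```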

Example:
Let the random process X(t) be defined by X(t) = X, where X is a random variable uniformly distributed on [−1, 1]. Find S_X(f).

Solution: Consider the truncated random signal

    X_T(t) = X rect(t/T)

Taking the Fourier transform,

    X_T(f) = X T sinc(Tf)

The power spectrum:

    S_X(f) = lim_{T→∞} E[|X_T(f)|²] / T = lim_{T→∞} E[X²] T sinc²(Tf)

Note that E[X²] = 1/3 and lim_{T→∞} T sinc²(Tf) = δ(f). Thus,

    S_X(f) = (1/3) δ(f)
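A tiny Monte Carlo check of the result (my addition, assuming NumPy): each sample path X(t) = X is constant, so its power is X², and the ensemble-average power should equal the area under S_X(f), namely 1/3.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, size=1_000_000)   # samples of the random amplitude X

# Average power of the process = E[X^2], the area under S_X(f) = (1/3) delta(f)
print(np.mean(X ** 2))   # ≈ 1/3
```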
Power Spectrum of Stochastic Processes

Theorem: If X(t) is a stationary process, then

    S_X(f) = F{R_X(τ)}

To find the total power in the process:

    P_X = ∫_{-∞}^{∞} S_X(f) df = E[ lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} X²(t) dt ]

Previous Example:
Let the random process X(t) be defined by X(t) = X, where X is a random variable uniformly distributed on [−1, 1].

    R_X(τ) = E[X(t + τ)X(t)] = E[X²] = 1/3

Hence,

    S_X(f) = F{1/3} = (1/3) δ(f)
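The theorem can also be checked numerically: for a process with a known autocorrelation, the Fourier transform of R_X(τ) should match the averaged periodogram. The sketch below (an added illustration, assuming NumPy) uses white noise passed through a short FIR filter, whose autocorrelation R_X(k) = Σ_m h[m] h[m+|k|] is known in closed form.

```python
import numpy as np

rng = np.random.default_rng(3)
N, trials = 1024, 400
h = np.array([1.0, 0.7, 0.3])        # X = unit-variance white noise filtered by h

# Known autocorrelation of this process: R_X(k) = sum_m h[m] h[m+|k|]
R = np.convolve(h, h[::-1])          # values at lags -(L-1)..(L-1), L = len(h)
lags = np.arange(-(len(h) - 1), len(h))

# S_X(f) = F{R_X(tau)} evaluated on an N-point frequency grid (cycles/sample)
f = np.arange(N) / N
S_theory = np.real(np.exp(-2j * np.pi * np.outer(f, lags)) @ R)

# Averaged periodogram: estimate of lim E[|X_T(f)|^2]/T
S_est = np.zeros(N)
for _ in range(trials):
    x = np.convolve(rng.standard_normal(N + len(h) - 1), h, mode="valid")
    S_est += np.abs(np.fft.fft(x)) ** 2 / N
S_est /= trials

print(np.max(np.abs(S_est - S_theory)))   # shrinks toward 0 as trials grows
```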
Transmission over LTI Systems

We have seen that

    m_Y = m_X ∫_{-∞}^{∞} h(t) dt

    R_XY(τ) = R_X(τ) ∗ h(−τ)
    R_Y(τ) = R_X(τ) ∗ h(τ) ∗ h(−τ)

Translation of these relations into the frequency domain is straightforward. Noting that F{h(−τ)} = H*(f) and ∫_{-∞}^{∞} h(t) dt = H(0), we have

    m_Y = m_X H(0)
    S_XY(f) = S_X(f) H*(f)
    S_Y(f) = S_X(f) |H(f)|²

Since R_XY(τ) = R_YX(−τ), we have

    S_YX(f) = S*_XY(f) = S_X(f) H(f)
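A short numerical check of S_Y(f) = S_X(f)|H(f)|² (my addition, assuming NumPy): with a white input, S_X(f) = σ² = 1, so the averaged periodogram of the output should trace out |H(f)|².

```python
import numpy as np

rng = np.random.default_rng(4)
N, trials = 1024, 1000
h = np.array([0.5, 1.0, 0.5])                # LTI system impulse response
H = np.fft.fft(h, N)                         # frequency response on the N-point FFT grid

S_Y = np.zeros(N)
for _ in range(trials):
    x = rng.standard_normal(N + len(h) - 1)  # white input, S_X(f) = 1
    y = np.convolve(x, h, mode="valid")      # output sample path of length N
    S_Y += np.abs(np.fft.fft(y)) ** 2 / N
S_Y /= trials

print(np.max(np.abs(S_Y - np.abs(H) ** 2)))  # small, up to Monte Carlo fluctuation
```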
Example:
The random process X(t) is defined by X(t) = A cos(2πf_0 t + θ), where θ is a random variable uniformly distributed on [0, 2π). X(t) is passed through a differentiator; find S_Y(f) and S_XY(f).

Solution: Recall that H(f) = j2πf and

    R_X(τ) = (A²/2) cos(2πf_0 τ)

Taking the FT,

    S_X(f) = (A²/4) [δ(f − f_0) + δ(f + f_0)]

Hence,

    S_Y(f) = |H(f)|² S_X(f) = π²f²A² [δ(f − f_0) + δ(f + f_0)] = π²f_0²A² [δ(f − f_0) + δ(f + f_0)]

    S_XY(f) = S_X(f) H*(f) = (−j2πf) S_X(f) = −(jπf A²/2) [δ(f − f_0) + δ(f + f_0)]
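As a consistency check (my addition, assuming NumPy), the total output power implied by S_Y(f) is ∫S_Y(f) df = 2π²f_0²A², which can be compared against the time-average power of a numerically differentiated sample path:

```python
import numpy as np

A, f0, fs = 2.0, 5.0, 1000.0            # amplitude, frequency, sampling rate (illustrative values)
t = np.arange(0, 10, 1 / fs)            # 10 s record = 50 full periods

rng = np.random.default_rng(5)
theta = rng.uniform(0, 2 * np.pi)       # the random phase of this sample path
x = A * np.cos(2 * np.pi * f0 * t + theta)
y = np.gradient(x, 1 / fs)              # differentiator output (numerical derivative)

print(np.mean(y ** 2), 2 * np.pi ** 2 * f0 ** 2 * A ** 2)   # both ≈ 1974
```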
Power-Spectral Density of a Sum Process

In practice, we often encounter the sum of two random processes, for example in the case of communication over a channel with additive noise.

Assume that Z(t) = X(t) + Y(t), where X(t) and Y(t) are jointly stationary random processes. Z(t) is a stationary process with

    R_Z(τ) = R_X(τ) + R_Y(τ) + R_XY(τ) + R_YX(τ)

Taking the FT of both sides and using S_YX(f) = S*_XY(f),

    S_Z(f) = S_X(f) + S_Y(f) + S_XY(f) + S_YX(f)
           = S_X(f) + S_Y(f) + 2Re[S_XY(f)]
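The cross term matters whenever X(t) and Y(t) are correlated. A small simulation sketch (my addition, assuming NumPy; Y is taken as a filtered copy of X plus independent noise so that the two are correlated) checks the relation with periodogram-style estimates, all computed with the same convention:

```python
import numpy as np

rng = np.random.default_rng(6)
N, trials = 1024, 200
h = np.array([1.0, -0.5])                     # Y = X filtered by h, plus independent noise

Sx = np.zeros(N); Sy = np.zeros(N); Sz = np.zeros(N); Sxy = np.zeros(N, dtype=complex)
for _ in range(trials):
    x = rng.standard_normal(N + len(h) - 1)
    y = np.convolve(x, h, mode="valid") + 0.5 * rng.standard_normal(N)
    x = x[:N]
    z = x + y
    X, Y, Z = np.fft.fft(x), np.fft.fft(y), np.fft.fft(z)
    Sx += np.abs(X) ** 2 / N
    Sy += np.abs(Y) ** 2 / N
    Sz += np.abs(Z) ** 2 / N
    Sxy += X * np.conj(Y) / N                 # cross-spectrum estimate (one common convention)
Sx, Sy, Sz, Sxy = Sx / trials, Sy / trials, Sz / trials, Sxy / trials

print(np.max(np.abs(Sz - (Sx + Sy + 2 * np.real(Sxy)))))   # ≈ 0 (numerical precision)
print(np.max(np.abs(Sz - (Sx + Sy))))                      # clearly nonzero: the cross term is needed
```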

Gaussian Processes

Gaussian processes play an important role in communication systems.

Gaussian Random Variable: The Gaussian random variable with mean m and variance σ² has the pdf

    f_X(x) = (1/√(2πσ²)) exp(−(x − m)²/(2σ²))

Gaussian Processes cont’d

Jointly Gaussian Random Variables:
Define the random vector X = (X_1, X_2, ..., X_n), the vector of means m = (m_1, m_2, ..., m_n), and the n × n covariance matrix C such that

    C_{i,j} = cov(X_i, X_j).

Then the random variables {X_i} are jointly Gaussian if

    f(x_1, x_2, ..., x_n) = 1/√((2π)^n det(C)) · exp( −(1/2)(x − m) C^{-1} (x − m)^T )

Gaussian Processes:
A random process X(t) is a Gaussian process if for all n and all (t_1, t_2, ..., t_n), the random variables {X(t_i)}, i = 1, ..., n, have a jointly Gaussian density function.
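A small sketch of this definition in action (my addition, assuming NumPy): draw jointly Gaussian samples for a chosen m and C and confirm that the empirical mean vector and covariance matrix converge to them.

```python
import numpy as np

rng = np.random.default_rng(7)
m = np.array([1.0, -2.0, 0.5])                    # mean vector
C = np.array([[2.0, 0.8, 0.3],                    # covariance matrix, C[i, j] = cov(X_i, X_j)
              [0.8, 1.5, 0.2],
              [0.3, 0.2, 1.0]])

X = rng.multivariate_normal(m, C, size=200_000)   # jointly Gaussian samples, one row per draw

print(X.mean(axis=0))                             # ≈ m
print(np.cov(X, rowvar=False))                    # ≈ C
```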

Gaussian Processes cont’d

- For a Gaussian process, knowledge of the mean and autocorrelation, i.e., m_X(t) and R_X(t_1, t_2), gives a complete statistical description of the process.
- If the Gaussian process X(t) is passed through an LTI system, then the output process Y(t) is also a Gaussian process.
- For Gaussian processes, WSS and strict stationarity are equivalent.
- For jointly Gaussian processes, uncorrelatedness and independence are equivalent.

White Processes

All frequency components appear with equal power!

A process X(t) is called a white process if it has a flat spectral density, i.e., if S_X(f) is constant for all f.

Power spectral density of a white process:

- Thermal noise can be closely modeled as a white process.
- Information sources are modeled as the output of LTI systems driven by a white process.

The power content of a white process: P_X = ∫_{-∞}^{∞} S_X(f) df = ∞
White Processes cont’d

Thermal Noise

k: Boltzmann's constant, 1.38 × 10⁻²³ joules/kelvin
T: temperature in kelvin

The (two-sided) power spectral density of thermal noise is kT/2. The value kT is usually denoted by N_0, so S_n(f) = N_0/2.

The autocorrelation function for a white process:

    R_n(τ) = F⁻¹{N_0/2} = (N_0/2) δ(τ)

- If we sample a white process at two points t_1 and t_2 (t_1 ≠ t_2), the resulting random variables will be uncorrelated (see the sketch below).
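A minimal numerical sketch of the last point (my addition, assuming NumPy; it uses the usual discrete-time model in which samples of white noise observed over a bandwidth f_s are independent with variance (N_0/2)·f_s):

```python
import numpy as np

rng = np.random.default_rng(9)
N0, fs = 4e-21, 1e6                     # N0 = kT near room temperature; fs = observation bandwidth (Hz)
n = rng.normal(scale=np.sqrt(N0 / 2 * fs), size=200_000)   # sampled white noise

# Sample autocorrelation: large at lag 0 (= N0/2 * fs), essentially zero at every other lag,
# mirroring R_n(tau) = (N0/2) delta(tau): samples at distinct times are uncorrelated.
for lag in (0, 1, 2, 10):
    r = np.mean(n * n) if lag == 0 else np.mean(n[:-lag] * n[lag:])
    print(lag, r)
```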
Example:
Let X(t) be white noise with power spectral density N_0/2, and let

    Y = √(2/T) ∫_0^T X(t) cos(2πf_c t) dt

Find the power of Y.

Solution:

    E[Y²] = E[ (2/T) ∫_0^T ∫_0^T X(t_1) X(t_2) cos(2πf_c t_1) cos(2πf_c t_2) dt_1 dt_2 ]
          = (2/T) ∫_0^T ∫_0^T E[X(t_1) X(t_2)] cos(2πf_c t_1) cos(2πf_c t_2) dt_1 dt_2
          = (2/T) ∫_0^T ∫_0^T (N_0/2) δ(t_1 − t_2) cos(2πf_c t_1) cos(2πf_c t_2) dt_1 dt_2
          = (2/T)(N_0/2) ∫_0^T cos²(2πf_c t) dt = N_0/2

(The last step uses ∫_0^T cos²(2πf_c t) dt = T/2, which holds exactly when f_c T is an integer and approximately whenever f_c T ≫ 1.)
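A Monte Carlo sketch of this example (my addition, assuming NumPy; ideal white noise is replaced by a band-limited discrete-time stand-in with two-sided PSD N_0/2 over the simulation bandwidth f_s, and f_c T is chosen to be an integer):

```python
import numpy as np

rng = np.random.default_rng(10)
N0, fs, T, fc = 2.0, 10_000.0, 1.0, 50.0     # noise level, sampling rate (Hz), window (s), carrier (Hz)
t = np.arange(0, T, 1 / fs)
c = np.cos(2 * np.pi * fc * t)

trials = 2000
Y = np.empty(trials)
for i in range(trials):
    x = rng.normal(scale=np.sqrt(N0 / 2 * fs), size=t.size)   # band-limited stand-in for white noise
    Y[i] = np.sqrt(2 / T) * np.sum(x * c) / fs                # discrete approximation of the integral

print(np.mean(Y ** 2))    # ≈ N0 / 2 = 1.0
```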

Noise-Equivalent Bandwidth

white Gaussian noise ⇒ Filter ⇒ Gaussian, not white

The filter characteristics determine the spectral properties of the output process:

    S_Y(f) = S_X(f) |H(f)|² = (N_0/2) |H(f)|²

The power content of the output process:

    P_Y = ∫_{-∞}^{∞} S_Y(f) df = (N_0/2) ∫_{-∞}^{∞} |H(f)|² df

Define B_neq, the noise-equivalent bandwidth of a filter with frequency response H(f), as

    B_neq = ( ∫_{-∞}^{∞} |H(f)|² df ) / (2 H_max²)

H_max: the maximum of |H(f)| in the passband of the filter.
Noise-Equivalent Bandwidth cont’d

For a typical filter, the output power can thus be written as

    P_Y = N_0 B_neq H_max²

The noise-equivalent bandwidth of filters and amplifiers is usually provided by the manufacturer.

Example:
Find the noise-equivalent bandwidth of a lowpass RC filter.
Recall the frequency response of this filter: H(f) = 1/(1 + j2πf RC). Define τ = RC. Then H_max = 1 and

    ∫_{-∞}^{∞} |H(f)|² df = ∫_{-∞}^{∞} df / (1 + (2πf τ)²) = 1/(2τ)

Answer: B_neq = (1/(2τ)) / 2 = 1/(4RC)
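The answer is easy to confirm numerically (my addition, assuming NumPy and SciPy; the component values are arbitrary, chosen only for illustration):

```python
import numpy as np
from scipy.integrate import quad

R, C = 1e3, 1e-6                      # illustrative values: 1 kOhm, 1 uF

def H2(f):                            # |H(f)|^2 for the lowpass RC filter
    return 1.0 / (1.0 + (2 * np.pi * f * R * C) ** 2)

area, _ = quad(H2, -np.inf, np.inf)   # integral of |H(f)|^2 over all frequencies
Bneq = area / (2 * 1.0 ** 2)          # H_max = 1 (attained at f = 0)
print(Bneq, 1 / (4 * R * C))          # both ≈ 250 Hz
```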

Bandpass Processes

Bandpass random processes are the equivalents of bandpass deterministic signals.

X(t) is a bandpass process if S_X(f) = 0 for |f − f_0| ≥ W, where W < f_0.

We have seen earlier that bandpass signals can be expressed in terms of equivalent lowpass signals.
⇒ Generalize those results to the case of random processes.
Bandpass Processes cont’d

Let X(t) be a bandpass process. Then R_X(τ) is a deterministic bandpass signal. If X(t) is passed through the Hilbert filter with impulse response 1/(πt), the output process is the Hilbert transform of the input process, with

    m_Y = m_X ∫_{-∞}^{∞} (1/(πt)) dt

    R_XY(τ) = R_X(τ) ∗ (1/(−πτ)) = −R̂_X(τ)
    R_Y(τ) = R_X(τ)   (Exercise!)

Parallel to the deterministic case, define

    X_c(t) = X(t) cos(2πf_0 t) + X̂(t) sin(2πf_0 t)
    X_s(t) = X̂(t) cos(2πf_0 t) − X(t) sin(2πf_0 t)

Bandpass Processes cont’d

    X(t) = X_c(t) cos(2πf_0 t) − X_s(t) sin(2πf_0 t)

Properties:
1. If X(t) is zero-mean, then X_c(t) and X_s(t) are also zero-mean.
2. If X(t) is Gaussian, then X_c(t) and X_s(t) are jointly Gaussian.
3. If X(t) is WSS, then X_c(t) and X_s(t) are jointly WSS with

    R_Xc(τ) = R_Xs(τ) = R_X(τ) cos(2πf_0 τ) + R̂_X(τ) sin(2πf_0 τ)
    R_XcXs(τ) = R_X(τ) sin(2πf_0 τ) − R̂_X(τ) cos(2πf_0 τ)

Note that the cross-correlation function is an odd function, so R_XcXs(0) = 0. This means that at any time instant t_0, the random variables X_c(t_0) and X_s(t_0) are uncorrelated (see the numerical sketch below).
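A numerical illustration (my addition, assuming NumPy and SciPy): build a bandpass process by bandpass-filtering white noise around f_0, form X_c and X_s via the Hilbert transform, and check that they carry the same power as X(t) and are uncorrelated at any given instant.

```python
import numpy as np
from scipy.signal import firwin, hilbert, lfilter

rng = np.random.default_rng(11)
fs, f0, N = 1000.0, 100.0, 200_000
t = np.arange(N) / fs

# Bandpass process: white noise through a narrow FIR bandpass filter centered at f0
bp = firwin(301, [f0 - 20, f0 + 20], pass_zero=False, fs=fs)
x = lfilter(bp, 1.0, rng.standard_normal(N))

xh = np.imag(hilbert(x))                       # Hilbert transform of X(t)
c, s = np.cos(2 * np.pi * f0 * t), np.sin(2 * np.pi * f0 * t)
xc = x * c + xh * s                            # in-phase component
xs = xh * c - x * s                            # quadrature component

print(np.corrcoef(xc, xs)[0, 1])               # ≈ 0: Xc(t0) and Xs(t0) are uncorrelated
print(np.var(x), np.var(xc), np.var(xs))       # P_X ≈ P_Xc ≈ P_Xs
```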
Bandpass Processes cont’d

4. Power spectra of the in-phase and quadrature components:

    S_Xc(f) = S_Xs(f) = S_X(f − f_0) + S_X(f + f_0),  |f| < f_0
                        0,                            otherwise

   The cross-spectral density:

    S_XcXs(f) = F[R_XcXs(τ)] = j[S_X(f + f_0) − S_X(f − f_0)],  |f| < f_0
                               0,                               otherwise

5. The power contents:

    P_Xc = P_Xs = R_Xc(τ)|_{τ=0} = R_X(0) = P_X

