Analyzing Time-Varying Noise Properties With Spectrerf
Joel R. Phillips
Cadence Design Systems, San Jose, CA 95134
October 21, 1998
1 Overview
RF circuits are usually driven by periodic inputs. The noise in RF circuits is therefore generated by sources that can typically be modelled as periodically time-varying. Noise with periodically time-varying statistical properties is said to be cyclostationary. To allow more detailed characterization of cyclostationary noise in RF circuits, SpectreRF 4.4.3 adds a noisetype parameter to the PNOISE analysis. Noisetype sources computes the total time-average noise at an output over a given frequency range, with each noise source's contribution computed at each frequency. This option is the default and represents the functionality present in PNOISE in previous releases. Noisetype timedomain computes the time-varying instantaneous noise power in a circuit with periodically driven components. Noisetype correlations computes correlations in the noise at different ports of a multi-port circuit; for example, the correlation of noise at different outputs, or the correlation of noise at the input and output of a circuit that exhibits frequency conversion. Equivalent noise sources can be extracted from these calculations.
Figure 1: Left: Very simple mixer schematic. Right: Representation of periodically-modulated noise. The solid line shows the envelope p(t) that modulates the noise process. The circles show possible phase points on the envelope where the time-varying noise power might be calculated.

Noise in circuits that are periodically driven, say with period T, exhibits statistical properties that also vary periodically. To understand time-domain characterization of noise, consider the simple circuit shown in Figure 1.

Figure 2: Noise processes for two different phases in the periodic interval. Each process is stationary.

The amplitude of the noise measured at the RF output will vary periodically with the magnitude of the modulating signal p(t), as shown by the sample points on the right of Figure 1. SpectreRF can calculate the time-varying noise power at any point in the fundamental period. In fact, SpectreRF can calculate the full autocorrelation function R_phi(p, q) = E{x_phi(p) x_phi(p + q)} = R_phi(q) and its spectrum for the discrete-time processes x_phi obtained by periodically sampling the time-domain noise process at the same point phi in phase. Figure 2 shows two such processes. See the Appendices for a more detailed introduction to noise in periodically time-varying systems.

The following steps describe how to use SpectreRF to calculate time-domain noise.

1. Start the icms environment.
2. Enable the time-domain noise features by executing the command artDebug->spectreTdNoiseActive=t in the CIW window.
3. Perform the PSS analysis setup as usual.
4. On the PNOISE analysis form, select noisetype timedomain.
5. Select an appropriate frequency range and sweep for the analysis, for example, a linear sweep up to the fundamental frequency. Since each time point in the calculation is a separate frequency sweep, use the minimum number of frequency points you expect to be able to use to resolve the spectrum, in order to minimize computation time.
6. Select an appropriate set of timepoints for the time-domain noise analysis, either by specifying a noiseskipcount or by adding explicit timepoints. If noiseskipcount is set to an integer p, the noise is calculated at every p + 1 points. noiseskipcount=0, the default, calculates the noise at every timepoint in the final PSS solution. Specific points can be added by specifying a time relative to the start of the PSS simulation interval. noiseskipcount=5 will perform noise calculations at about 30 points in the PSS interval. If only a few points are needed, add them explicitly and set noiseskipcount to a large value like 1000.
7. Perform the simulation.
8. To calculate time-varying noise power, bring up the Direct Plot PSS results form. Click on tdnoise and then select Integrated noise power. Enter 0 as the start frequency and the PSS fundamental frequency as the stop frequency, for example, 1G if the PSS period is 1ns. A periodic waveform should appear that represents the expected noise power at each point in the fundamental period.
9. To display the spectrum of the sample processes, enable Output Noise and select Spectrum as the type of sweep. After clicking on Plot, a set of curves will appear, one for each sample phase in the fundamental period.
10. To calculate the autocorrelation function for one of the sample processes, first display the spectrum as instructed in the previous step. Next, bring up the calculator tool by selecting Tools->Calculator from the Analog Artist main window. Bring one of the sample waveforms into the calculator by clicking wave on the calculator and then selecting the appropriate frequency-domain spectrum. Pull up the DFT special function in the calculator. Set 0 as the From value and the PSS fundamental as the To value. Choose an appropriate window (e.g., Cosine2) and a number of samples around the number of frequency points in the interval [0, 1/T], then apply the DFT and plot the results. Harmonic q of the DFT results gives the value of the discrete autocorrelation for this sample phase, R_phi(q). Note: be sure the noise is in the correct units of power (e.g., V^2/Hz, not V/sqrt(Hz)) before performing the DFT to obtain the autocorrelation.
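The spectrum-to-autocorrelation step can also be sketched outside the tool. The numpy sketch below uses a synthetic flat spectrum as a stand-in for simulator output; the variable names and the transform normalization are assumptions, not SpectreRF conventions:

```python
import numpy as np

# Recovering the discrete autocorrelation R_phi(q) of a sampled noise process
# from its periodic power spectrum, as in step 10. The spectrum here is a
# synthetic stand-in for exported data; values are assumed to be in units of
# power (V^2/Hz), as the note at the end of step 10 requires.
T = 1e-9                        # PSS fundamental period (assumed 1 ns)
N = 64                          # number of frequency points in [0, 1/T)
f = np.arange(N) / (N * T)      # frequency grid over one Nyquist interval
S = np.full(N, 2.0)             # flat ("white") sampled-process spectrum

# R_phi(q) = T * integral over [0, 1/T) of S(f) e^{-i 2 pi q f T} df,
# approximated by a Riemann sum with df = 1/(N*T).
q = np.arange(N)
df = 1.0 / (N * T)
R = T * df * (S @ np.exp(-2j * np.pi * np.outer(f, q) * T))

# A flat spectrum gives a discrete delta: all correlation at lag 0.
print(R[0].real)                # 2.0
print(abs(R[1]))                # ~0
```

With a shaped (non-flat) spectrum, the same sum produces nonzero correlation at other lags.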
S_XX(w) = X(w)X*(w), where X(w) is the Fourier transform of the signal x(t). For random signals like noise, we calculate the expected value of the power spectrum, S_XX(w) = E{X(w)X*(w)}. To characterize the relation between two separate signals x(t) and y(t), we also need the cross-power spectrum S_XY(w) = E{X(w)Y*(w)}.
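These definitions can be illustrated numerically. The sketch below (synthetic signals; averaged periodograms standing in for the expectation) estimates a power spectrum and a cross-power spectrum:

```python
import numpy as np

# Estimate S_xx(w) = E{X(w)X*(w)} and S_xy(w) = E{X(w)Y*(w)} by averaging
# over many realizations (a basic averaged periodogram). The signals are
# made up for illustration: y contains half of x plus independent noise.
rng = np.random.default_rng(0)
N, trials = 256, 400
Sxx = np.zeros(N)
Sxy = np.zeros(N, dtype=complex)
for _ in range(trials):
    x = rng.standard_normal(N)                # white noise, unit variance
    y = 0.5 * x + rng.standard_normal(N)      # partially correlated with x
    X, Y = np.fft.fft(x), np.fft.fft(y)
    Sxx += (X * np.conj(X)).real / N
    Sxy += X * np.conj(Y) / N
Sxx /= trials
Sxy /= trials
# White noise: Sxx is flat at about 1.0 in every bin; the cross-spectrum
# Sxy is about 0.5 at every bin, reflecting the shared 0.5*x component.
```

Note that Sxy is complex in general; here its imaginary part averages to zero because the coupling between x and y has no delay.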
Figure 3: A noisy circuit modeled as a noiseless circuit with equivalent noise sources (VN1, N2).
1. Start the icms environment.
2. Enable the noise correlation features by executing the command artDebug->spectreTdNoiseActive=t in the CIW window.
3. Perform the PSS analysis setup as usual.
4. On the PNOISE analysis form, select noisetype correlations.
5. Choose the first output of the circuit. This can be a port or voltage if you wish to calculate noise parameters starting in the impedance representation, or a current if you wish to use the admittance representation. For example, in a mixer, you might choose the IF output here.
6. Select an appropriate frequency range and sweep for the analysis. For example, in the mixer you might want to sweep 100MHz around the IF output.
7. Select an appropriate set of "cycles" to use for the analysis. For example, in a mixer, you might set maxcycles=1 or add "cycle 1" if you have selected the IF as output, since that will calculate noise correlations between the IF output at frequency w and the noise at the RF input at frequency w + w0, where w0 is the LO frequency.
8. Perform the simulation.
9. No support for noise correlation analysis is available through Direct Plot in release 4.4.3. To display the results, call up the Results Browser from the Tools pulldown menu in Analog Artist. Select the pnoise analysis and the appropriate cycle of interest; correlations of noise at different frequencies appear as different "harmonics" in the browser. A list of nodes and sources will appear.
10. The correlation of the noise at the output specified in the Analog Artist form with the noise induced at each of the nodes or sources has been calculated, and is available to plot directly or to process further in the calculator.
11. If you wish to analyze the correlations with noise at a differential input or output, simply subtract the cross-power spectra appearing on the two nodes.
12. Note that the noise correlations analysis outputs all results in units of power (e.g., V^2/Hz), and that in general the cross-power spectra are complex.
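The differential-output trick in step 11 rests on the linearity of expectation: E{X (Na - Nb)*} = E{X Na*} - E{X Nb*}. A small numpy sketch with made-up signals illustrates it:

```python
import numpy as np

# The cross-power of a reference x with a differential output (na - nb)
# equals the difference of the two individual cross-power spectra, because
# both the FFT and the expectation are linear. All signals are synthetic.
rng = np.random.default_rng(1)
N, trials = 128, 500
S_xa = np.zeros(N, dtype=complex)   # cross-power of x with node a
S_xb = np.zeros(N, dtype=complex)   # cross-power of x with node b
S_xd = np.zeros(N, dtype=complex)   # cross-power of x with (a - b) directly
for _ in range(trials):
    x = rng.standard_normal(N)
    na = 0.7 * x + rng.standard_normal(N)    # node a: correlated with x
    nb = -0.7 * x + rng.standard_normal(N)   # node b: anti-correlated
    X = np.fft.fft(x)
    Na, Nb = np.fft.fft(na), np.fft.fft(nb)
    S_xa += X * np.conj(Na) / N
    S_xb += X * np.conj(Nb) / N
    S_xd += X * np.conj(Na - Nb) / N
# S_xa - S_xb reproduces the directly estimated differential cross-spectrum.
```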
4 Examples
As an example which illustrates the various aspects of cyclostationary noise, consider the system shown in Figure 4. White Gaussian noise is passed through a high-order band-pass filter with center frequency w0, then multiplied by two square waves that have a phase shift with respect to each other. Finally, the output of the ideal multipliers is passed through a one-pole low-pass filter to produce an I and a Q output.

First the time-domain behavior of the noise was examined. The most dramatic effect can be seen by looking directly at the mixer outputs. Figure 5 shows the contribution to the time-varying noise power from three separate source frequencies. Two of the frequencies were taken around w0, and one away from w0, slightly into the stop band of the band-pass filter. The sharp change
Figure 5: Simple mixer example, time-varying noise power at mixer output. Left: time-varying noise power, before the low-pass filter, for three different noise contributor frequencies. Right: spectrum of a sampled noise process. Note the periodically replicated spectrum.
Figure 6: Simple mixer example, power spectrum for the I output, S_II(w) (solid), and cross-spectral density for I and Q, S_IQ(w) (dashed). Left: spectra with LO tones 90 degrees out of phase. Right: spectra with LO tones 72 degrees out of phase.
in noise power over the simulation interval is because the mixers were driven with square-wave LO signals.

Next the noise behavior at the output ports was examined. The output spectra at I and Q are shown in Figure 6. The noise density at I is concentrated around zero because the noise at the RF input to the mixers (band-limited around w0, as shown in the left part of Figure 7) is shifted down to zero and up to 2w0, but components not around zero are eliminated by the low-pass filter. More interesting is the cross-correlation spectrum of the I and Q outputs, shown as the dashed line in Figure 6. When the signals applied to the mixers are 90 degrees out of phase, the cross-power spectral density of the noise at the separate I and Q outputs is small, indicating little correlation of the noise. If the tones are not quite 90 degrees out of phase, the correlation is much more pronounced, though in neither case is it completely zero.
A more interesting example comes from examining the correlation between the noise at the I output and the noise at the RF input. The density function S_IR(w) is significant because it represents the correlation between the noise at the I output around the baseband frequency with the noise at the RF input, w0 higher in frequency. The correlation is high because the noise at the RF input is centered around w0 and converted to zero-centered noise by the mixer.
Finally, a detailed circuit example was considered. A transistor-level image-reject receiver with I and Q outputs was analyzed. The noise spectra at the I and Q outputs were found to be very similar, as shown in the left part of Figure 8. The cross-power density was smaller, but not negligible, indicating that the noise at the two outputs is partially correlated, as shown in the right-hand part of Figure 8.
Figure 7: Simple mixer example, spectral densities. Left: noise spectrum at the RF input, S_RR(w). Right: various power densities. Solid: cross-power spectrum S_IR(w), indicating correlation between the noise power at the I output and the noise at the RF input one harmonic higher in frequency. Dashed: noise spectrum S_II(w) at the I output. Dash-dot: noise spectrum S_RR(w) at the RF input.
Figure 8: Left: Image-reject receiver example, power spectral densities of the I output (solid), Q output (dashed), and I-Q cross-power (dash-dot). Right: correlation coefficient between noise at the I and Q outputs of the image-reject receiver.
Figure 9: A deterministic current source driving a noisy linear resistor.

Now consider Figure 10, where we restrict attention to the noise source and resistor alone. A typical measured noise waveform is shown in the top left portion of the figure. Since we cannot predict the specific value of n(t) at any point, we might instead try to predict what its value would be on average, or what we might expect the noise to be. For example, if we measure many noise voltage curves in the time domain, n(t), and average over many different curves, we
Figure 10: Resistor modeled as a noiseless resistance and an equivalent noise current source.

will obtain an approximation to the expected value of n(t), which we denote by E{n(t)}. For thermal noise, we will find that E{n(t)} = 0. Therefore, instead of computing E{n(t)}, let us instead compute E{n^2(t)}, the expected noise power. An example of this sort of measurement is shown in the bottom right portion of Figure 10; 250 measurements were needed to compute this curve.

Now suppose that we wish to tap the circuit at multiple points. Each will have its own noise characteristics, but they are not necessarily independent. Consider the circuit shown in Figure 11(a). The signals n1(t) and n2(t) are obtained by measuring the voltage across a single resistor, and across both resistors, respectively; n3(t) is the corresponding voltage across the second resistor alone. Just measuring E{n1^2(t)} and E{n2^2(t)} is not enough to predict the behavior of this system, because n1(t) and n2(t) are not independent. To see this, consider Figures 11(b) and (c). Samples of each of the processes are taken and plotted on an X-Y graph. Because n1(t) composes part of n2(t), n1(t) and n2(t) are correlated: their X-Y plot has a characteristic skew along the X = Y line, relative to the (n1(t), n3(t)) plot. n1(t) and n3(t) are uncorrelated because they represent thermal noise from different sources. The additional measurement needed to describe the random processes is the correlation between the two processes, E{n1(t)n2(t)}. We can also define a time-varying correlation coefficient rho, with rho in [0, 1], as

rho(t) = E{n1(t)n2(t)} / sqrt(E{n1^2(t)} E{n2^2(t)}).

A value of rho = 0 indicates completely uncorrelated signals, and a value near one indicates a high degree of correlation. In this example we would find that rho(t) = 1/sqrt(2), representing the fact that each of the two noise sources contributes half of the power of the process n2(t). When there are multiple variables of interest in the system, it is convenient to use matrix notation. We write all the random processes of interest in a vector, for example

x(t) = [ x1(t) ]
       [ x2(t) ]
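The two-resistor correlation example can be checked with a quick Monte Carlo experiment. This is an illustrative sketch with unit-variance Gaussian stand-ins for the two thermal sources:

```python
import numpy as np

# Numerical check of the correlation-coefficient example: n2 = n1 + n3 with
# n1, n3 independent noise sources of equal power. The correlation
# coefficient between n1 and n2 should be 1/sqrt(2) ~ 0.707 (rho^2 = 1/2),
# since n1 supplies half the power of n2.
rng = np.random.default_rng(42)
N = 200_000
n1 = rng.standard_normal(N)     # first resistor's noise
n3 = rng.standard_normal(N)     # second resistor's noise, independent of n1
n2 = n1 + n3                    # voltage across both resistors
rho = np.mean(n1 * n2) / np.sqrt(np.mean(n1**2) * np.mean(n2**2))
print(rho)                      # close to 0.7071
```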
Figure 11: (a) Circuit illustrating correlated noise. (b) Samples of n1(t) plotted versus n3(t). (c) Samples of n1(t) plotted versus n2(t).
Figure 12: Noise in a simple RC circuit.

and then we can write the correlations as the expected value of a vector outer product, E{x(t)x^H(t)}, where the H superscript indicates Hermitian transpose. For example, we may write a time-varying correlation matrix as

R_xx(t, t) = E{x(t) x^H(t)}.
The preceding examples contain all we need to know about characterizing purely static systems. Now we need to add some elements with memory, such as inductors and capacitors.
Figure 13: (a) Computed power spectral density of the resistor thermal noise. (b) Spectrum of the noise process after filtering by the RC circuit.
As a first example, consider, as shown in Figure 12, adding a capacitor in parallel with the simple resistor we have been considering. A sample of the noise process is shown in Figure 12(b). The noise looks different than the noise of the resistor alone, because the low-pass filter action of the RC circuit eliminates very high frequencies in the noise. However, we cannot see this effect simply by measuring E{n^2(t)}, as shown in Figure 12(c): E{n^2(t)} is independent of time, just as for the resistor! Instead, let us look at the expected power density in the frequency domain. Let n(w) denote the Fourier transform of one sample of n(t). Then E{n(w)n*(w)} is the expected power spectral density, which we denote by S_n(w). In the present case, the capacitor has a pronounced effect on the spectral density. Figure 13(a) shows a computed power spectral density for the resistor thermal noise previously considered. The spectrum is essentially flat (some deviations occur because a finite number of samples was taken to perform the calculation). The flat spectrum represents the fact that in the resistor's noise, all frequencies are, in some statistical sense, equally present. We call such a process "white noise." Figure 13(b) shows the spectrum of the noise process after filtering by the resistor-capacitor system.

It is easy to rigorously account for the effect of the RC filter on the power spectrum of the noise signal. Suppose a random signal x is passed through a time-invariant linear filter with frequency-domain transfer function h(w). Then the output is

y(w) = h(w)x(w).

Because expectation is a linear operator, we can easily relate the power spectral density of y, S_y(w), to S_x(w), the power spectral density of x, by using the definitions of y and power density. Specifically,

S_y(w) = E{y(w)y*(w)} = h(w)E{x(w)x*(w)}h*(w) = |h(w)|^2 S_x(w).

The noise from the resistor can be considered to be generated by a noise current source i, with power density

S_i(w) = 4 k_B T / R,
placed in parallel with the resistor. With the capacitor in parallel, the transfer function from the current source to the resistor voltage is just the impedance Z(w),

h(w) = Z(w) = (1/C) / (jw + 1/(RC)),

and so the noise voltage power density is

S_n(w) = (4 k_B T / (R C^2)) * 1 / (w^2 + (1/(RC))^2).
Clearly the spectrum is attenuated at high frequencies and reaches a maximum near zero.

For a vector process, we may define a matrix of power-spectral densities,

S_xx(w) = E{x(w) x^H(w)}.

The diagonal terms are simple real-valued power densities, and the off-diagonal terms are generally complex-valued cross-power densities between two variables. The cross-power density gives a measure of the correlation between the noise in two separate signals at a specific frequency. We may define a correlation coefficient as

rho_ij(w) = S_xixj(w) / [S_xi(w) S_xj(w)]^(1/2).
It is often more useful to examine the correlation coefficient because the cross-power density may be small. As an example, consider a noiseless amplifier. The noise at the input is simply a scaled version of the noise at the output, leading to rho = 1, but the cross-power density will be much smaller than the total output noise power density if the amplifier has small gain. In the vector case, the transfer function is also a matrix H(w), such that

y(w) = H(w)x(w),

and so the spectral densities at the input and output are related by

S_yy(w) = H(w)E{x(w)x^H(w)}H^H(w) = H(w)S_xx(w)H^H(w).
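The matrix relation S_yy = H S_xx H^H is easy to exercise at a single frequency. The matrices below are made-up numbers, chosen only so that S_xx is a valid (Hermitian, positive definite) spectral density matrix:

```python
import numpy as np

# Sketch of S_yy(w) = H(w) S_xx(w) H(w)^H at one frequency point.
Sxx = np.array([[2.0, 0.5 - 0.1j],
                [0.5 + 0.1j, 1.0]])      # input spectral density matrix
H = np.array([[1.0 + 0.2j, 0.3],
              [0.0, 2.0 - 1.0j]])        # transfer-function matrix at w
Syy = H @ Sxx @ H.conj().T

# Diagonal entries are real, positive output power densities; off-diagonal
# entries are the (generally complex) cross-power densities.
print(np.allclose(Syy, Syy.conj().T))    # True: Syy is Hermitian
```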
Figure 14: Noise in a time-varying circuit.

However, most interesting RF circuits contain nonlinear elements driven by time-varying signals. This introduces time-varying noise sources as well as time-varying filtering. Because most noise sources are small, and generate small perturbations to the circuit behavior, for purposes of noise analysis most RF circuits can be effectively modelled as linear time-varying systems. The simple matrix of power spectra is not sufficient to describe these systems. To see this, return to the simple resistor example. Suppose that a switch is connected between the resistor and the voltage measuring device, and that the switch is periodically opened and closed. When the switch is open, there will be no noise measured, and when the switch is closed, the thermal noise will be seen at the voltage output. A typical noise waveform is shown in Figure 14(a). The time-varying noise power E{n^2(t)} can be computed and is shown in Figure 14(b). The expected power periodically switches between zero and the value expected from the resistor noise. This is different than the resistor-only and resistor-capacitor systems considered previously. Indeed, no linear time-invariant system could create this behavior. However, if we examine the power spectrum, we again find that it is flat, corresponding to "white" noise!
At this point it is clear that E{n(t)} and E{n^2(t)} do not completely specify the random process n(t), nor does the power spectral density. To obtain a complete characterization, consider measuring n(t) at two different timepoints, t1 and t2. n(t1) and n(t2) are two separate random variables. They may be independent of each other, but in general they will have some correlation. Therefore, to completely specify the statistical characteristics of n(t1) and n(t2) together, we must specify not only the variances E{n^2(t1)} and E{n^2(t2)}, but also the covariance E{n(t1)n(t2)}. In fact, since n(t) has infinite dimension, an infinite number of these correlations must be specified to characterize the entire random process. The usual way of doing this is by defining the autocorrelation function

R_n(t, t + tau) = E{n(t)n(t + tau)}.

If x(t) is a vector process,

x(t) = [ x1(t) ]
       [ x2(t) ]

then

R_xx(t, t + tau) = E{x(t) x^H(t + tau)} = [ E{x1(t)x1(t + tau)}  E{x1(t)x2(t + tau)} ]
                                          [ E{x2(t)x1(t + tau)}  E{x2(t)x2(t + tau)} ]
where superscript H indicates Hermitian transpose. The diagonal terms give the autocorrelation function for a single entry of the vector, e.g., E{x1(t)x1(t + tau)}. For tau = 0, this is the time-varying power in the single process, e.g., E{x1^2(t)}. If the process x(t) is Gaussian, it is completely characterized by its autocorrelation function R_x(t, t + tau), since all the variances and covariances are now specified. We can also precisely define what it means for a process to be "time-independent," or stationary: a stationary process is one whose autocorrelation function is a function of tau only, not of t. This means that not only is the "noise power" E{n^2(t)} independent of t, but the correlation of the signal at one time point with the signal at another timepoint depends only on the difference tau between the timepoints. The white noise generated by the resistor, and the RC-filtered noise, are both stationary processes.
At different points in the discussion above it was claimed that the expected time-varying power E{n^2(t)} of the resistor voltage is constant in time, and also that the power density S_n(w) is constant in frequency. At first this seems odd, because a quantity that is "broad" in time should be "concentrated" in frequency, and vice versa. The answer comes in the precise relation of the spectral density to the autocorrelation function. Indeed, it turns out that the spectral density is the Fourier transform of the autocorrelation function, but with respect to the variable tau, not with respect to t. In other words, the measured spectral density is related to the correlation of a random process with time-shifted versions of itself. Formally, for a stationary process, R_n(t, t + tau) = R_n(tau), and we write

S_n(w) = integral from -infinity to infinity of e^{i w tau} R_n(tau) d tau.
From inspecting this expression we can see that what is happening is that adding a capacitor to the system creates memory. The random current process generated by the thermal noise of the resistor has no memory of itself, so the currents at separate time-instants are not correlated. However, if the current source adds a small amount of charge to the capacitor, the charge takes a finite amount of time to discharge through the resistor, creating a voltage. Thus the voltage at one time-instant is correlated with the voltage some time later, because part of the voltage at the two separated time instants is due to the same bit of added charge.

For example, in the resistor-capacitor system considered above, we can calculate the autocorrelation function R_n(tau) by an inverse Fourier transform of the power spectral density, with the result

R_n(tau) = (k_B T / C) e^{-|tau|/(RC)}.

From inspecting the autocorrelation function it is clear that the correlation effects last only as long as the time it takes any particular bit of charge to decay, in other words, a few times the RC time constant of the resistor-capacitor system. Note that the process is still stationary, because this memory effect depends only on how long has elapsed since the bit of charge was added, or rather how much time the bit of charge has had to dissipate, not the absolute time at which the charge is added. Charge added at separate times is not correlated, since arbitrary independent amounts can be added at any given instant. In particular, the time-varying noise power,

E{n^2(t)} = integral from -infinity to infinity of S_n(w) dw,

is still constant in time.
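The autocorrelation/spectrum pairing can be demonstrated in discrete time, where everything is finite. The sketch below uses a first-order filtered white noise ("discrete RC", a standard AR(1) process, not the circuit above) whose autocorrelation and spectrum are both known in closed form:

```python
import numpy as np

# Discrete-time Wiener-Khinchin check: for x[n] = a*x[n-1] + w[n] with
# unit-variance white w, the autocorrelation is R(q) = a^|q| / (1 - a^2)
# and the spectrum is S(w) = 1 / |1 - a e^{-jw}|^2. Summing R(q) e^{jwq}
# over lags must reproduce S(w).
a = 0.8
q = np.arange(-200, 201)                    # enough lags for a^|q| to vanish
R = a ** np.abs(q) / (1 - a**2)             # closed-form autocorrelation
w = np.linspace(-np.pi, np.pi, 101)
S_from_R = np.array([np.sum(R * np.exp(1j * wk * q)).real for wk in w])
S_direct = 1.0 / np.abs(1 - a * np.exp(-1j * w)) ** 2
print(np.max(np.abs(S_from_R - S_direct)))  # tiny truncation error
```

The correlation "memory" decays like a^|q|, the discrete analogue of the e^{-|tau|/RC} decay above, and the spectrum is correspondingly concentrated at low frequencies.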
Now we have seen that the variation of the spectrum in frequency is related to the correlations of the process in time. We might logically expect that, conversely, variation of the process in time (that is, non-stationarity) might have something to do with correlations of the process in frequency. To see why this might be the case, suppose we could write a random process x as a sum of complex exponentials with random coefficients,

x(t) = sum over k from -K to K of c_k e^{i w_k t}.

Then

E{|x(t)|^2} = sum over k, l from -K to K of E{c_k c_l*} e^{i(w_k - w_l)t}

is constant in time if and only if

E{c_k c_l*} = E{|c_k|^2} delta_kl.

In other words, the coefficients of expansion of sinusoids of different frequencies must be uncorrelated. In general, a stationary process is one whose frequency-domain representation contains no correlations across different frequencies. To see how frequency correlations might come about, let us return to the resistor-switch example. Let n(t) denote the voltage noise on the resistor, and h(t) the action of the switch, so that the measured voltage is given by v(t) = h(t)n(t), where h(t) is periodic with period T and frequency w0 = 2 pi / T. The time-domain multiplication by the switch becomes a convolution in the frequency domain,

v(w) = h(w) * n(w),

where * denotes convolution. Since h(t) is periodic, its frequency-domain representation is a series of Dirac deltas,

h(w) = sum over k of h_k delta(w - k w0),
and so

v(w) = sum over k of h_k n(w - k w0).

The power spectrum then involves terms of the form

E{h_k h_l* n(w - k w0) n*(w - l w0)};

since the stationary noise n has no correlations across distinct frequencies, only the k = l terms contribute, and

S_v(w) = sum over k of |h_k|^2 S_n(w - k w0).
Since S_n(w) is constant in frequency, S_v(w) is also. However, the process v is no longer stationary, because frequencies separated by multiples of w0 have been correlated by the action of the time-varying switch. We may see this effect in the time-variation of the noise power, as in Figure 14, or we may examine the correlations directly in the frequency domain. To do this, we introduce the cycle spectra S_xx^alpha(w), defined by

S_xx^alpha(w) = E{x(w) x^H(w + alpha)},

which are a sort of cross-spectral density, taken between two separate frequencies. S_xx^0(w) is just the power spectral density we have previously discussed. In fact we can define a frequency-correlation coefficient as

rho_n^alpha(w) = S_n^alpha(w) / [S_n(w) S_n(w + alpha)]^(1/2),

and if rho_n^alpha(w) = 1, then the process n has frequency content at w and w + alpha that is perfectly correlated. Consider separating out a single frequency component of a random process and multiplying by a sinusoidal waveform of frequency alpha, as shown in Figure 15. The component at w is shifted to re-appear at w + alpha and w - alpha. The new process' frequency components at w - alpha and w + alpha are deterministically related to the components of the old process located at w. Therefore, they are correlated, and S^alpha(w) is non-zero. Physically, what happens is that to form a waveform with a defined "shape" in time, the different frequency components of the signal must add in a coherent, or correlated, fashion. In a process like thermal noise, the Fourier coefficients at different frequencies have phase that is randomly distributed with respect to each other, and the Fourier components can only add incoherently: their powers add, rather than their amplitudes. Frequency correlation and time-variation of statistics are thus seen to be equivalent concepts. Another way of viewing the cycle spectra is that they represent, in a sense, the two-dimensional Fourier transform of the autocorrelation function, and are therefore just another way of expressing the statistics of the process.
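The equivalence between time-varying power and frequency correlation can be shown deterministically from the random-coefficient expansion: given a coefficient covariance matrix C with C[k, l] = E{c_k c_l*}, the expected power is E{|x(t)|^2} = sum over k, l of C[k, l] e^{i(w_k - w_l)t}. The sketch below (made-up frequencies and covariances) evaluates this sum for a diagonal and a non-diagonal C:

```python
import numpy as np

# Diagonal coefficient covariance (uncorrelated frequencies) gives constant
# expected power; off-diagonal terms (correlated frequencies) make the
# power vary in time, i.e. the process is no longer stationary.
wk = np.array([1.0, 2.0, 3.0])              # frequencies present in x(t)
t = np.linspace(0, 2 * np.pi, 400)
E = np.exp(1j * np.outer(t, wk))            # e^{i w_k t}, shape (t, k)

def expected_power(C):
    # sum_{k,l} E[t,k] C[k,l] conj(E[t,l]); real for Hermitian C
    return np.einsum('tk,kl,tl->t', E, C, E.conj()).real

C_diag = np.diag([1.0, 0.5, 0.25])          # stationary case
C_corr = C_diag + 0.2 * (np.eye(3, k=1) + np.eye(3, k=-1))  # correlated
p_stat = expected_power(C_diag)             # constant at 1.75
p_cyc = expected_power(C_corr)              # oscillates with t
print(np.ptp(p_stat), np.ptp(p_cyc))        # ~0 versus order-1 swing
```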
Again supposing the signal n to be cyclostationary with period T, for each sample phase phi in [0, T) we may define the discrete-time autocorrelation function R_n^phi(p, q) to be

R_n^phi(p, p + q) = R_n(phi + pT, phi + (p + q)T).

Because the cyclostationary process R_n is periodic, by inspection R_n^phi(p, p + q) is independent of p and thus stationary, i.e.,

R_n^phi(p, p + q) = R_n^phi(q).

R_n^phi(0) gives the expected noise power for the signal at phase phi. Plotting R_n^phi(0) versus phi will show how the noise power varies periodically with time. The discrete-time process R_n^phi(q) can be described in the frequency domain by its discrete Fourier transform,

S_n^phi(f) = sum over q from -infinity to infinity of R_n^phi(q) e^{i 2 pi q f T}.

Note that the spectrum of the discrete sampled process is periodic in frequency with period 1/T. All noise power is aliased into the Nyquist interval [-1/(2T), 1/(2T)] or, equivalently, the interval [0, 1/T]. Generally it is the noise spectrum which is available from the circuit simulator. To obtain the autocorrelation function or time-varying noise power, an inverse Fourier integral must be calculated. R_n^phi(q) is given by

R_n^phi(q) = T * integral from 0 to 1/T of S_n^phi(f) e^{-i 2 pi q f T} df.
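The sampled-phase construction is easy to demonstrate by simulation. The sketch below modulates white noise with an assumed square-wave-like envelope p(t), then estimates R_phi(0) at each sample phase:

```python
import numpy as np

# Sampled-phase view of a cyclostationary process: white noise multiplied by
# a periodic envelope p(t). Sampling at one fixed phase phi per period yields
# a stationary discrete-time process, and R_phi(0) -- the noise power at that
# phase -- traces out p(phi)^2 as phi is swept over the period.
rng = np.random.default_rng(7)
Nper, M = 32, 4000                    # sample phases per period, periods
phase = np.arange(Nper) / Nper
p = np.where(phase < 0.5, 1.0, 0.2)   # square-wave-like envelope (assumed)
n = p[None, :] * rng.standard_normal((M, Nper))   # M periods of noise
R0 = np.mean(n**2, axis=0)            # R_phi(0) for each sample phase phi

ratio = R0[: Nper // 2].mean() / R0[Nper // 2 :].mean()
print(ratio)                          # near (1.0 / 0.2)**2 = 25
```

Each column of n, viewed on its own, is a stationary white sequence; only the comparison across phases reveals the cyclostationarity.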
D Summary
1. All useful noise metrics can be interpreted in terms of correlations. Physically these can be interpreted as expected values of two-term products. In the case of random vectors these are expected values of vector outer products.

2. The power spectral density of a variable indexed i is S_xixi(w) = E{x_i(w) x_i*(w)}. This is what the current SpectreRF PNOISE analysis computes.

3. S_xx(w) is constant if and only if x is a white noise process. In that case R_xx(tau) = R delta(tau); there are no correlations in time for the process.

4. The cross-power density of two variables x_i, x_j is S_xixj(w) = E{x_i(w) x_j(w)*}. S_xixj is zero if and only if the two variables have zero correlation at that frequency. A correlation coefficient may be defined as rho_ij(w) = S_xixj(w) / [S_xi(w) S_xj(w)]^(1/2), with |rho_ij(w)| in [0, 1].

5. The cycle spectra S_xx^alpha(w) represent correlations between frequencies separated by the cycle frequency alpha: S_xx^alpha(w) = E{x(w) x^H(w + alpha)}. For a single process x_i, a correlation coefficient may be defined as rho_xi^alpha(w) = S_xixi^alpha(w) / [S_xi(w) S_xi(w + alpha)]^(1/2), with |rho_xi^alpha(w)| in [0, 1].

6. A process is stationary if and only if S_xx^alpha(w) is zero for all w and all alpha != 0, that is, if there are no correlations in frequency for the process. In other words, only S_xx^0(w) is nonzero.

7. A process is cyclostationary if S_xx^alpha is zero for all alpha != m w0, for some w0 and integer m. Frequencies separated by m w0 are correlated. A stationary process passed through a periodically time-varying linear filter will in general be cyclostationary, with w0 the fundamental harmonic of the filter.

8. We may also compute correlations between different nodes at different frequencies, with the obvious interpretation and generalization of the correlation coefficients.
References
[1] R. Telichevesky, K. S. Kundert, and J. K. White, "Efficient AC and noise analysis of two-tone RF circuits," in Proceedings of the 1996 Design Automation Conference, June 1996.

[2] J. A. Dobrowolski, Introduction to Computer Methods for Microwave Circuit Analysis and Design. Boston: Artech House, 1991.

[3] J. Roychowdhury, D. Long, and P. Feldmann, "Cyclostationary noise analysis of large RF circuits with multitone excitations," IEEE Journal of Solid-State Circuits, vol. 33, pp. 324-336, March 1998.