Information Systems Data Book

Cambridge University Engineering Department

Information Systems Reference

Control systems, communication theory, information theory, and coding. Content on mathematical transforms and probability has been consolidated in Mathematical Foundations.

Source: Information Data Book (2017 Edition, revised 2019 & 2021), Cambridge University Engineering Department


1. Control Systems

1.1 Closed-Loop System

Return ratio: $L(s) = H(s)G(s)K(s)$

Closed-loop transfer function: $\frac{G(s)K(s)}{1+L(s)}$

1.2 Stability

Stable iff the roots of $1 + L(s) = 0$ have negative real parts.
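
A quick numerical check of this condition (a sketch, not from the databook; the return ratio below is made up and `numpy` is assumed):

```python
import numpy as np

# Hypothetical return ratio L(s) = 10 / (s^3 + 6s^2 + 11s + 6),
# i.e. K(s)G(s)H(s) with three stable real poles and gain 10.
num = np.array([10.0])                  # numerator of L(s)
den = np.array([1.0, 6.0, 11.0, 6.0])   # denominator of L(s)

# 1 + L(s) = 0  <=>  den(s) + num(s) = 0 after clearing the denominator.
char_poly = den.copy()
char_poly[-len(num):] += num            # add numerator coefficients

poles = np.roots(char_poly)
print(poles)
print("stable:", np.all(poles.real < 0))
```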

1.3 Routh-Hurwitz Criteria

For polynomial: $a_n s^n + a_{n-1}s^{n-1} + \cdots + a_0$

Second order: all $a_i > 0$

Third order: all $a_i > 0$ and $a_2 a_1 > a_3 a_0$

Fourth order: all $a_i > 0$ and $a_3 a_2 a_1 > a_4 a_1^2 + a_3^2 a_0$
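
These low-order tests are easy to script. A minimal checker (a sketch, assuming the polynomial has been normalised so that its coefficients should all be positive):

```python
def routh_hurwitz_ok(coeffs):
    """Stability check for polynomials up to 4th order.

    coeffs = [a_n, ..., a_1, a_0], highest power first.
    """
    a = list(coeffs)
    if any(c <= 0 for c in a):          # all coefficients must be positive
        return False
    n = len(a) - 1                      # polynomial order
    if n <= 2:
        return True                     # positivity alone suffices
    if n == 3:
        a3, a2, a1, a0 = a
        return a2 * a1 > a3 * a0
    if n == 4:
        a4, a3, a2, a1, a0 = a
        return a3 * a2 * a1 > a4 * a1**2 + a3**2 * a0
    raise ValueError("only orders <= 4 handled here")

print(routh_hurwitz_ok([1, 6, 11, 16]))   # True: stable
print(routh_hurwitz_ok([1, 1, 2, 8]))     # False: a2*a1 = 2 < a3*a0 = 8
```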

1.4 Nyquist Criterion

The closed-loop system with return ratio $k\,g(s)$ is stable iff the Nyquist plot of $g(s)$ encircles the point $-1/k$ anticlockwise exactly as many times as $g(s)$ has right-half-plane poles.
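
The encirclement count can be approximated numerically by tracking the phase of $g(j\omega) + 1/k$ along a frequency sweep. A sketch with a made-up stable $g(s)$ (no RHP poles, so stability requires zero net encirclements):

```python
import numpy as np

# Hypothetical open loop g(s) = 1 / ((s+1)(s+2)(s+3)).
def g(s):
    return 1.0 / ((s + 1) * (s + 2) * (s + 3))

k = 10.0
w = np.linspace(-1e3, 1e3, 200_001)         # frequency sweep (rad/s)
z = g(1j * w) + 1.0 / k                     # vector from -1/k to the plot

# Net encirclements = total phase change / 2*pi along the sweep.
phase = np.unwrap(np.angle(z))
encirclements = (phase[-1] - phase[0]) / (2 * np.pi)
print(round(encirclements))                 # 0 -> stable for this k
```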

1.5 Root Locus

Roots of $1 + k\,g(s) = 0$.

Angle condition: $\angle g(s) = (2m+1)\pi$

Magnitude condition: $|g(s)| = \frac{1}{k}$
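
A crude root-locus sketch simply solves $1 + k\,g(s) = 0$ for a few values of $k$; the example plant below is hypothetical:

```python
import numpy as np

# Hypothetical open loop g(s) = 1 / (s (s+1) (s+2)); the locus traces the
# roots of s^3 + 3s^2 + 2s + k = 0 as k increases (marginal at k = 6).
den = np.array([1.0, 3.0, 2.0, 0.0])    # s(s+1)(s+2)
num = np.array([1.0])

for k in [0.1, 1.0, 6.0, 10.0]:
    char_poly = den.copy()
    char_poly[-len(num):] += k * num    # denominator + k * numerator
    roots = np.roots(char_poly)
    print(f"k = {k:5.1f}: {np.sort_complex(roots)}")
```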

1.6 Bode Diagrams

Standard first- and second-order forms (see plots in original databook).


2. Communication

2.1 Analogue Modulation

AM (Amplitude Modulation): $s(t) = [a_0 + x(t)]\cos(2\pi f_c t)$

FM (Frequency Modulation): $s(t) = a_0 \cos\left(2\pi f_c t + 2\pi k_F \int_0^t x(u)\,du\right)$

Carson’s rule (FM bandwidth): $B \approx 2W + 2\Delta f$, where $W$ is the message bandwidth and $\Delta f = k_F \max|x(t)|$ is the peak frequency deviation.
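
A minimal single-tone FM synthesis illustrating Carson's rule (illustrative parameter values, not from the databook; the integral is approximated by a cumulative sum):

```python
import numpy as np

# Single-tone message x(t) = cos(2 pi f_m t), so W = f_m and the peak
# deviation is delta_f = k_F * max|x(t)| = k_F.
f_c, f_m, k_F = 10_000.0, 100.0, 500.0    # carrier, message, deviation (Hz)
fs = 100_000.0                            # sample rate (Hz)
t = np.arange(0, 0.1, 1 / fs)

x = np.cos(2 * np.pi * f_m * t)
phase = 2 * np.pi * k_F * np.cumsum(x) / fs   # approximates the integral
s = np.cos(2 * np.pi * f_c * t + phase)       # FM waveform

B = 2 * f_m + 2 * k_F                         # Carson's rule
print(f"Carson bandwidth ~ {B:.0f} Hz")       # ~1200 Hz
```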

2.2 Digital Communication

Quantisation SNR: $\mathrm{SNR} = 1.76 + 6.02n$ dB

where $n$ is the number of bits.
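
An empirical check of this formula, assuming a full-scale sine input and a uniform mid-rise quantiser (a sketch only):

```python
import numpy as np

n = 8
levels = 2 ** n
t = np.linspace(0, 1, 1_000_000, endpoint=False)
x = np.sin(2 * np.pi * 137 * t)            # full-scale sine in [-1, 1]

step = 2.0 / levels                        # quantiser step size
xq = (np.floor(x / step) + 0.5) * step     # mid-rise uniform quantiser
noise = x - xq                             # quantisation error

snr_db = 10 * np.log10(np.mean(x**2) / np.mean(noise**2))
print(f"measured {snr_db:.2f} dB vs predicted {1.76 + 6.02 * n:.2f} dB")
```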

PAM (Pulse Amplitude Modulation): $x(t) = \sum_k X_k p(t - kT)$

QAM (Quadrature Amplitude Modulation): $x(t) = \sum_k \Re\{X_k e^{j2\pi f_c t}\}\, p(t - kT)$
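
A baseband 4-PAM sketch with a rectangular pulse (all values illustrative); QAM follows the same pattern with complex symbols modulated onto a carrier:

```python
import numpy as np

rng = np.random.default_rng(0)
T, sps = 1.0, 16                               # symbol period, samples/symbol
symbols = rng.choice([-3, -1, 1, 3], size=8)   # 4-PAM symbol sequence X_k

pulse = np.ones(sps)                  # rectangular p(t) of width T
x = np.kron(symbols, pulse)           # superpose the shifted pulses
print(symbols)
print(x[:2 * sps])                    # first two symbol periods
```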

2.3 Wireless Channel

If $h \sim \mathcal{CN}(0, \sigma^2)$, then $|h|^2 \sim \text{Exponential}\left(\frac{1}{\sigma^2}\right)$
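
A Monte-Carlo check of this fact (a sketch): sampling $h$ with independent real and imaginary parts of variance $\sigma^2/2$ each, $|h|^2$ should have mean $\sigma^2$ and $P(|h|^2 > \sigma^2) \approx e^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 2.0
n = 1_000_000
h = (rng.normal(0, np.sqrt(sigma2 / 2), n)
     + 1j * rng.normal(0, np.sqrt(sigma2 / 2), n))

gain = np.abs(h) ** 2
print(np.mean(gain))              # ~ sigma^2 = 2.0
print(np.mean(gain > sigma2))     # ~ exp(-1) ~ 0.368
```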


3. Information Theory

3.1 Entropy

$H(X) = \sum_x P(x) \log \frac{1}{P(x)}$
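
In code (a sketch, using base-2 logs so the result is in bits):

```python
import numpy as np

def entropy(p):
    """Entropy of a pmf in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 log(1/0) terms contribute nothing
    return np.sum(p * np.log2(1 / p))

print(entropy([0.5, 0.5]))             # 1.0 bit
print(entropy([0.5, 0.25, 0.25]))      # 1.5 bits
```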

3.2 Mutual Information

$I(X;Y) = H(X) - H(X|Y) = D(P_{XY} \| P_X P_Y)$
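
A sketch computing $I(X;Y)$ directly as the divergence $D(P_{XY} \| P_X P_Y)$; the binary symmetric channel example is illustrative:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint pmf (rows: x, columns: y)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask]))

# Binary symmetric channel, crossover 0.1, uniform input:
p = 0.1
p_xy = np.array([[0.5 * (1 - p), 0.5 * p],
                 [0.5 * p, 0.5 * (1 - p)]])
print(mutual_information(p_xy))        # 1 - H2(0.1) ~ 0.531 bits
```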

3.3 Differential Entropy

$h(X) = \int p(x) \log \frac{1}{p(x)}\,dx$
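
A numerical check against the known closed form $h(X) = \tfrac{1}{2}\log_2(2\pi e \sigma^2)$ bits for a Gaussian (a sketch; plain Riemann summation):

```python
import numpy as np

sigma = 1.5
x = np.linspace(-12 * sigma, 12 * sigma, 2_000_001)
p = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

dx = x[1] - x[0]
h_numeric = np.sum(p * np.log2(1 / p)) * dx   # Riemann sum of the integral
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_closed)             # both ~ 2.63 bits
```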

3.4 Key Inequalities

Data-processing inequality: if $X \to Y \to Z$ form a Markov chain, then $I(X;Y) \ge I(X;Z)$

Fano’s inequality: $H(X|Y) \le 1 + P_e \log|\mathcal{X}|$, where $P_e$ is the probability of error in estimating $X$ from $Y$
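
A toy numeric verification of Fano's inequality on a made-up symmetric channel (a sketch):

```python
import numpy as np

# X uniform on {0,1,2}; Y = X with prob 0.8, else one of the other symbols.
eps = 0.2
p_xy = (np.eye(3) * (1 - eps) + (1 - np.eye(3)) * eps / 2) / 3  # joint pmf

p_y = p_xy.sum(axis=0)
# Conditional entropy H(X|Y) = sum_y p(y) H(X | Y = y), in bits.
h_x_given_y = 0.0
for y in range(3):
    cond = p_xy[:, y] / p_y[y]
    h_x_given_y += p_y[y] * np.sum(cond * np.log2(1 / cond))

# The MAP estimate of X from Y is Y itself here, so P_e = eps.
bound = 1 + eps * np.log2(3)
print(h_x_given_y, "<=", bound)        # ~0.922 <= ~1.317
```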


4. Coding Theory

4.1 Linear Block Codes

Rate: $R = \frac{k}{n}$

Singleton bound: $d_{\min} \le n - k + 1$
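
Both quantities are easy to evaluate for a small code. A sketch using the (7,4) Hamming code, exploiting the fact that for a linear code $d_{\min}$ equals the minimum nonzero codeword weight:

```python
import numpy as np

# Generator matrix of the (7,4) Hamming code in systematic form.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
n, k = G.shape[1], G.shape[0]

# Enumerate all 2^k codewords and take the minimum nonzero weight.
msgs = np.array([[(m >> i) & 1 for i in range(k)] for m in range(2 ** k)])
codewords = msgs @ G % 2
weights = codewords.sum(axis=1)
d_min = weights[weights > 0].min()

print(f"R = {k}/{n}, d_min = {d_min}, Singleton bound = {n - k + 1}")
```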

4.2 LDPC Codes

Density evolution (BEC): $p_t = \varepsilon\,\lambda\big(1 - \rho(1 - p_{t-1})\big)$

LLR for AWGN (BPSK with $\pm 1$ signalling, noise variance $\sigma^2$): $L(y) = \frac{2y}{\sigma^2}$
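
Iterating the density-evolution recursion for the regular (3,6) ensemble, where $\lambda(x) = x^2$ and $\rho(x) = x^5$ (a sketch; the BEC threshold for this ensemble is approximately 0.4294):

```python
# Below the threshold the erasure fraction p_t dies out; above it,
# p_t stalls at a positive fixed point.
def density_evolution(eps, iters=1000):
    p = eps
    for _ in range(iters):
        p = eps * (1 - (1 - p) ** 5) ** 2   # eps * lambda(1 - rho(1 - p))
    return p

for eps in [0.40, 0.42, 0.44, 0.46]:
    print(f"epsilon = {eps:.2f}: p_1000 = {density_evolution(eps):.3e}")
```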

4.3 Finite Fields and Reed-Solomon Codes

DFT over GF($q$): $X_k = \sum_{m=0}^{n-1} x_m \alpha^{mk}$

Inverse: $x_m = \frac{1}{n^*} \sum_{k=0}^{n-1} X_k \alpha^{-mk}$, where $n^*$ denotes $n$ reduced modulo the field characteristic.
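
A round-trip sketch over the prime field GF(5), where $\alpha = 2$ has order $n = 4$ and $n^* = n \bmod 5$ (pure Python, using `pow` for modular inverses):

```python
p, n, alpha = 5, 4, 2      # GF(5); alpha = 2 has multiplicative order 4

def gf_dft(x, inverse=False):
    a = pow(alpha, p - 2, p) if inverse else alpha   # alpha^{-1} for inverse
    X = [sum(xm * pow(a, m * k, p) for m, xm in enumerate(x)) % p
         for k in range(n)]
    if inverse:
        n_inv = pow(n % p, -1, p)                    # 1 / n^* in GF(p)
        X = [(n_inv * Xk) % p for Xk in X]
    return X

x = [1, 2, 0, 3]
X = gf_dft(x)
print(X, gf_dft(X, inverse=True))      # round trip recovers x
```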

Reed-Solomon codes are MDS with $d_{\min} = n - k + 1$.

