\section{Signals}

\subsection{Classification}

\begin{figure}[h]
  \centering
  \begin{tikzpicture}[
      nodes = {
        thick, draw = black, fill = lightgray!20,
        align = center, inner sep = 2mm, outer sep = 1mm,
      },
      sibling distance = 3cm,
    ]
    \node {All signals}
      child {node {Class 1 \\ \(0 < E_n < \infty\)}}
      child {
        node {Class 2 \\ \(0 < P_n < \infty\)}
        child {node {Class 2a \\ periodic}}
        child {node {Class 2b \\ stochastic}}
      };
  \end{tikzpicture}
\end{figure}

\subsection{Properties}

For class 2b signals the formulas for class 2a signals can be used by taking \(\lim_{T\to\infty} f_\text{2a}(T)\) (if the limit exists). The notation \(\int_T\) is short for an integral from \(-T/2\) to \(T/2\).
\begin{table}[h]
  \everymath={\displaystyle}
  \[
    \begin{array}{l l}
      \toprule
      \text{\bfseries Characteristic} & \text{\bfseries Symbol and formula} \\[6pt]
      \text{\itshape Class 1 Signals} \\
      \midrule
      \text{Normalized energy} & E_n = \lim_{T\to\infty} \int_T |x|^2 \,dt \\[6pt]
      \text{\itshape Class 2a Signals} \\
      \midrule
      \text{Normalized power} & P_n = \lim_{T\to\infty} \frac{1}{T} \int_T |x|^2 \,dt \\[12pt]
      \text{Linear mean} & X_0 = \frac{1}{T} \int_T x \,dt \\[12pt]
      \text{Mean square} & X^2 = \frac{1}{T} \int_T x^2 \,dt \\[12pt]
      n\text{-th order mean} & X^n = \frac{1}{T} \int_T x^n \,dt \\[12pt]
      \text{Rectified value} & \overline{|X|} = \frac{1}{T} \int_T |x| \,dt \\[12pt]
      \text{Variance} & \sigma^2 = \frac{1}{T} \int_T \left(x - X_0\right)^2 dt \\[12pt]
      & = X^2 - (X_0)^2 \\[6pt]
      \text{Root mean square} & X_\text{rms} = \sqrt{X^2} \\
      \bottomrule
    \end{array}
  \]
\end{table}

\subsection{Correlation}

\paragraph{Autocorrelation}
The \emph{autocorrelation} is a measure of how coherent a signal is, i.e.\ how similar it is to a time-shifted copy of itself. For class 1 signals the autocorrelation is
\[
  \varphi_{xx}(\tau) = \lim_{T\to\infty} \int_T x(t)\, x(t - \tau) \,dt,
\]
whereas for class 2a and 2b signals
\begin{gather*}
  \varphi_{xx}(\tau) = \frac{1}{T} \int_T x(t)\, x(t - \tau) \,dt \quad\text{(2a)}, \\
  \varphi_{xx}(\tau) = \lim_{T\to\infty} \frac{1}{T} \int_T x(t)\, x(t - \tau) \,dt \quad\text{(2b)}.
\end{gather*}
Properties of \(\varphi_{xx}\):
\begin{itemize}
  \item \(\varphi_{xx}(0) = X^2 = (X_0)^2 + \sigma^2\)
  \item \(\varphi_{xx}(0) \geq |\varphi_{xx}(\tau)|\)
  \item \(\varphi_{xx}(\tau) \geq (X_0)^2 - \sigma^2\)
  \item \(\varphi_{xx}(\tau) = \varphi_{xx}(\tau + nT)\) (periodic)
  \item \(\varphi_{xx}(\tau) = \varphi_{xx}(-\tau)\) (even, symmetric)
\end{itemize}
The Fourier transform of the autocorrelation, \(\Phi_{xx}(j\omega) = \fourier \varphi_{xx}(\tau)\), is called the \emph{energy spectral density} (ESD) for class 1 signals or the \emph{power spectral density} (PSD) for class 2 signals.

\paragraph{Cross correlation}
The \emph{cross correlation} measures the similarity of two different signals \(x\) and \(y\). For class 1 signals
\[
  \varphi_{xy}(\tau) = \lim_{T\to\infty} \int_T x(t)\, y(t - \tau) \,dt.
\]
Similarly, for class 2a and 2b signals
\begin{gather*}
  \varphi_{xy}(\tau) = \frac{1}{T} \int_T x(t)\, y(t - \tau) \,dt \quad\text{(2a)}, \\
  \varphi_{xy}(\tau) = \lim_{T\to\infty} \frac{1}{T} \int_T x(t)\, y(t - \tau) \,dt \quad\text{(2b)}.
\end{gather*}
Properties of \(\varphi_{xy}\):
\begin{itemize}
  \item For periodic signals of different frequencies, \(\varphi_{xy}\) is always 0.
  \item For independent (uncorrelated) zero-mean stochastic signals, \(\varphi_{xy} = 0\).
\end{itemize}

\subsection{Amplitude density}

The amplitude density \(p(x)\) is the probability density that the signal takes a given amplitude \(x\) during a time interval \(T\), i.e.\ the fraction of time spent near that amplitude:
\[
  p(x) = \frac{1}{T}\frac{dt}{dx}, \qquad \int p(x) \,dx = 1.
\]
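\paragraph{Worked example}
To illustrate the formulas of this section, consider the class 2a signal \(x(t) = A\sin(\omega_0 t)\) with period \(T = 2\pi/\omega_0\); the amplitude \(A\) and angular frequency \(\omega_0\) are generic placeholders, not symbols defined elsewhere in this section. Evaluating the characteristics over one period gives
\begin{align*}
  X_0 &= \frac{1}{T}\int_T A\sin(\omega_0 t)\,dt = 0, &
  X^2 &= \frac{1}{T}\int_T A^2\sin^2(\omega_0 t)\,dt = \frac{A^2}{2}, \\
  \overline{|X|} &= \frac{1}{T}\int_T |A\sin(\omega_0 t)|\,dt = \frac{2A}{\pi}, &
  \sigma^2 &= X^2 - (X_0)^2 = \frac{A^2}{2}, \\
  P_n &= X^2 = \frac{A^2}{2}, &
  X_\text{rms} &= \sqrt{X^2} = \frac{A}{\sqrt{2}}.
\end{align*}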
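With the product-to-sum identity \(\sin\alpha\sin\beta = \tfrac{1}{2}\left[\cos(\alpha - \beta) - \cos(\alpha + \beta)\right]\), the class 2a autocorrelation formula yields
\[
  \varphi_{xx}(\tau) = \frac{1}{T}\int_T A^2 \sin(\omega_0 t)\sin\bigl(\omega_0 (t - \tau)\bigr)\,dt
  = \frac{A^2}{2}\cos(\omega_0 \tau),
\]
since the \(\cos(2\omega_0 t - \omega_0\tau)\) term averages to zero over one period. The result is even, periodic with period \(T\), and satisfies \(\varphi_{xx}(0) = A^2/2 = X^2\), in agreement with the listed properties.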
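The same signal also illustrates the amplitude density (under the reading that \(dt\) is the total time per interval \(T\) spent at amplitudes in \([x, x + dx]\)). The sinusoid passes each level \(|x| < A\) twice per period, so this time is \(2\,|dt/dx|\,dx\); with \(dx/dt = A\omega_0\cos(\omega_0 t)\) and \(|\cos(\omega_0 t)| = \sqrt{1 - (x/A)^2}\),
\[
  p(x) = \frac{2}{T}\,\frac{1}{\omega_0\sqrt{A^2 - x^2}} = \frac{1}{\pi\sqrt{A^2 - x^2}},
  \qquad |x| < A,
\]
which integrates to 1 and grows without bound as \(|x| \to A\): the sinusoid spends most of its time near its extremes.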