Chapter 1

Introduction to Statistical Signal Processing

Abdelhak M. Zoubir, Signal Processing Group, Technische Universität Darmstadt, Germany

Acknowledgments

I am grateful to the contributors as well as the colleagues who provided critical feedback on those contributions.

3.01.1 A brief historical recount

Signals are either random in nature or deterministic but subject to random measurement errors. Therefore, statistical methods for signal processing, or statistical signal processing for short, are the signal processing practitioner's choice for extracting useful information from measurements. Today, one can confidently claim that statistical signal processing is performed in every specialization of signal processing. Early stages of statistical signal processing concerned parameter (signal) estimation, signal detection, and signal classification. These have their roots in probability theory and mathematical statistics. For example, parameter estimation and signal detection are known in mathematical statistics as point estimation [1] and statistical hypothesis testing [2,3], respectively, while fundamental concepts of classification were treated in classical texts such as [4].

It is difficult to trace back when the term statistical signal processing was established. Undoubtedly, statistical signal processing was performed even before the birth of digital signal processing, which started soon after the discovery of the Fast Fourier Transform by Cooley and Tukey [5]. Themes in statistical signal processing first appeared as parts of broader workshops in the early 1980s, such as the IEEE Workshop on Spectrum Estimation and Modeling, the IEEE International Workshop on Statistical Signal and Array Processing, and the IEEE International Workshop on Higher-Order Statistics, to mention a few. The year 2001 gave birth to the IEEE International Workshop on Statistical Signal Processing, which is organized biennially. Textbooks on the subject have been around since the 1990s; they include, for example, [6-8].

3.01.2 Content

Today, statistical signal processing finds a wide range of cross-fertilization and applications far beyond signal estimation, detection, or classification. Examples include communications and networking, target tracking, (adaptive) filtering, multi-dimensional signal processing, and machine learning, to mention a few. Recently, I came across a few newly edited books on statistical signal processing for neuroscience. Their chapter authors, drawn from both signal processing and neural computation, aim at promoting interaction between the two disciplines. This is surely neither the first nor the last attempt to bring two communities together. Signal processing, and in particular statistical signal processing, is a dynamic discipline that calls for cross-fertilization between fields. Unsurprisingly, you will find a wide range of statistical signal processing treatments in other sections of this e-reference, such as Signal Processing for Communications or Signal Processing for Machine Learning.

I was honored and delighted when Sergios Theodoridis approached me to be the area editor for this section. I first thought of key advances in statistical signal processing. They are numerous, but space did not allow for all of these areas to be covered in this section. I had to select a subset, and I approached researchers who are not only among the very best in these areas but also experienced in writing tutorial-style articles. Some of the contributors preferred to wait for the second edition of the e-reference for their manuscripts to appear.

3.01.3 Contributions

The contributions in this section cover recent advances in detection, estimation, and applications of statistical signal processing. All of the chapters are written in a tutorial style.

3.01.3.1 Quickest change detection

An important problem in engineering practice, such as engine monitoring, is to detect anomalies or changes in the environment as quickly as possible, subject to false alarm constraints. Such changes can be captured by a change in the distribution of the observed data. The authors provide two formulations of quickest change detection, a Bayesian and a minimax approach, along with their (asymptotically) optimal solutions. The authors also discuss decentralized quickest change detection and provide various algorithms that are asymptotically optimal. Several open problems in the context of quickest change detection are given. Among these, the authors mention an old but to date unsatisfactorily solved problem with a large impact on statistical signal processing practice, namely transient detection.
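
The minimax formulation is commonly addressed with Page's CUSUM procedure. As a minimal illustration (not taken from the chapter), the following sketch detects a shift in the mean of i.i.d. Gaussian data; the pre- and post-change means, noise level, and threshold are assumptions made for the example:

```python
import numpy as np

def cusum(x, mu0, mu1, sigma, threshold):
    """Page's CUSUM for a mean shift in i.i.d. Gaussian data.

    Accumulates the log-likelihood ratio of the post-change versus
    pre-change distribution, clipped at zero, and raises an alarm the
    first time the statistic exceeds `threshold`.
    """
    s = 0.0
    for n, xn in enumerate(x):
        # log-likelihood ratio of N(mu1, sigma^2) vs. N(mu0, sigma^2)
        llr = (mu1 - mu0) / sigma**2 * (xn - (mu0 + mu1) / 2.0)
        s = max(0.0, s + llr)
        if s > threshold:
            return n          # alarm time (sample index)
    return None               # no alarm raised

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 200),    # pre-change samples
                       rng.normal(1.0, 1.0, 100)])   # mean shifts at n = 200
alarm = cusum(data, mu0=0.0, mu1=1.0, sigma=1.0, threshold=10.0)
```

The trade-off the chapter formalizes is visible here: raising the threshold lowers the false alarm rate at the price of a longer detection delay.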

3.01.3.2 Distributed signal detection

Signal detection with a single sensor is a well-established theory with a wide range of applications, such as radar, sonar, communications, and biomedicine. Today, we encounter an enormous growth of multi-sensor detection. For example, in wireless sensor networks, one aims at making a global decision based on local decisions. The deployment of multiple sensors for signal detection improves system survivability and yields improved detection performance or a shorter decision time to attain a preset performance level. In classical multi-sensor detection, local sensors transmit their raw data to a central processor where optimal detection is carried out. This has its drawbacks, including a high communication bandwidth. Distributed processing has the advantage that local sensors with low energy consumption carry out preliminary processing and communicate only the information relevant to the global objective, such as the decision on the presence or absence of a target in radar. This leads to low energy consumption, reduced communication bandwidth, and increased system reliability. The chapter on distributed signal detection surveys the most recent advances in distributed detection, such as distributed detection in the presence of dependent observations using copula theory.
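
A minimal sketch of this setup, assuming i.i.d. Gaussian observations and a simple counting (k-out-of-n) fusion rule; the sensor count, local threshold, and fusion rule below are illustrative choices, not taken from the chapter:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sensors, tau, k = 9, 0.5, 5     # 9 sensors, majority (5-out-of-9) rule

def fuse(x):
    # each sensor sends a one-bit decision (observation > tau); the
    # fusion center declares "signal present" if at least k agree
    return int(np.sum(x > tau) >= k)

def decision_rate(mean, trials=2000):
    # Monte Carlo estimate of how often the fused decision is 1
    x = rng.normal(mean, 1.0, (trials, n_sensors))
    return np.mean([fuse(xi) for xi in x])

pd = decision_rate(1.0)    # detection probability under H1 (mean 1)
pfa = decision_rate(0.0)   # false alarm probability under H0 (mean 0)
```

Even though each sensor communicates only one bit, the fused decision separates the two hypotheses far better than any single local decision does.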

3.01.3.3 Diffusion adaptation over networks

Wireless sensor networks, which consist of spatially distributed autonomous sensors, are becoming fundamental to engineering practice. The sensors, or agents, in the network are tasked with monitoring physical or environmental conditions, such as temperature, pressure, or vibrations, and cooperatively these sensors reach a global decision. The chapter on Diffusion Adaptation over Networks approaches the problem of global inference using adaptation and learning, important abilities of a collection of agents linked together through a connection topology. Adaptive networks are well suited to performing decentralized information processing and optimization in real time. One of the advantages of such networks is the continuous diffusion of information across the network, which enables adaptation of performance in relation to changing data and network conditions. This overview of diffusion strategies for adaptation and learning over networks provides the fundamental principles and articulates the improved adaptation and learning performance of such networks relative to non-cooperative networks.
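
An adapt-then-combine diffusion LMS strategy of the kind discussed in the chapter can be sketched as follows; the ring topology, uniform combination weights, step size, and noise level are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(2)
w_true = np.array([1.0, -0.5])          # common parameter all agents estimate
n_agents, mu, iters = 5, 0.05, 2000

# ring topology: each agent combines with itself and its two neighbours,
# using uniform (row-stochastic) combination weights
A = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    for j in (i - 1, i, i + 1):
        A[i, j % n_agents] = 1.0 / 3.0

w = np.zeros((n_agents, 2))             # each agent's local estimate
for _ in range(iters):
    psi = np.empty_like(w)
    for i in range(n_agents):
        u = rng.normal(0, 1, 2)                     # local regressor
        d = u @ w_true + 0.1 * rng.normal()         # noisy local measurement
        psi[i] = w[i] + mu * (d - u @ w[i]) * u     # adapt: local LMS step
    w = A @ psi                                     # combine: diffuse estimates
max_err = np.linalg.norm(w - w_true, axis=1).max()
```

Every agent runs only a local LMS update and exchanges estimates with its neighbours, yet all agents converge to the common parameter, which is the diffusion effect the chapter analyzes.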

3.01.3.4 Non-stationary signal analysis—a time-frequency approach

Non-stationary signal analysis plays an important role in statistical signal processing. For example, in analyzing automotive engine signals, classical spectral analysis approaches fail because they do not capture the non-stationary nature of signals due to the motion of the piston. Linear time-frequency approaches, such as the spectrogram, capture the non-stationary nature of signals whose spectral contents vary with time. This class of time-frequency representations has its advantages, but also its drawbacks. Quadratic time-frequency representations, although they lose the linearity property, have the advantage of providing a higher time-frequency concentration than linear methods. Higher-order time-frequency representations have also been proposed, as they further improve the time-frequency concentration for certain classes of non-stationary signals. In this chapter, Ljubiša Stanković et al. provide an overview of state-of-the-art methods for non-stationary signal analysis and their applications to real-life problems, including inverse synthetic aperture radar.
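
As a minimal numerical illustration of the spectrogram, the peak of each time frame tracks the instantaneous frequency of a linear chirp; the window length and hop size below are arbitrary choices for the example:

```python
import numpy as np

fs = 1000                                   # sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)
# linear chirp: instantaneous frequency sweeps from 50 Hz to 250 Hz
x = np.cos(2 * np.pi * (50 * t + 50 * t**2))

def spectrogram(x, nwin=128, hop=32):
    """Magnitude-squared STFT with a Hann window (the spectrogram)."""
    win = np.hanning(nwin)
    frames = [x[i:i + nwin] * win for i in range(0, len(x) - nwin, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

S = spectrogram(x)
# the ridge (peak per time frame) follows the instantaneous frequency
ridge_hz = S.argmax(axis=1) * fs / 128
```

The ridge rises from about 50 Hz to about 250 Hz across the frames; the frequency resolution of roughly 8 Hz per bin is the kind of linear-method limitation that quadratic and higher-order representations aim to improve upon.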

3.01.3.5 Bayesian computational methods in signal processing

There are two schools of thought in statistical inference, namely the frequentist and the Bayesian approaches. This chapter deals with Bayesian inference. The author first illustrates Bayesian inference through the linear Gaussian model, which makes many of the required calculations straightforward and analytically tractable. He then considers the practically more relevant case where there are intractable elements in the model, which can only be handled numerically. A wide range of computational tools is available for solving complex Bayesian inference problems, ranging from simple Laplace approximations to posterior densities, through variational Bayes methods, to highly sophisticated Monte Carlo schemes. The author gives a flavor of some of the techniques available today, starting with one of the simplest and most effective: the Expectation-Maximization algorithm. He then describes Markov chain Monte Carlo (MCMC) methods, which have gained much importance in solving complicated problems, and concludes with the emerging topic of particle filtering in statistical signal processing.
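
A minimal random-walk Metropolis sketch, the simplest MCMC scheme, for the posterior of the mean of Gaussian data; the prior, proposal scale, and burn-in are illustrative assumptions, not taken from the chapter:

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(2.0, 1.0, 50)      # observations with unknown mean

def log_posterior(theta):
    # N(0, 10^2) prior on the mean, unit-variance Gaussian likelihood
    return -0.5 * theta**2 / 100.0 - 0.5 * np.sum((data - theta) ** 2)

# random-walk Metropolis: propose a move, accept with prob. min(1, ratio)
theta, samples = 0.0, []
for _ in range(5000):
    prop = theta + 0.5 * rng.normal()
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(theta):
        theta = prop
    samples.append(theta)
post_mean = np.mean(samples[1000:])  # posterior mean after burn-in
```

In this conjugate toy case the posterior is known in closed form, so the Monte Carlo answer can be checked against it; the chapter's point is that the same machinery applies when no closed form exists.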

3.01.3.6 Model order selection

A problem encountered again and again in statistical signal processing is model selection. Signal processing practitioners require a simple but effective means of deciding on a model within a family of models, given measurements. A wealth of methods is available for solving this problem; however, for a given set of data, different model selection procedures give different results. For this reason, model selection and model order selection are still active areas of research in statistical science as well as in statistical signal processing. The authors of this chapter describe the basic principles, challenges, and complexity of the model selection problem. They treat in detail statistical inference-based methods, as well as their variants widely used by engineers. The chapter concludes with a practical engineering example: determining the dimension of the signal subspace, a problem encountered in sensor array processing and harmonic retrieval.
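
For a flavor of the inference-based criteria in this family, the following sketch selects a polynomial order with the Bayesian information criterion (BIC); the data-generating model, noise level, and candidate orders are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
t = np.linspace(-1, 1, n)
y = 1.0 + 2.0 * t - 1.5 * t**2 + 0.1 * rng.normal(size=n)  # true order: 2

def bic(order):
    # least-squares fit; Gaussian log-likelihood plus complexity penalty
    resid = y - np.polyval(np.polyfit(t, y, order), t)
    k = order + 1                    # number of fitted parameters
    return n * np.log(np.mean(resid**2)) + k * np.log(n)

best = min(range(6), key=bic)        # candidate orders 0..5
```

The penalty term is what keeps the criterion from always preferring the largest model: the residual variance never increases with the order, so an unpenalized fit would overfit.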

3.01.3.7 Performance analysis and bounds

Bounds provide fundamental limits on estimation, given some assumptions on the probability laws and a model for the parameters of interest. In his chapter, Brian Sadler considers performance analysis of estimators as well as bounds on estimation performance. He introduces key ideas and avenues for analysis, referring to the literature for detailed examples, and seeks to describe the analytical procedure while providing insight, intuition, and guidelines on applicability and results.
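
The canonical example of such a bound is the Cramér-Rao bound. The sketch below checks numerically that the sample mean of Gaussian data attains it; the noise level and sample sizes are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(5)
sigma, n, trials = 2.0, 100, 5000

# For n i.i.d. samples of N(theta, sigma^2), the Fisher information is
# I(theta) = n / sigma^2, so the Cramer-Rao bound is sigma^2 / n.
crb = sigma**2 / n

# the sample mean is an efficient estimator: its variance attains the bound
estimates = rng.normal(1.0, sigma, (trials, n)).mean(axis=1)
emp_var = estimates.var()
```

For most estimators of practical interest no efficient estimator exists at finite sample size, which is where the analysis techniques surveyed in the chapter come in.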

3.01.3.8 Geolocation

Geolocation refers to determining the position of an object in a geographical context. As Fredrik describes it, geolocation is characterized by the four Ms: the Measurements used, the Map, the Motion model describing the motion of the object, and the filtering Method. He describes a general framework for geolocation based on the particle filter and generalizes the concept of fingerprinting to describe the procedure of fitting measurements (along a trajectory) to the map. Several examples based on real data illustrate various combinations of sensors and maps for geolocation. Finally, he discusses different ways in which the tedious mapping steps can be automated.
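
A bootstrap particle filter for a one-dimensional toy positioning problem gives the flavor of the filtering Method; the random-walk motion model and direct position measurements below are assumptions made for the example, far simpler than the map-based models treated in the chapter:

```python
import numpy as np

rng = np.random.default_rng(6)
n_steps, n_particles = 50, 1000
q, r = 0.1, 0.5                # process and measurement noise std. dev.

# simulate a 1-D random-walk trajectory and noisy position measurements
truth = 5.0 + np.cumsum(rng.normal(0, q, n_steps))
meas = truth + rng.normal(0, r, n_steps)

# bootstrap particle filter: propagate, weight by likelihood, resample
particles = rng.normal(5.0, 1.0, n_particles)
est = []
for z in meas:
    particles = particles + rng.normal(0, q, n_particles)  # motion model
    w = np.exp(-0.5 * ((z - particles) / r) ** 2)          # likelihood
    w /= w.sum()
    est.append(w @ particles)                              # posterior mean
    particles = particles[rng.choice(n_particles, n_particles, p=w)]
rmse = np.sqrt(np.mean((np.array(est) - truth) ** 2))
```

The strength of the particle filter in geolocation is that the Gaussian likelihood above can be replaced by an arbitrary, map-dependent one, e.g., zero probability for positions inside a wall, without changing the algorithm.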

3.01.4 Suggested further reading

Some readers will be inspired by the collection of chapters in this section and will want to deepen their knowledge and apply some of the techniques to their own signal processing problems. Those readers are encouraged to consult textbooks for further reading, such as the ones in the list of references of this introduction or the ones specially tailored to the subjects contained in this section. I also hope that readers will find inspiration in other tutorials on the above topics published earlier in the IEEE Signal Processing Magazine or the Proceedings of the IEEE.

References

1. Lehmann EL. Theory of Point Estimation. Wadsworth & Brooks/Cole Advanced Books & Software; 1983.

2. Lehmann EL. Testing Statistical Hypotheses. New York: John Wiley & Sons, Inc.; 1959.

3. Neyman J, Pearson ES. On the problem of the most efficient tests of statistical hypotheses. Philos Trans R Soc., Ser A. 1933;231:289–337.

4. Anderson TW. An Introduction to Multivariate Statistical Analysis. New York: John Wiley & Sons, Inc.; 1958.

5. Cooley JW, Tukey JW. An algorithm for the machine calculation of complex Fourier series. Math Comput. 1965;19(90):297–301.

6. Kay SM. Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory. Prentice-Hall; 1993.

7. Kay SM. Fundamentals of Statistical Signal Processing, Volume II: Detection Theory. Prentice-Hall; 1998.

8. Scharf LL. Statistical Signal Processing: Detection, Estimation, and Time Series Analysis. Boston: Addison-Wesley; 1991.
