Danilo P. Mandic, Vanessa Su Lee Goh
Complex Valued Nonlinear Adaptive Filters
Noncircularity, Widely Linear and Neural Models
- Hardcover
This book examines nonlinear adaptive filtering in the complex domain, providing theoretical information and computational principles for optimizing applications in a range of fields. It begins with a full introduction to the topic, including background theory on standard complex statistics. The authors then discuss the theoretical principles of complex valued nonlinear adaptive filters, and the concept of nonlinearity in general, before presenting learning algorithms for recurrent neural networks (RNNs). Building on this foundation, they cover more advanced topics such as nonlinear adaptive prediction and forecasting through simulation, and a statistical framework for detecting the nature of complex random variables. The final chapter sets out potential applications of these techniques in order to illustrate the benefits of this approach.
Note: This item can only be shipped to a delivery address in Germany.
Product details
- Publisher: Wiley & Sons
- Edition: 1st edition
- Number of pages: 344
- Publication date: 1 June 2009
- Language: English
- Dimensions: 249 mm x 173 mm x 25 mm
- Weight: 740 g
- ISBN-13: 9780470066355
- ISBN-10: 0470066350
- Item no.: 25645809
Danilo Mandic, Department of Electrical and Electronic Engineering, Imperial College London. Dr Mandic is currently a Reader in Signal Processing at Imperial College London. He is an experienced author, having written the book Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability (Wiley, 2001) and more than 150 journal and conference papers on signal and image processing. His research interests include nonlinear adaptive signal processing, multimodal signal processing and nonlinear dynamics. He is an Associate Editor for the journals IEEE Transactions on Circuits and Systems and the International Journal of Mathematical Modelling and Algorithms, serves on the IEEE Technical Committee on Machine Learning for Signal Processing, and has produced award-winning papers and products resulting from his collaboration with industry.
Su-Lee Goh, Royal Dutch Shell plc, Holland. Dr Goh is currently working as a Reservoir Imaging Geophysicist at Shell in Holland. Her research interests include nonlinear signal processing, adaptive filters, complex-valued analysis, and imaging and forecasting. She received her PhD in nonlinear adaptive signal processing from Imperial College London and is a member of the IEEE and the Society of Exploration Geophysicists.
Table of contents
Preface xiii
Acknowledgements xvii
1 The Magic of Complex Numbers 1
1.1 History of Complex Numbers 2
1.2 History of Mathematical Notation 8
1.3 Development of Complex Valued Adaptive Signal Processing 9
2 Why Signal Processing in the Complex Domain? 13
2.1 Some Examples of Complex Valued Signal Processing 13
2.2 Modelling in C is Not Only Convenient But Also Natural 19
2.3 Why Complex Modelling of Real Valued Processes? 20
2.4 Exploiting the Phase Information 23
2.5 Other Applications of Complex Domain Processing of Real Valued Signals 26
2.6 Additional Benefits of Complex Domain Processing 29
3 Adaptive Filtering Architectures 33
3.1 Linear and Nonlinear Stochastic Models 34
3.2 Linear and Nonlinear Adaptive Filtering Architectures 35
3.3 State Space Representation and Canonical Forms 39
4 Complex Nonlinear Activation Functions 43
4.1 Properties of Complex Functions 43
4.2 Universal Function Approximation 46
4.3 Nonlinear Activation Functions for Complex Neural Networks 48
4.4 Generalised Splitting Activation Functions (GSAF) 53
4.5 Summary: Choice of the Complex Activation Function 54
5 Elements of CR Calculus 55
5.1 Continuous Complex Functions 56
5.2 The Cauchy-Riemann Equations 56
5.3 Generalised Derivatives of Functions of Complex Variable 57
5.4 CR-derivatives of Cost Functions 62
6 Complex Valued Adaptive Filters 69
6.1 Adaptive Filtering Configurations 70
6.2 The Complex Least Mean Square Algorithm 73
6.3 Nonlinear Feedforward Complex Adaptive Filters 80
6.4 Normalisation of Learning Algorithms 85
6.5 Performance of Feedforward Nonlinear Adaptive Filters 87
6.6 Summary: Choice of a Nonlinear Adaptive Filter 89
7 Adaptive Filters with Feedback 91
7.1 Training of IIR Adaptive Filters 92
7.2 Nonlinear Adaptive IIR Filters: Recurrent Perceptron 97
7.3 Training of Recurrent Neural Networks 99
7.4 Simulation Examples 102
8 Filters with an Adaptive Stepsize 107
8.1 Benveniste Type Variable Stepsize Algorithms 108
8.2 Complex Valued GNGD Algorithms 110
8.3 Simulation Examples 113
9 Filters with an Adaptive Amplitude of Nonlinearity 119
9.1 Dynamical Range Reduction 119
9.2 FIR Adaptive Filters with an Adaptive Nonlinearity 121
9.3 Recurrent Neural Networks with Trainable Amplitude of Activation Functions 122
9.4 Simulation Results 124
10 Data-reusing Algorithms for Complex Valued Adaptive Filters 129
10.1 The Data-reusing Complex Valued Least Mean Square (DRCLMS) Algorithm 129
10.2 Data-reusing Complex Nonlinear Adaptive Filters 131
10.3 Data-reusing Algorithms for Complex RNNs 134
11 Complex Mappings and Möbius Transformations 137
11.1 Matrix Representation of a Complex Number 137
11.2 The Möbius Transformation 140
11.3 Activation Functions and Möbius Transformations 142
11.4 All-pass Systems as Möbius Transformations 146
11.5 Fractional Delay Filters 147
12 Augmented Complex Statistics 151
12.1 Complex Random Variables (CRV) 152
12.2 Complex Circular Random Variables 158
12.3 Complex Signals 159
12.4 Second-order Characterisation of Complex Signals 161
13 Widely Linear Estimation and Augmented CLMS (ACLMS) 169
13.1 Minimum Mean Square Error (MMSE) Estimation in C 169
13.2 Complex White Noise 172
13.3 Autoregressive Modelling in C 173
13.4 The Augmented Complex LMS (ACLMS) Algorithm 175
13.5 Adaptive Prediction Based on ACLMS 178
14 Duality Between Complex Valued and Real Valued Filters 183
14.1 A Dual Channel Real Valued Adaptive Filter 184
14.2 Duality Between Real and Complex Valued Filters 186
14.3 Simulations 188
15 Widely Linear Filters with Feedback 191
15.1 The Widely Linear ARMA (WL-ARMA) Model 192
15.2 Widely Linear Adaptive Filters with Feedback 192
15.4 The Augmented Kalman Filter Algorithm for RNNs 198
15.5 Augmented Complex Unscented Kalman Filter (ACUKF) 200
15.6 Simulation Examples 203
16 Collaborative Adaptive Filtering 207
16.1 Parametric Signal Modality Characterisation 207
16.2 Standard Hybrid Filtering in R 209
16.3 Tracking the Linear/Nonlinear Nature of Complex Valued Signals 210
16.4 Split vs Fully Complex Signal Natures 214
16.5 Online Assessment of the Nature of Wind Signal 216
16.6 Collaborative Filters for General Complex Signals 217
17 Adaptive Filtering Based on EMD 221
17.1 The Empirical Mode Decomposition Algorithm 222
17.2 Complex Extensions of Empirical Mode Decomposition 226
17.3 Addressing the Problem of Uniqueness 230
17.4 Applications of Complex Extensions of EMD 230
18 Validation of Complex Representations - Is This Worthwhile? 233
18.1 Signal Modality Characterisation in R 234
18.2 Testing for the Validity of Complex Representation 239
18.3 Quantifying Benefits of Complex Valued Representation 243
Appendix A: Some Distinctive Properties of Calculus in C 245
Appendix B: Liouville's Theorem 251
Appendix C: Hypercomplex and Clifford Algebras 253
Appendix D: Real Valued Activation Functions 257
Appendix E: Elementary Transcendental Functions (ETF) 259
Appendix F: The O Notation and Standard Vector and Matrix Differentiation 263
Appendix G: Notions from Learning Theory 265
Appendix H: Notions from Approximation Theory 269
Appendix I: Terminology Used in the Field of Neural Networks 273
Appendix J: Complex Valued Pipelined Recurrent Neural Network (CPRNN) 275
Appendix K: Gradient Adaptive Step Size (GASS) Algorithms in R 279
Appendix L: Derivation of Partial Derivatives from Chapter 8 283
Appendix M: A Posteriori Learning 287
Appendix N: Notions from Stability Theory 291
Appendix O: Linear Relaxation 293
Appendix P: Contraction Mappings, Fixed Point Iteration and Fractals 299
References 309
Index 321