
Neural networks and learning machines

Material type: Text
Publication details: New Delhi: Pearson, 2009, 2016
Edition: 3rd ed.
Description: xxx, 906 p.; xxiv, 918 p.
ISBN:
  • 9789332570313
DDC classification:
  • 006.32 H2N3
Summary: Refocused, revised and renamed to reflect the duality of neural networks and learning machines, this edition recognizes that the subject matter is richer when these topics are studied together. Ideas drawn from neural networks and machine learning are hybridized to perform improved learning tasks beyond the capability of either independently.
Holdings
Item type  Current library        Collection   Call number    Status     Barcode
Book       Nagpur General Stacks  Non-fiction  006.32 H2N3-3  Available  IIMN-001762
Book       Nagpur General Stacks  Non-fiction  006.32 H2N3-4  Available  IIMN-001763
Book       Nagpur General Stacks  Non-fiction  006.32 H2N3-2  Available  IIMN-001358
Book       Nagpur General Stacks  Non-fiction  006.32 H2N3-1  Available  IIMN-001357
Total holds: 0

Table of Contents

Preface
Background and Preview
  1. The Filtering Problem
  2. Linear Optimum Filters
  3. Adaptive Filters
  4. Linear Filter Structures
  5. Approaches to the Development of Linear Adaptive Filters
  6. Adaptive Beamforming
  7. Four Classes of Applications
  8. Historical Notes
Chapter 1 Stochastic Processes and Models
  1.1 Partial Characterization of a Discrete-Time Stochastic Process
  1.2 Mean Ergodic Theorem
  1.3 Correlation Matrix
  1.4 Correlation Matrix of Sine Wave Plus Noise
  1.5 Stochastic Models
  1.6 Wold Decomposition
  1.7 Asymptotic Stationarity of an Autoregressive Process
  1.8 Yule-Walker Equations
  1.9 Computer Experiment: Autoregressive Process of Order Two
  1.10 Selecting the Model Order
  1.11 Complex Gaussian Processes
  1.12 Power Spectral Density
  1.13 Properties of Power Spectral Density
  1.14 Transmission of a Stationary Process Through a Linear Filter
  1.15 Cramér Spectral Representation for a Stationary Process
  1.16 Power Spectrum Estimation
  1.17 Other Statistical Characteristics of a Stochastic Process
  1.18 Polyspectra
  1.19 Spectral-Correlation Density
  1.20 Summary
  Problems
Chapter 10 Kalman Filters
  10.1 Recursive Minimum Mean-Square Estimation for Scalar Random Variables
  10.2 Statement of the Kalman Filtering Problem
  10.3 The Innovations Process
  10.4 Estimation of the State Using the Innovations Process
  10.5 Filtering
  10.6 Initial Conditions
  10.7 Summary of the Kalman Filter
  10.8 Kalman Filter as the Unifying Basis for RLS Filters
  10.9 Variants of the Kalman Filter
  10.10 The Extended Kalman Filter
  10.11 Summary
  Problems
Appendix A Complex Variables
  A.1 Cauchy-Riemann Equations
  A.2 Cauchy's Integral Formula
  A.3 Laurent's Series
  A.4 Singularities and Residues
  A.5 Cauchy's Residue Theorem
  A.6 Principle of the Argument
  A.7 Inversion Integral for the z-Transform
  A.8 Parseval's Theorem
Appendix B Differentiation with Respect to a Vector
  B.1 Basic Definitions
  B.2 Examples
  B.3 Relation Between the Derivative with Respect to a Vector and the Gradient Vector
Appendix C Complex Wishart Distribution
  C.1 Definition
  C.2 The Chi-Square Distribution as a Special Case
  C.3 Properties of the Complex Wishart Distribution
  C.4 Expectation of the Inverse Correlation Matrix Φ⁻¹(n)
Glossary
Bibliography
Index



Powered by Koha