Principles of Neurocomputing (神经计算原理)

Publication date: July 2003  Publisher: China Machine Press (机械工业出版社)  Authors: Fredric M. Ham, Ivica Kostanic  Pages: 642

Content Summary

This book is an excellent textbook that focuses on the fundamental principles of artificial neural networks and on how to apply a variety of neurocomputing techniques to real-world problems in science and engineering, including pattern recognition, optimization, event classification, control and identification of nonlinear systems, and statistical analysis.

•   Algorithms: most training algorithms are set off with rules above and below the text, making them easy to locate.
•   MATLAB functions: some training algorithms are accompanied by a MATLAB function implementation (indicated in boldface in the text). The code segments are fairly short and take only a few minutes to enter into MATLAB.
•   MATLAB Toolbox: MATLAB's Neural Network Toolbox is used extensively throughout the book to illustrate certain neurocomputing concepts.
•   Web site: the book's Web site at http://www.mhhe.com/engcs/electrical/ham provides the latest and most complete information.
•   Examples: most chapters include detailed examples that illustrate important neurocomputing concepts.
•   Problem sets: each chapter ends with a large set of problems applying neurocomputing techniques. Some problems require MATLAB and the Neural Network Toolbox; in some cases MATLAB function code is also provided.
•   Appendix: Appendix A gives a comprehensive introduction to the mathematical foundations of neurocomputing.

Table of Contents

About the Authors
Preface
Acknowledgments
List of Important Symbols and Operators
List of Important Abbreviations

PART I  Fundamental Neurocomputing Concepts and Selected Neural Network Architectures and Learning Rules

1  Introduction to Neurocomputing
  1.1  What Is Neurocomputing?
  1.2  Historical Notes
  1.3  Neurocomputing and Neuroscience
  1.4  Classification of Neural Networks
  1.5  Guide to the Book
  References

2  Fundamental Neurocomputing Concepts
  2.1  Introduction
  2.2  Basic Models of Artificial Neurons
  2.3  Basic Activation Functions
  2.4  Hopfield Model of the Artificial Neuron
  2.5  Adaline and Madaline
  2.6  Simple Perceptron
  2.7  Feedforward Multilayer Perceptron
  2.8  Overview of Basic Learning Rules for a Single Neuron
  2.9  Data Preprocessing
  Problems
  References

3  Mapping Networks
  3.1  Introduction
  3.2  Associative Memory Networks
  3.3  Backpropagation Learning Algorithms
  3.4  Accelerated Learning Backpropagation Algorithms
  3.5  Counterpropagation
  3.6  Radial Basis Function Neural Networks
  Problems
  References

4  Self-Organizing Networks
  4.1  Introduction
  4.2  Kohonen Self-Organizing Map
  4.3  Learning Vector Quantization
  4.4  Adaptive Resonance Theory (ART) Neural Networks
  Problems
  References

5  Recurrent Networks and Temporal Feedforward Networks
  5.1  Introduction
  5.2  Overview of Recurrent Neural Networks
  5.3  Hopfield Associative Memory
  5.4  Simulated Annealing
  5.5  Boltzmann Machine
  5.6  Overview of Temporal Feedforward Networks
  5.7  Simple Recurrent Network
  5.8  Time-Delay Neural Networks
  5.9  Distributed Time-Lagged Feedforward Neural Networks
  Problems
  References

PART II  Applications of Neurocomputing

6  Neural Networks for Optimization Problems
  6.1  Introduction
  6.2  Neural Networks for Linear Programming Problems
  6.3  Neural Networks for Quadratic Programming Problems
  6.4  Neural Networks for Nonlinear Continuous Constrained Optimization Problems
  Problems
  References

7  Solving Matrix Algebra Problems with Neural Networks
  7.1  Introduction
  7.2  Inverse and Pseudoinverse of a Matrix
  7.3  LU Decomposition
  7.4  QR Factorization
  7.5  Schur Decomposition
  7.6  Spectral Factorization - Eigenvalue Decomposition (EVD) (Symmetric Eigenvalue Problem)
  7.7  Neural Network Approach for the Symmetric Eigenvalue Problem
  7.8  Singular Value Decomposition
  7.9  A Neurocomputing Approach for Solving the Algebraic Lyapunov Equation
  7.10  A Neurocomputing Approach for Solving the Algebraic Riccati Equation
  Problems
  References

8  Solution of Linear Algebraic Equations Using Neural Networks
  8.1  Introduction
  8.2  Systems of Simultaneous Linear Algebraic Equations
  8.3  Least-Squares Solution of Systems of Linear Equations
  8.4  A Least-Squares Neurocomputing Approach for Solving Systems of Linear Equations
  8.5  Conjugate Gradient Learning Rule for Solving Systems of Linear Equations
  8.6  A Generalized Robust Approach for Solving Systems of Linear Equations Corrupted with Noise
  8.7  Regularization Methods for Ill-Posed Problems with Ill-Determined Numerical Rank
  8.8  Matrix Splittings for Iterative Discrete-Time Methods for Solving Linear Equations
  8.9  Total Least-Squares Problem
  8.10  An L∞-Norm (Minimax) Neural Network for Solving Linear Equations
  8.11  An L1-Norm (Least-Absolute-Deviations) Neural Network for Solving Linear Equations
  Problems
  References

9  Statistical Methods Using Neural Networks
  9.1  Introduction
  9.2  Principal-Component Analysis
  9.3  Learning Algorithms for Neural Network Adaptive Estimation of Principal Components
  9.4  Principal-Component Regression
  9.5  Partial Least-Squares Regression
  9.6  A Neural Network Approach for Partial Least-Squares Regression
  9.7  Robust PLSR: A Neural Network Approach
  Problems
  References

10  Identification, Control, and Estimation Using Neural Networks
  10.1  Introduction
  10.2  Linear System Representation
  10.3  Autoregressive Moving Average Models
  10.4  Identification of Linear Systems with ARMA Models
  10.5  Parametric System Identification of Linear Systems Using PLSNET
  10.6  Nonlinear System Representation
  10.7  Identification and Control of Nonlinear Dynamical Systems
  10.8  Independent-Component Analysis: Blind Separation of Unknown Source Signals
  10.9  Spectrum Estimation of Sinusoids in Additive Noise
  10.10  Other Case Studies
  Problems
  References

Appendix A  Mathematical Foundation for Neurocomputing
  A.1  Introduction
  A.2  Linear Algebra
  A.3  Principles of Multivariable Analysis
  A.4  Lyapunov's Direct Method
  A.5  Unconstrained Optimization Methods
  A.6  Constrained Nonlinear Programming
  A.7  Random Variables and Stochastic Processes
  A.8  Fuzzy Set Theory
  A.9  Selected Trigonometric Identities
  References

Name Index
Subject Index

User Comments (5 in total)

 
 

  •   Reading it now and getting a lot out of it. The English edition is just a bit more tiring, haha.
  •   The book is explained in very accessible terms; good for beginners.
  •   I read the Chinese edition, but felt some things in it weren't explained clearly, so I got the original English edition to read as well.
  •   The English edition in particular is not easy to study.
  •   Strongly recommended for anyone just starting out with neural networks: the explanations are detailed, the derivations are thorough, and the appendix collects the mathematics used so you can look it up. The paper quality is a bit poor, though, and my copy arrived dirty.
 
