Introduction to deep learning: from logical calculus to artificial intelligence (Record no. 810019)

MARC details
000 -LEADER
fixed length control field 05189aam a2200265 4500
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 210211b2018 ||||| |||| 00| 0 eng d
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number 9783319730035
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 006.312
Item number S5I6
100 ## - MAIN ENTRY--PERSONAL NAME
Personal name Skansi, Sandro
9 (RLIN) 2510185
245 ## - TITLE STATEMENT
Title Introduction to deep learning: from logical calculus to artificial intelligence
260 ## - PUBLICATION, DISTRIBUTION, ETC.
Name of publisher, distributor, etc. Springer International Publishing
Date of publication, distribution, etc. 2018
Place of publication, distribution, etc. Cham
300 ## - PHYSICAL DESCRIPTION
Extent xiii, 191 p.: ill.
Other physical details Includes bibliographical references and index
440 ## - SERIES STATEMENT/ADDED ENTRY--TITLE
Title Undergraduate topics in computer science
9 (RLIN) 395040
504 ## - BIBLIOGRAPHY, ETC. NOTE
Bibliography, etc. note Table of contents

1 From Logic to Cognitive Science
1.1 The Beginnings of Artificial Neural Networks
1.2 The XOR Problem
1.3 From Cognitive Science to Deep Learning
1.4 Neural Networks in the General AI Landscape
1.5 Philosophical and Cognitive Aspects

2 Mathematical and Computational Prerequisites
2.1 Derivations and Function Minimization
2.2 Vectors, Matrices and Linear Programming
2.3 Probability Distributions
2.4 Logic and Turing Machines
2.5 Writing Python Code
2.6 A Brief Overview of Python Programming

3 Machine Learning Basics
3.1 Elementary Classification Problem
3.2 Evaluating Classification Results
3.3 A Simple Classifier: Naive Bayes
3.4 A Simple Neural Network: Logistic Regression
3.5 Introducing the MNIST Dataset
3.6 Learning Without Labels: K-Means
3.7 Learning Different Representations: PCA
3.8 Learning Language: The Bag of Words Representation

4 Feedforward Neural Networks
4.1 Basic Concepts and Terminology for Neural Networks
4.2 Representing Network Components with Vectors and Matrices
4.3 The Perceptron Rule
4.4 The Delta Rule
4.5 From the Logistic Neuron to Backpropagation
4.6 Backpropagation
4.7 A Complete Feedforward Neural Network

5 Modifications and Extensions to a Feed-Forward Neural Network
5.1 The Idea of Regularization
5.2 L1 and L2 Regularization
5.3 Learning Rate, Momentum and Dropout
5.4 Stochastic Gradient Descent and Online Learning
5.5 Problems for Multiple Hidden Layers: Vanishing and Exploding Gradients

6 Convolutional Neural Networks
6.1 A Third Visit to Logistic Regression
6.2 Feature Maps and Pooling
6.3 A Complete Convolutional Network
6.4 Using a Convolutional Network to Classify Text

7 Recurrent Neural Networks
7.1 Sequences of Unequal Length
7.2 The Three Settings of Learning with Recurrent Neural Networks
7.3 Adding Feedback Loops and Unfolding a Neural Network
7.4 Elman Networks
7.5 Long Short-Term Memory
7.6 Using a Recurrent Neural Network for Predicting Following Words

8 Autoencoders
8.1 Learning Representations
8.2 Different Autoencoder Architectures
8.3 Stacking Autoencoders
8.4 Recreating the Cat Paper

9 Neural Language Models
9.1 Word Embeddings and Word Analogies
9.2 CBOW and Word2vec
9.3 Word2vec in Code
9.4 Walking Through the Word-Space: An Idea That Has Eluded Symbolic AI

10 An Overview of Different Neural Network Architectures
10.1 Energy-Based Models
10.2 Memory-Based Models
10.3 The Kernel of General Connectionist Intelligence: The bAbI Dataset

11 Conclusion
11.1 An Incomplete Overview of Open Research Questions
11.2 The Spirit of Connectionism and Philosophical Ties
520 ## - SUMMARY, ETC.
Summary, etc. This textbook presents a concise, accessible and engaging first introduction to deep learning, offering a wide range of connectionist models which represent the current state-of-the-art. The text explores the most popular algorithms and architectures in a simple and intuitive style, explaining the mathematical derivations in a step-by-step manner. The content coverage includes convolutional networks, LSTMs, Word2vec, RBMs, DBNs, neural Turing machines, memory networks and autoencoders. Numerous examples in working Python code are provided throughout the book, and the code is also supplied separately at an accompanying website.

Topics and features: introduces the fundamentals of machine learning, and the mathematical and computational prerequisites for deep learning; discusses feed-forward neural networks, and explores the modifications to these which can be applied to any neural network; examines convolutional neural networks, and the recurrent connections to a feed-forward neural network; describes the notion of distributed representations, the concept of the autoencoder, and the ideas behind language processing with deep learning; presents a brief history of artificial intelligence and neural networks, and reviews interesting open research problems in deep learning and connectionism.

This clearly written and lively primer on deep learning is essential reading for graduate and advanced undergraduate students of computer science, cognitive science and mathematics, as well as fields such as linguistics, logic, philosophy, and psychology.

https://www.springer.com/gp/book/9783319730035
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Neural networks (Computer science)
9 (RLIN) 2510186
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Computers - Computer vision and pattern recognition
9 (RLIN) 2510187
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Data mining
9 (RLIN) 2510188
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Artificial intelligence - Mathematics
9 (RLIN) 2510189
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Machine learning
9 (RLIN) 2510190
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Coding theory and cryptology
9 (RLIN) 2510191
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Mathematical modelling
9 (RLIN) 2510192
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element Image processing
9 (RLIN) 2510193
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Source of classification or shelving scheme Dewey Decimal Classification
Koha item type Book
Holdings
Item 1
Source of classification or shelving scheme: Dewey Decimal Classification
Collection code: Non-fiction
Home library: Ahmedabad
Current library: Ahmedabad
Shelving location: General Stacks
Date acquired: 01/03/2021
Source of acquisition: 7
Cost, normal purchase price: 3347.26
Full call number: 006.312 S5I6
Barcode: 203186
Date last seen: 01/03/2021
Cost, replacement price: 4184.07
Price effective from: 01/03/2021
Koha item type: Book

Item 2
Home library: Jammu
Current library: Jammu
Date acquired: 27/08/2020
Source of acquisition: Segment Books
Cost, normal purchase price: 3666.68
Full call number: 006.3 SKA
Barcode: IIMLJ-3224
Date last seen: 22/12/2021
Price effective from: 22/12/2021
Koha item type: Book
