Hands-On Mathematics for Deep Learning
Build a solid mathematical foundation for training efficient deep neural networks
Paperback by Jay Dawani
Language: English

49,30 €*

incl. VAT

Free shipping via Post / DHL

Currently unavailable

Description
A comprehensive guide to mastering the mathematical techniques used to build modern deep learning architectures
Key Features

Understand linear algebra, calculus, gradient algorithms, and other concepts essential for training deep neural networks

Learn the mathematical concepts needed to understand how deep learning models function

Use deep learning for solving problems related to vision, image, text, and sequence applications

Book Description

Most programmers and data scientists struggle with mathematics, having either overlooked or forgotten core mathematical concepts. This book uses Python libraries to help you understand the math required to build deep learning (DL) models.

You'll begin by learning about the core mathematical and modern computational techniques used to design and implement DL algorithms. The book covers essential topics, such as linear algebra, eigenvalues and eigenvectors, singular value decomposition (SVD), and gradient algorithms, to help you understand how deep neural networks are trained. Later chapters turn to important architectures, such as linear neural networks and multilayer perceptrons, with an emphasis on how each model works. As you advance, you will delve into the math behind regularization, multi-layered DL, forward propagation, optimization, and backpropagation to understand what it takes to build full-fledged DL models. Finally, you'll explore CNN, recurrent neural network (RNN), and GAN models and their applications.
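
To give these topics a concrete flavour, here is a minimal NumPy sketch, not taken from the book, that computes a singular value decomposition and performs one gradient-descent step on a least-squares objective; the data shapes and learning rate are illustrative assumptions.

import numpy as np

# Illustrative data: a small design matrix A and target vector b.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

# Singular value decomposition: A = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print("singular values:", s)

# One gradient-descent step on the least-squares loss 0.5 * ||A x - b||^2.
x = np.zeros(3)
lr = 0.1                      # assumed learning rate
grad = A.T @ (A @ x - b)      # gradient of the loss with respect to x
x = x - lr * grad
print("x after one step:", x)

Matrix factorizations and gradient-based updates of this kind are the building blocks the description above refers to.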

By the end of this book, you'll have built a strong foundation in neural networks and DL mathematical concepts, which will help you to confidently research and build custom models in DL.

What you will learn

Understand the key mathematical concepts for building neural network models

Discover core multivariable calculus concepts

Improve the performance of deep learning models using optimization techniques

Cover optimization algorithms, from basic stochastic gradient descent (SGD) to the advanced Adam optimizer (see the sketch after this list)

Understand computational graphs and their importance in DL

Explore the backpropagation algorithm to reduce output error

Cover DL algorithms such as convolutional neural networks (CNNs), sequence models, and generative adversarial networks (GANs)
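
As referenced in the optimization bullet above, here is a minimal NumPy sketch, not from the book, that contrasts one plain SGD update with one Adam update on the same gradient; the toy parameters, gradient, and hyperparameters (the commonly cited Adam defaults) are assumptions.

import numpy as np

# A toy parameter vector and a gradient for it (values are illustrative).
theta = np.array([1.0, -2.0, 0.5])
grad = np.array([0.3, -0.1, 0.2])

# Plain SGD: theta <- theta - lr * grad
lr = 0.01
theta_sgd = theta - lr * grad

# One Adam step with bias-corrected first and second moment estimates.
beta1, beta2, eps = 0.9, 0.999, 1e-8
m = np.zeros_like(theta)              # first-moment estimate
v = np.zeros_like(theta)              # second-moment estimate
t = 1                                 # step counter
m = beta1 * m + (1 - beta1) * grad
v = beta2 * v + (1 - beta2) * grad**2
m_hat = m / (1 - beta1**t)
v_hat = v / (1 - beta2**t)
theta_adam = theta - lr * m_hat / (np.sqrt(v_hat) + eps)

print("SGD update: ", theta_sgd)
print("Adam update:", theta_adam)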

Who this book is for

This book is for data scientists, machine learning developers, aspiring deep learning developers, or anyone who wants to understand the foundation of deep learning by learning the math behind it. Working knowledge of the Python programming language and machine learning basics is required.
About the author
Jay Dawani is a former professional swimmer turned mathematician and computer scientist. He is also a Forbes 30 Under 30 Fellow. At present, he is the Director of Artificial Intelligence at Geometric Energy Corporation (NATO CAGE) and the CEO of Lemurian Labs, a startup he founded that is developing the next generation of autonomy, intelligent process automation, and driver intelligence. Previously, he was the technology and R&D advisor to Spacebit Capital. He has spent the last three years researching at the frontiers of AI, with a focus on reinforcement learning, open-ended learning, deep learning, quantum machine learning, human-machine interaction, multi-agent and complex systems, and artificial general intelligence.
Details
Year of publication: 2020
Genre: Imports, Computer Science
Section: Science & Technology
Format: Paperback
ISBN-13: 9781838647292
ISBN-10: 1838647295
Language: English
Features / supplements: Paperback
Binding: Paperback / softcover
Author: Dawani, Jay
Publisher: Packt Publishing
Responsible person for the EU: Books on Demand GmbH, In de Tarpen 42, D-22848 Norderstedt, info@bod.de
Dimensions: 235 x 191 x 20 mm
By: Jay Dawani
Publication date: 12 June 2020
Weight: 0.679 kg
Item ID: 118535384