Huang Lei. Normalization Techniques in Deep Learning

Springer, 2022. — 117 p. — (Synthesis Lectures on Computer Vision). — ISBN: 978-3-031-14594-0.
This book presents and surveys normalization techniques, with an in-depth analysis of their role in training deep neural networks. The author also provides technical guidance for designing new normalization methods and network architectures tailored to specific tasks. Normalization methods can improve the training stability, optimization efficiency, and generalization ability of deep neural networks (DNNs), and have become basic components of most state-of-the-art DNN architectures. The author offers guidelines for understanding, elaborating, and applying normalization methods. The book is ideal for readers developing novel deep learning algorithms and/or applying them to practical problems in computer vision and machine learning. It also serves as a resource for researchers, engineers, and students who are new to the field and need to understand and train DNNs.
Deep neural networks (DNNs) have been used extensively across a broad range of applications, including computer vision (CV), natural language processing (NLP), speech and audio processing, robotics, and bioinformatics. They are typically composed of stacked layers/modules, where the transformation between layers consists of a linear mapping with learnable parameters followed by a nonlinear activation function. While their deep, complex structure gives them powerful representation capacity and appealing advantages in learning feature hierarchies, it also makes them difficult to train. One notorious problem in training DNNs is vanishing or exploding activations (and gradients), caused mainly by the compounding of linear and nonlinear transformations through the depth of the network.
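The compounding effect described above, and the way per-layer activation normalization counteracts it, can be seen in a small NumPy sketch. The layer width, depth, gain, and choice of tanh are illustrative assumptions, not values from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, depth=50, scale=0.5, normalize=False):
    # Pass x through `depth` random linear layers with tanh activations.
    # With a per-layer gain below 1, the signal shrinks geometrically
    # (the "vanishing" regime). Standardizing the pre-activations at
    # every layer -- the core idea behind activation normalization --
    # keeps the signal in a stable range.
    n = x.shape[0]
    for _ in range(depth):
        W = rng.normal(0.0, scale / np.sqrt(n), size=(n, n))
        x = W @ x
        if normalize:
            # Standardize pre-activations to zero mean, unit variance.
            x = (x - x.mean()) / (x.std() + 1e-5)
        x = np.tanh(x)
    return x

x0 = rng.normal(size=256)
plain = forward(x0)                    # activations collapse toward zero
normed = forward(x0, normalize=True)   # magnitudes stay stable
print(f"unnormalized |activation| mean: {np.abs(plain).mean():.2e}")
print(f"normalized   |activation| mean: {np.abs(normed).mean():.2e}")
```

Without normalization the mean activation magnitude decays toward machine zero after 50 layers, while the normalized network keeps it of order one; the same mechanism, run with a gain above 1, produces the exploding regime instead.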
Motivation and Overview of Normalization in DNNs.
A General View of Normalizing Activations.
A Framework for Normalizing Activations as Functions.
Multi-mode and Combinational Normalization.
BN for More Robust Estimation.
Normalizing Weights.
Normalizing Gradients.
Analysis of Normalization.
Normalization in Task-Specific Applications.
Summary and Discussion.