Springer Cham, 2023. — 94 p. — (Synthesis Lectures on Engineering, Science, and Technology) — eBook ISBN: 978-3-031-38133-1.
Explores different design aspects associated with each number system and their effects on DNN performance.
Discusses the most efficient number systems for DNN hardware realization.
Describes various number systems and their use in Deep Neural Network hardware implementation.
This book provides readers with a comprehensive introduction to alternative number systems for more efficient representation of Deep Neural Network (DNN) data. Various number systems, both conventional and unconventional, that have been exploited for DNNs are discussed, including Floating Point (FP), Fixed Point (FXP), the Logarithmic Number System (LNS), the Residue Number System (RNS), the Block Floating Point Number System (BFP), the Dynamic Fixed-Point Number System (DFXP), and the Posit Number System (PNS). The authors explore the impact of these number systems on the performance and hardware design of DNNs, highlighting the challenges associated with each number system and the solutions proposed to address them.
True PDF