2nd Edition. — CRC Press, 2025. — 339 p. — (Chapman & Hall/CRC Mathematics and Artificial Intelligence Series). — ISBN: 978-1-032-71596-4.
Artificial Intelligence: An Introduction to Big Ideas and their Development, Second Edition guides readers through the history and development of Artificial Intelligence (AI), from its early mathematical beginnings through to the exciting possibilities of its potential future applications. To make this journey as accessible as possible, the authors build their narrative around accounts of some of the more popular and well-known demonstrations of artificial intelligence, including Deep Blue, AlphaGo, and even Texas Hold’em, followed by their historical background, so that AI can be seen as a natural development of the underlying mathematics and computer science. As the book proceeds, more technical descriptions are presented at a pace suitable for readers of all levels, gradually building a broad and reasonably deep understanding and appreciation of the basic mathematics, physics, and computer science behind Artificial Intelligence as it is rapidly developing today.
A Large Language Model (LLM) is essentially a giant artificial neural network (ANN) that can have trillions of parameters and performs accelerated self-supervised learning on data gathered from the Internet, for example through an improved version of the Common Crawl web corpus. An LLM takes an input text and repeatedly predicts the next token, much as earlier natural language processing (NLP) systems predicted the next word or token by various means, such as Hidden Markov Models (HMMs) and recurrent neural networks (RNNs) and their associated algorithms. The generative output of the model is assumed to depend linearly on its previous values plus a stochastic term, forming a recurrence-relation autoregressive model (one that regresses on its own past values) which, with discriminative (classification) and NLP fine-tuning, can compose prose and poetry, translate, and generate images, music, computer programs, and almost anything else a user requests (originally through text and more recently through speech).
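As a sketch of what "linear dependence on its previous values plus a stochastic term" means, the classical autoregressive model of order p can be written in standard time-series notation (not taken verbatim from the book) as
x_t = c + \varphi_1 x_{t-1} + \varphi_2 x_{t-2} + \dots + \varphi_p x_{t-p} + \varepsilon_t,
where x_t is the current value, the coefficients \varphi_1, \dots, \varphi_p are fitted to the data, and \varepsilon_t is the stochastic (noise) term; an LLM replaces the linear sum with a deep neural network that outputs a probability distribution over the next token, which is then sampled and appended to the input before the next prediction.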
Features:
The only mathematical prerequisite is an elementary knowledge of calculus.
Accessible to anyone with an interest in AI and its mathematics and computer science.
Suitable as supplementary reading for a course in AI or on the history of mathematics and computer science as they relate to artificial intelligence.
New to the Second Edition:
Fully revised and corrected throughout to bring the material up to date.
Greater technical detail and exploration of basic mathematical concepts, while retaining the simplicity of explanation of the first edition.
Entirely new chapters on large language models (LLMs), ChatGPT, and quantum computing.
Contents:
Computing Hardware.
The Integrated Circuit.
Software.
Open-Source Software.
Expert Systems.
Inverted Decision Trees.
Deep Blue.
Jeopardy and Miss Debater.
The Perceptron.
Parameterization.
Gradient Descent and Backpropagation.
The Cross-Entropy Cost Function.
Convolutional Neural Networks.
ImageNet and Model Fitting.
Markov Chain Monte Carlo Simulation.
Reinforcement Learning.
AlphaGo.
Game Theory.
Predictive Analytics.
Support Vector Machines.
Top-Down Speech Recognition.
Bottom-Up Speech Recognition.
Speech Synthesis.
RBMs, GANs, and LFCF.
LLMs and GPTs.
Massively Parallel Processing and Supercomputers.
Quantum Computing.
Industrial Robots: Robot Physicians.
Autonomous Vehicles.
Exoplanet/Exomoon Astronomer.
Protein Folding.
Intelligence.
The AI Singularity.