Multivariable calculus: Neural networks

Introduction to neural networks

At a very fundamental level, neural networks are a novel way of constructing functions. Their use in data analysis and machine learning is ubiquitous. In the following labs and activities, we will answer the following questions:

  • What is a mathematical neuron and how are they put together into a neural network?
  • How can a function be approximated by a neural network?
  • How can we find a neural network that will allow us to make predictions about data?

Neurons: Mathematical, or artificial, neurons are the basic computational units in a neural network. We will learn how to construct neurons, the design parameters involved, and what types of functions they represent.
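
As a concrete illustration (not necessarily the exact construction used in the lab), here is a minimal sketch of a single neuron in Python. The sigmoid activation and the specific weights and bias are assumptions chosen only to show the shape of the computation: an affine function of the inputs followed by a nonlinearity.

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes any real number into the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b):
    """A single neuron: f(x) = sigma(w . x + b), an affine map followed by an activation."""
    return sigmoid(np.dot(w, x) + b)

# Illustrative (made-up) values: a neuron with two inputs.
x = np.array([0.5, -1.0])   # inputs
w = np.array([2.0, 3.0])    # weights (design parameters)
b = -1.0                    # bias (design parameter)
print(neuron(x, w, b))
```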

Neural networks: In this exercise we will put neurons together to make neural networks, interpret the results as functions, and learn how to construct neural networks that approximate certain functions.
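
A minimal sketch, under the same illustrative assumptions (sigmoid activation, randomly chosen weights and biases), of how neurons compose into a network. A layer applies several neurons at once, and the network is simply a composition of layers, so it can be read as a single function from inputs to outputs.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def layer(x, W, b):
    """A layer applies several neurons at once: W holds one row of weights
    per neuron, b holds one bias per neuron."""
    return sigmoid(W @ x + b)

def network(x, W1, b1, W2, b2):
    """A two-layer network is a composition of layers: f(x) = layer2(layer1(x))."""
    return layer(layer(x, W1, b1), W2, b2)

# Made-up parameters: 2 inputs -> 3 hidden neurons -> 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), rng.normal(size=3)
W2, b2 = rng.normal(size=(1, 3)), rng.normal(size=1)
print(network(np.array([0.5, -1.0]), W1, b1, W2, b2))
```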

Chain rule: The chain rule plays a crucial part in the theory of neural networks. We investigate how the output of a neural network depends on changes in the input, as well as changes in its weights and biases.
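
A short sketch of the chain rule at work on a one-input neuron y = sigmoid(w*x + b). The derivative of the output with respect to the weight, the input, and the bias each follow from the chain rule, and the weight derivative is checked against a finite-difference approximation. The activation and the numerical values are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# For y = sigmoid(z) with z = w*x + b, the chain rule gives
#   dy/dw = sigmoid'(z) * x,  dy/dx = sigmoid'(z) * w,  dy/db = sigmoid'(z),
# where sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)).
x, w, b = 0.7, 1.5, -0.3           # made-up values
z = w * x + b
y = sigmoid(z)
dy_dw = y * (1 - y) * x            # sensitivity to the weight
dy_dx = y * (1 - y) * w            # sensitivity to the input
dy_db = y * (1 - y)                # sensitivity to the bias

# Check dy/dw against a central finite-difference approximation.
h = 1e-6
approx = (sigmoid((w + h) * x + b) - sigmoid((w - h) * x + b)) / (2 * h)
print(dy_dw, approx)               # the two numbers should agree closely
```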

Gradient descent: We introduce another way of finding the maximum and minimum values of a function, one that works when more traditional methods fail.
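
A minimal sketch of gradient descent on a simple two-variable function, chosen here purely for illustration: start at some point, repeatedly step in the direction opposite the gradient, and the iterates approach the minimizer.

```python
import numpy as np

def f(p):
    """A simple function of two variables: f(x, y) = (x - 1)^2 + (y + 2)^2."""
    x, y = p
    return (x - 1) ** 2 + (y + 2) ** 2

def grad_f(p):
    """Gradient of f, computed by hand."""
    x, y = p
    return np.array([2 * (x - 1), 2 * (y + 2)])

p = np.array([5.0, 5.0])      # arbitrary starting point
lr = 0.1                      # step size (learning rate)
for _ in range(100):
    p = p - lr * grad_f(p)    # step downhill, opposite the gradient

print(p, f(p))                # p should be close to the minimizer (1, -2)
```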

Computer vision and handwritten digits: We use neural networks to solve one of the oldest image recognition problems: how to automatically recognize a handwritten digit.
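
As a hedged preview of what such a lab might look like, here is a small sketch using scikit-learn's bundled 8x8 digit images and its MLPClassifier; the library, dataset, and network size are assumptions for illustration, and the actual lab may proceed differently.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Load the small 8x8 handwritten-digit images bundled with scikit-learn.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# A small neural network classifier, trained by a form of gradient descent.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```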