
What the Heck are Artificial Neural Networks?

Artificial Neural Networks (ANNs) are a specific tool for implementing ML. Mathematically, ANNs are composite functions built up from simple mathematical functions: linear functions (the weighted sums, or net's, you'll see below) and exponential functions (like 1/(1+e^-x)). Calculus is used to "train" these composite functions to do various tasks. In the applet below you can see an ANN perform machine learning with the push of a button. Each time you press the button, backpropagation "learns" how to adjust the weights (the w's) so the calculated outputs move closer to your targets. Each press of the button logs one epoch (or learning cycle). Backpropagation is the name for the calculus optimization problem the network is solving. TRY IT! If anything goes wrong, click the refresh button in the top right of the applet. We'll discuss what's going on as a group.
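
If you're curious what "press a button, run one epoch" might look like in code, here's a minimal sketch in Python. It is not the applet's actual code: the input, target, starting weight, and learning rate are made-up numbers, and the "network" is shrunk down to a single weight feeding a sigmoid so the chain rule fits on one line.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical training data and starting weight (not the applet's values).
x, target = 1.0, 0.8
w = 0.2
learning_rate = 0.5

for epoch in range(1, 11):
    net = w * x                                    # linear piece
    out = sigmoid(net)                             # exponential (sigmoid) piece
    error = 0.5 * (target - out) ** 2              # squared error
    # Backpropagation: the chain rule gives dE/dw for this composite function.
    dE_dw = -(target - out) * out * (1 - out) * x
    w -= learning_rate * dE_dw                     # one epoch nudges the weight
    print(f"epoch {epoch}: out = {out:.4f}, error = {error:.6f}")

Each pass through the loop plays the role of one button press: compute the output, measure the error, and nudge the weight downhill.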

How many epochs does it take to get the calculated outputs to within 2 decimal places of the targets?

Reset the app with the button in the top right. If the learning rate is increased to 5, how many epochs does it take to get the calculated outputs to within 2 decimal places?

OPTIONAL Technical Terminology:

The inputs together with your targets are often called training data. The biases are optional fixed numbers that get added to the net's. They are user-specified parameters and are not considered "learnable". They are set to 0 in the applet for simplicity. The E's on the right are errors (or loss): a measurement of how close the computed outputs are to your targets. In this example we're using mean squared error as the error. The learning rate adjusts the impact of each epoch of backpropagation on the weights. Higher learning rates have a bigger impact. Careful though! A learning rate that's too high might accidentally jump your ANN to a much worse error!
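
To see why a too-high learning rate can backfire, here's a tiny made-up example (not from the applet) of gradient descent on a one-weight error function E(w) = (w - 3)^2. The function and the learning rates are just illustrations.

# Gradient descent on E(w) = (w - 3)^2, which is smallest at w = 3.
def descend(learning_rate, epochs=5, w=0.0):
    errors = []
    for _ in range(epochs):
        grad = 2 * (w - 3)             # dE/dw
        w -= learning_rate * grad      # bigger learning rate = bigger jump
        errors.append(round((w - 3) ** 2, 4))
    return errors

print(descend(0.1))   # error shrinks each epoch
print(descend(1.1))   # too high: each epoch overshoots and the error grows

With the smaller learning rate the error drops steadily; with the larger one each epoch jumps past the minimum and lands somewhere worse.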

OPTIONAL Technical details about the applet:

An ANN takes input data and uses it to compute outputs that target numbers you specify. An ANN usually starts its life doing a very bad job of computing outputs close to your targets, but ANNs use calculus to "learn" how to get their calculated outputs closer to your specified targets. You can see an ANN below. I know it looks insanely complicated, but I promise it's not too bad. :)

There are two inputs. Do you see them on the left? You can change them if you want, but don't just yet. There are two targets. Do you see them on the right? You can also change these, but don't just yet. The two outputs the ANN has computed are in the gray boxes. Do you see them? You can't change these; they have been computed by the ANN. You don't need to know too much about what's in between (all the w's and net's and 1/(1+e^-x)'s), but I'll talk about them a little bit. If anyone's curious you can read more here.

The cool thing about ANNs is that they use calculus to LEARN how to adjust the w's to get the computed outputs closer to your targets. It's called backpropagation, and each iteration of backpropagation is called an epoch.
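
Here's a rough sketch, in Python, of the forward pass the applet is doing: two inputs feed two hidden net's, each net gets squashed by 1/(1+e^-x), and the two results feed two output net's that get squashed the same way. The specific inputs, targets, and starting weights below are hypothetical placeholders; the applet's numbers may be different. Biases are set to 0, as in the applet.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical inputs, targets, and starting weights (placeholders, not the applet's).
i1, i2 = 0.05, 0.10
t1, t2 = 0.01, 0.99
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30    # input -> hidden weights
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55    # hidden -> output weights

# Hidden layer: each net is a weighted sum of the inputs, then squashed.
h1 = sigmoid(w1 * i1 + w2 * i2)
h2 = sigmoid(w3 * i1 + w4 * i2)

# Output layer: same pattern, using the hidden values.
out1 = sigmoid(w5 * h1 + w6 * h2)
out2 = sigmoid(w7 * h1 + w8 * h2)

# Errors: one common squared-error convention for measuring distance to the targets.
E1 = 0.5 * (t1 - out1) ** 2
E2 = 0.5 * (t2 - out2) ** 2
print(out1, out2, E1 + E2)

Backpropagation is then "just" the chain rule applied to this composite function: it works out how a small change in each w changes the total error, and nudges every weight in the direction that shrinks it.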