Note that you can run the animation multiple times by clicking on "Run" again.
As you can see, the neuron rapidly learns a weight and bias that drives down the cost, and gives an output from the neuron of about $0.09$.
In a similar way, up to now we've focused on understanding the backpropagation algorithm.
It's our "basic swing", the foundation for learning in most work on neural networks.
The example involves a neuron with just one input. We'll train this neuron to do something ridiculously easy: take the input $1$ to the output $0$.
Of course, this is such a trivial task that we could easily figure out an appropriate weight and bias by hand, without using a learning algorithm.
That's not quite the desired output, $0.0$, but it is pretty good.
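The training loop behind such an animation can be sketched in a few lines. The snippet below is a minimal illustration, not the text's actual code: the starting weight and bias, the learning rate, the quadratic cost, and the epoch count are all assumptions chosen for the sketch, since the text doesn't specify them here.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Assumed starting point and hyperparameters (illustrative only).
w, b = 0.6, 0.9   # initial weight and bias
x, y = 1.0, 0.0   # single training example: input 1, desired output 0
eta = 0.15        # learning rate
epochs = 300

for _ in range(epochs):
    a = sigmoid(w * x + b)          # neuron's output
    # Gradient of the quadratic cost C = (a - y)^2 / 2,
    # using sigma'(z) = a * (1 - a).
    delta = (a - y) * a * (1 - a)
    w -= eta * delta * x            # gradient-descent updates
    b -= eta * delta

out = sigmoid(w * x + b)
print(out)
```

Running a loop like this, the output steadily drops from its starting value toward the target $0$, mirroring the behaviour described above; the exact final value depends on the assumed starting point and learning rate.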