1. Beware of attaching too much meaning to the weights backprop finds, however.
2. That's what happens to backprop, except it climbs mountains in hyperspace instead of 3-D.
3. Then learn the connection weights using backprop, and your brand-new brain is ready to use.
4. And with backprop we can learn the appropriate weights, resulting in a successful Nike prospect detector.
5. This makes backpropagation, or simply backprop, the connectionists' master algorithm.
6. So does backprop solve the machine-learning problem?
7. Beyond that, backprop broke down.
8. Some connectionists have been overheard claiming that backprop is the Master Algorithm and we just need to scale it up.
9. Among other things, they showed that backprop can learn XOR, enabling connectionists to thumb their noses at Minsky and Papert (see the XOR sketch after this list).
10. In an early demonstration of the power of backprop, Terry Sejnowski and Charles Rosenberg trained a multilayer perceptron to read aloud.
11. In those days researchers didn't trust computer simulations; they demanded mathematical proof that an algorithm would work, and there's no such proof for backprop.
12. The Master Algorithm is neither genetic programming nor backprop, but it has to include the key elements of both: structure learning and weight learning.
13. When backprop first hit the streets, connectionists had visions of quickly learning larger and larger networks until, hardware permitting, they amounted to artificial brains.
14. If it starts at 5.5, on the other hand, backprop will roll down to 7.0 and remain stuck there (see the gradient-descent sketch after this list).
15. Backprop proceeds deterministically after setting the initial weights to small random values.
16. Backprop learns weights for a predefined network architecture; denser networks are more flexible but also harder to learn.
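Items 14 and 15 read as a unit: gradient descent is deterministic once the start point is fixed, and where it starts decides which valley of the error surface it ends up in. Here is a minimal sketch, assuming an invented one-dimensional error surface; the function, the valleys near 2.0 and 7.0, the start points, and the learning rate are all fabricated for illustration and are not the book's actual example.

```python
# Toy illustration of items 14 and 15: deterministic gradient descent on a
# made-up one-dimensional error surface with two valleys. The surface is
# constructed so that a start at 5.5 rolls into the shallow local valley
# near 7.0 instead of the deeper global valley near 2.0.
def error(w):
    # Deep valley near w = 2, shallow valley near w = 7 (invented surface).
    return 0.05 * (w - 2.0) ** 2 * (w - 7.0) ** 2 + 0.1 * (w - 2.0)

def gradient(w, eps=1e-5):
    # Central-difference numerical derivative; accurate enough for a demo.
    return (error(w + eps) - error(w - eps)) / (2.0 * eps)

def descend(w, rate=0.01, steps=5000):
    # Fully deterministic once the starting weight is fixed.
    for _ in range(steps):
        w -= rate * gradient(w)
    return w

print(round(descend(1.0), 2))  # rolls into the deep valley near 2.0
print(round(descend(5.5), 2))  # stuck in the shallow valley near 7.0
```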
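Item 9 says backprop can learn XOR, item 15 that it runs deterministically from small random initial weights, and item 16 that the network architecture is fixed in advance. The sketch below combines the three, assuming a 2-3-1 sigmoid network in NumPy; the layer sizes, learning rate, and iteration count are illustrative choices, not anything the book prescribes.

```python
import numpy as np

# Illustrative sketch (not the book's code): backprop learning XOR on a
# fixed 2-3-1 architecture, starting from small random initial weights,
# after which the run is fully deterministic.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Small random initial weights; zero biases.
W1 = rng.normal(0.0, 0.5, (2, 3)); b1 = np.zeros(3)
W2 = rng.normal(0.0, 0.5, (3, 1)); b2 = np.zeros(1)

rate = 1.0
for _ in range(10000):
    # Forward pass through the hidden and output layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the squared-error derivative layer by layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= rate * h.T @ d_out; b2 -= rate * d_out.sum(axis=0)
    W1 -= rate * X.T @ d_h;   b1 -= rate * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # should approach [0, 1, 1, 0]
```

One note on the layer size: the minimal 2-2-1 network can also learn XOR, but with an unlucky seed it settles more readily into a poor local minimum, which is exactly the failure mode items 13 and 14 describe; the extra hidden unit simply makes the demo more reliable.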