smcder
Paranormal Adept
@Soupie
And this part is interesting too ...
AI winter - Wikipedia
en.wikipedia.org
So early work on the perceptron was set back by Minsky and Papert's book criticizing the limits of the perceptron (for example, a single perceptron can model some logic gates but not an XOR gate). However, my reading of "Talking Nets: An Oral History of Neural Networks" indicates that Minsky and Papert knew that perceptrons could be linked together to form an XOR gate and that neural networks held promise for AI, and that they wrote the book to lend support to digital electronic logic gates (integrated circuits) because they were more involved with that technology (I think that's right). Some claim that set off the first "AI Winter" by killing off work on neural nets, but others say work on neural networks and backpropagation went on ... and it is only recently that it has paid off, because only recently do we have the computational power to simulate large neural networks ... but see the paper above that critiques even the newest deep learning network approaches, and consider how much less energy the human brain uses to do all sorts of things that simulated neural networks cannot.
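To make the XOR point concrete, here is a minimal sketch of my own (not from the book or the oral history): a single threshold perceptron with hand-picked weights can compute AND, OR, or NAND, but no single choice of weights works for XOR because it is not linearly separable, whereas wiring three perceptrons together in two layers does the job.

def perceptron(weights, bias, inputs):
    # Classic threshold unit: output 1 if the weighted sum plus bias is positive.
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > 0 else 0

def AND(x1, x2):
    return perceptron([1, 1], -1.5, [x1, x2])

def OR(x1, x2):
    return perceptron([1, 1], -0.5, [x1, x2])

def NAND(x1, x2):
    return perceptron([-1, -1], 1.5, [x1, x2])

def XOR(x1, x2):
    # No single perceptron can do this, but a two-layer network can:
    # XOR(a, b) = AND(OR(a, b), NAND(a, b)).
    return AND(OR(x1, x2), NAND(x1, x2))

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", XOR(x1, x2))   # prints 0, 1, 1, 0

The weights and biases above are just one convenient choice for illustration; the historical argument was about what a single layer can represent at all, not about any particular weight values.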