What does it mean to learn?
This brings us to the overriding philosophical question: what does it mean to learn? Is learning in humans simply a bunch of cascading if/else statements that modify themselves as we learn to create new combinations? Or does interacting with the environment change the arrangement of neurons and the strength of their connections, in a bottom-up fashion?

It seems as though the human brain uses a combination of the two. We do not have to learn that if we are hungry we need to eat, or that if we are tired we need to sleep; these responses appear to be hardwired into the brain. Other abilities, such as reading a book, come only from interacting with the environment and the society around an individual.

It seems likely that many future applications of neural networks will have to combine instructions hardwired into the system with a neural network. For example, the checkers machine that Arthur Samuel built did not start with a blank slate and learn how to play checkers over time; it was provided with knowledge from a checkers book--which moves tend to succeed and which fail miserably.
