Tuesday, July 17, 2007

Neural Networks and Connectionist Models

The human brain is an incredibly impressive information processor, even though its basic switching elements "work" far more slowly than those of an ordinary computer. Many researchers in artificial intelligence look to the organization of the brain as a model for building intelligent machines.
There is a rough analogy between the complex webs of interconnected neurons in a brain and the densely interconnected units making up an artificial neural network (ANN), where each unit, like a biological neuron, takes in a number of inputs and produces an output. Consider this description: "To develop a feel for this analogy, let us consider a few facts from neurobiology. The human brain is estimated to contain a densely interconnected network of approximately 10^11 neurons, each connected, on average, to 10^4 others. Neuron activity is typically excited or inhibited through connections to other neurons. The fastest neuron switching times are known to be on the order of 10^-3 seconds, quite slow compared to computer switching speeds of 10^-10 seconds.
Yet humans are able to make surprisingly complex decisions, surprisingly quickly. For example, it requires approximately 10^-1 seconds to visually recognize your mother. Notice the sequence of neuron firings that can take place during this 10^-1-second interval cannot possibly be longer than a few hundred steps, given the switching speed of single neurons. This observation has led many to speculate that the information-processing abilities of biological neural systems must follow from highly parallel processes operating on representations that are distributed over many neurons. One motivation for ANN systems is to capture this kind of highly parallel computation based on distributed representations."

[From Machine Learning, Section 4.1.1, p. 82, by Tom M. Mitchell, McGraw-Hill, 1997.]
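
The "few hundred steps" figure in the quotation is simply the recognition time divided by the switching time: 10^-1 seconds at roughly 10^-3 seconds per firing allows only about 100 sequential steps.

To make the unit-level picture concrete, here is a minimal sketch of a single ANN unit in Python. It is an illustrative example rather than code from any particular library, and it assumes one common choice of activation, the sigmoid: the unit forms a weighted sum of its inputs plus a bias and squashes the result into the range (0, 1), with positive weights playing the role of excitatory connections and negative weights the role of inhibitory ones.

import math

def sigmoid(x):
    # Squashing activation: maps any real value into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def unit_output(inputs, weights, bias):
    # One ANN unit: weighted sum of inputs plus a bias, then the activation.
    # Positive weights excite the unit; negative weights inhibit it.
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(activation)

# Hypothetical example: a unit with three inputs.
print(unit_output(inputs=[0.5, 0.2, 0.9], weights=[1.5, -2.0, 0.7], bias=0.1))

Stacking many such units in layers, with the outputs of one layer feeding the inputs of the next, is one way to obtain the kind of highly parallel computation over distributed representations that the quotation describes.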
