Monday, August 13, 2007

Kohonen Networks

Check out this page, which explains the learning algorithm very clearly, followed by a demonstration that gives a feel for how the networks behave.
http://www.cs.bham.ac.uk/~jlw/sem2a2/Web/Kohonen.htm
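
As a taste of what the page covers, here is a minimal sketch of the Kohonen learning rule (my own illustration, assuming a 1-D map of ten units over 2-D inputs; the function and parameter names are mine): each input is matched to its closest unit, and that unit and its neighbours are nudged toward the input while the learning rate and neighbourhood shrink.

import math
import random

def train_som(data, n_units=10, lr=0.5, radius=3.0, epochs=100):
    """Minimal 1-D Kohonen map over 2-D inputs: find the best-matching
    unit, then pull it and its neighbours toward the input."""
    weights = [[random.random(), random.random()] for _ in range(n_units)]
    for _ in range(epochs):
        for x in data:
            # Best-matching unit: the node whose weight vector is closest.
            bmu = min(range(n_units),
                      key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
            for i in range(n_units):
                # Gaussian neighbourhood: nodes near the winner move most.
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                for d in range(2):
                    weights[i][d] += lr * h * (x[d] - weights[i][d])
        lr *= 0.99       # decay the learning rate ...
        radius *= 0.99   # ... and shrink the neighbourhood over time
    return weights

print(train_som([[random.random(), random.random()] for _ in range(50)]))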

Monday, August 6, 2007

Meaning of the term Stochastic

Stochastic, from the Greek "stochos" or "aim, guess", means of, relating to, or characterized by conjecture and randomness. A stochastic process is one whose behavior is non-deterministic in that a state does not fully determine its next state.
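
A quick illustration of that last point (a sketch of my own, not from any particular source): in the random walk below, the next state is the current state plus a coin flip, so the current state alone never determines its successor.

import random

# A simple stochastic process: from any state, the next state is
# state + 1 or state - 1 with equal probability, so knowing the
# current state never fixes the next one.
state, path = 0, [0]
for _ in range(10):
    state += random.choice((-1, 1))
    path.append(state)
print(path)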

Stochastic Neural Networks

Stochastic neural networks are a type of artificial neural network, a tool of artificial intelligence. They are built by introducing random variations into the network, either by giving the network's neurons stochastic transfer functions or by giving them stochastic weights. This makes them useful for optimization problems, since the random fluctuations help the network escape from local minima. Stochastic neural networks built from stochastic transfer functions are often called Boltzmann machines.
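
As a minimal sketch of the first approach (my own illustration; the function and parameter names are mine), a stochastic binary neuron fires with a probability given by the logistic function of its net input, scaled by a temperature parameter as in a Boltzmann machine:

import math
import random

def stochastic_neuron(inputs, weights, bias=0.0, temperature=1.0):
    """Fire (return 1) with probability sigmoid(net / T): a stochastic
    transfer function of the kind used in Boltzmann machines."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    p_fire = 1.0 / (1.0 + math.exp(-net / temperature))
    return 1 if random.random() < p_fire else 0

# The same input can produce different outputs on different calls:
print([stochastic_neuron([1, 0, 1], [0.5, -0.3, 0.2]) for _ in range(10)])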

Simulated Annealing

Simulated annealing (SA) is a generic probabilistic meta-algorithm for the global optimization problem, namely locating a good approximation to the global optimum of a given function in a large search space.

The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one.

By analogy with this physical process, each step of the SA algorithm replaces the current solution by a random "nearby" solution; if the new solution is better, it is chosen, whereas if it is worse, it can still be chosen with a probability that depends on the difference between the corresponding function values and on a global parameter T (called the temperature), which is gradually decreased during the process. The dependency is such that the current solution changes almost randomly when T is large, but increasingly "downhill" as T goes to zero. The allowance for "uphill" moves saves the method from becoming stuck at local minima, which are the bane of greedier methods.
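
To make the acceptance rule concrete, here is a minimal sketch in Python (my own illustration; the test function and cooling schedule are arbitrary choices): it minimizes a bumpy one-dimensional function, always accepting better neighbours and accepting worse ones with probability exp(-delta / T) while T decays geometrically.

import math
import random

def simulated_annealing(f, x, t=10.0, cooling=0.95, steps=1000):
    """Minimize f starting from x: a toy illustration of SA."""
    energy = f(x)
    best_x, best_e = x, energy
    for _ in range(steps):
        neighbour = x + random.uniform(-1.0, 1.0)   # a random "nearby" solution
        delta = f(neighbour) - energy
        # Better moves are always taken; worse ("uphill") moves are taken
        # with probability exp(-delta / t), which shrinks as t cools.
        if delta < 0 or random.random() < math.exp(-delta / t):
            x, energy = neighbour, energy + delta
            if energy < best_e:
                best_x, best_e = x, energy
        t *= cooling  # gradually lower the temperature
    return best_x

# A bumpy function with many local minima; the uphill moves let SA
# wander out of them while the temperature is still high.
print(simulated_annealing(lambda x: x * x + 10 * math.sin(3 * x), x=8.0))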

Sunday, August 5, 2007

Statistical Learning - Bayesian Logic

Named for Thomas Bayes, an English clergyman and mathematician, Bayesian logic is a branch of logic applied to decision making and inferential statistics that deals with probability inference: using the knowledge of prior events to predict future events.

Bayes' Theorem is a means of quantifying uncertainty. Based on probability theory, the theorem defines a rule for refining a hypothesis by factoring in additional evidence and background information, and leads to a number representing the degree of probability that the hypothesis is true. To demonstrate an application of Bayes' Theorem, suppose that we have a covered basket that contains three balls, each of which may be green or red. In a blind test, we reach in and pull out a red ball. We return the ball to the basket and try again, again pulling out a red ball. Once more, we return the ball to the basket and pull out a ball: red again. We form the hypothesis that all the balls are, in fact, red. Bayes' Theorem can be used to calculate the probability (p) that all the balls are red (an event labeled "A") given (symbolized as "|") that all the selections have been red (an event labeled "B"):

p(A|B) = p(A and B) / p(B)

If each ball is independently as likely to be green as red, there are eight equally likely compositions of the basket (RRR, RRG, RGR, GRR, RGG, GRG, GGR, GGG), so the chance that all the balls are red is 1/8. Averaging over those compositions, the probability that three selections in a row all come up red is 1/4, while in 1/8 of all possible outcomes all the balls are red AND all the selections are red. Bayes' Theorem therefore gives the probability that all the balls in the basket are red, given that all the selections have been red, as (1/8)/(1/4) = .5 (probabilities are expressed as numbers between 0. and 1., with "1." indicating 100% probability and "0." indicating zero probability).
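
The arithmetic can be checked by brute-force enumeration; here is a short sketch (my own, not part of the original example) that walks over all eight basket compositions:

import itertools

# Enumerate the eight equally likely compositions of the basket
# (each ball independently red 'R' or green 'G').
p_b = 0.0        # P(three selections are all red)
p_a_and_b = 0.0  # P(basket is all red AND three selections all red)

for basket in itertools.product("RG", repeat=3):
    prior = 1 / 8                  # each composition equally likely
    p_red = basket.count("R") / 3  # chance one draw (with replacement) is red
    p_three_red = p_red ** 3       # three independent draws
    p_b += prior * p_three_red
    if basket == ("R", "R", "R"):
        p_a_and_b += prior * p_three_red

print(p_a_and_b / p_b)  # p(A|B) = (1/8) / (1/4) = 0.5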

Thursday, August 2, 2007

Preliminary Study Material

Just check this out. It gives a really simple explanation with good examples to get a preliminary grasp of the subject.

http://richardbowles.tripod.com/neural/neural.htm

Wednesday, August 1, 2007

Quiz

Check this out.
This is a small quiz on single-layer perceptrons.
Interesting because it is easy!

http://www.cs.bham.ac.uk/~rxb/HTML_text/nnets/3/3.html#9