Principles of Artificial Intelligence Fall 2005 Handout

We start by introducing an extremely simplified model of a biological neuron, based on the early work of McCulloch and Pitts. This model is variously referred to as the McCulloch-Pitts neuron, the threshold neuron, the threshold logic unit (TLU), and the perceptron.

What follows is a caricature of how nervous systems work. A nervous system is an organized network of neurons. A typical neuron consists of three parts: the dendrites, the cell body, and the axon (also called the nerve fiber). The dendrites carry nerve signals toward the cell body, while the axon carries the signal away from the cell body. Neural circuits are formed from groups of neurons arranged so that the end branches of one neuron's axon lie close to the dendrites of another neuron. In the human brain, each neuron can interact with thousands of other neurons. The point of contact between two neurons is called a synapse; a small microscopic gap between the two neurons exists at each synapse. It is known that the ease of neural signal transmission across the synapse is altered by activity in the nervous system, a possible mechanism for learning. When a neuron receives input from other neurons, the electro-chemical processes involved cause its voltage to increase. When the voltage exceeds a certain threshold, a volley of nerve impulses travels down the axon (thereby stimulating other neurons, and so on); the neuron is then said to fire.

In the McCulloch-Pitts model, numerical parameters called weights w1, ..., wn model the strength of the synaptic coupling between neurons, the inputs are modeled by x1, ..., xn, and the threshold is modeled by T: the unit fires (outputs 1) when the weighted sum w1 x1 + ... + wn xn reaches or exceeds T, and stays silent otherwise. For mathematical convenience, T is replaced by a weight -w0 (with w0 = T) attached to a fictitious input x0 which is always constant (typically equal to unity), so that the firing condition w1 x1 + ... + wn xn >= T becomes -w0 x0 + w1 x1 + ... + wn xn >= 0.
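To make the threshold rule concrete, here is a minimal sketch in Python of a threshold logic unit using the fictitious-input convention above; the function name tlu and the AND weights chosen in the example (T = 1.5, so the weight on x0 is -1.5) are illustrative choices, not values given in the handout.

# Minimal sketch of a McCulloch-Pitts / threshold logic unit (TLU).
# weights[0] plays the role of -T (the negated threshold) and is
# paired with the fixed fictitious input x0 = 1.
def tlu(weights, inputs):
    x = [1] + list(inputs)                        # prepend x0 = 1
    s = sum(w * xi for w, xi in zip(weights, x))  # weighted sum, threshold included
    return 1 if s >= 0 else 0                     # fire exactly when the sum is non-negative

# Illustrative example: logical AND with T = 1.5, w1 = w2 = 1.
and_weights = [-1.5, 1.0, 1.0]
for a in (0, 1):
    for b in (0, 1):
        print(a, b, tlu(and_weights, [a, b]))     # prints 1 only for (1, 1)

The same unit computes OR if the threshold is lowered (for example T = 0.5 with the same input weights), which previews how the choice of weights and threshold determines the logical function the unit realizes.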