Neural circuit
A neural circuit is a population of neurons interconnected by synapses that carries out a specific function when activated. Multiple neural circuits interconnect with one another to form large-scale brain networks. In 1943, Warren Sturgis McCulloch and Walter Pitts published the first work on the processing of neural networks, showing theoretically that networks of artificial neurons could implement logical, arithmetic, and symbolic functions. The first rule of neuronal learning, Hebbian theory, was described by Donald Hebb in 1949.

The connections between neurons in the brain are much more complex than those of the artificial neurons used in connectionist computing models such as artificial neural networks. The basic kinds of connections between neurons are synapses, both chemical and electrical. The establishment of synapses enables the connection of neurons into millions of overlapping and interlinking neural circuits.

Backpropagating action potentials cannot travel back along the axon because, once an action potential has passed through a given segment, the voltage-gated sodium channels of that segment are inactivated and refractory, blocking any transient reopening. In some cells, however, neural backpropagation does occur through the dendritic branching.
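As a minimal illustration of the McCulloch-Pitts result mentioned above, the sketch below implements a single threshold unit in Python and wires it into the AND, OR, and NOT functions. The weights and thresholds are illustrative choices, not values taken from the 1943 paper.

```python
# A McCulloch-Pitts-style threshold unit: the neuron fires (outputs 1) when the
# weighted sum of its binary inputs reaches a threshold. The weight and
# threshold values below are illustrative choices that realize AND, OR, and NOT.

def mcculloch_pitts(inputs, weights, threshold):
    """Return 1 if the weighted sum of binary inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

def AND(a, b):
    return mcculloch_pitts((a, b), weights=(1, 1), threshold=2)

def OR(a, b):
    return mcculloch_pitts((a, b), weights=(1, 1), threshold=1)

def NOT(a):
    # An inhibitory input (weight -1) with threshold 0 inverts the signal.
    return mcculloch_pitts((a,), weights=(-1,), threshold=0)

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
    print("NOT 0:", NOT(0), "NOT 1:", NOT(1))
```

Running the script prints the truth tables, showing that the same unit realizes different logical functions purely through its choice of weights and threshold.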
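Hebb's rule is often summarized as "cells that fire together wire together": a synaptic weight increases in proportion to the joint activity of the presynaptic and postsynaptic neurons, Δw = η · pre · post. The sketch below is a minimal illustration under assumed values (a learning rate of 0.1 and a postsynaptic neuron driven together with the first input); it is not a model taken from the sources above.

```python
# Minimal Hebbian learning sketch: a weight grows when its presynaptic input
# and the postsynaptic neuron are active at the same time.

import numpy as np

rng = np.random.default_rng(0)

n_inputs = 4
weights = np.zeros(n_inputs)   # synaptic weights, initially zero
learning_rate = 0.1            # illustrative step size

for step in range(50):
    pre = rng.integers(0, 2, size=n_inputs)  # random binary presynaptic activity
    post = pre[0]                            # postsynaptic neuron driven together with input 0
    weights += learning_rate * pre * post    # Hebbian update: strengthen co-active synapses

print("learned weights:", weights)
```

Running it shows the weight of the synapse whose input is correlated with the postsynaptic neuron growing roughly twice as fast as the weights of the uncorrelated inputs, which strengthen only through chance co-activity.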