Hebb’s rule: a neuroscience theory stating that an increase in synaptic efficacy arises from the presynaptic cell’s repeated and persistent stimulation of the postsynaptic cell (…) Hebbian theory concerns how neurons might connect themselves to become engrams (the means by which memories are stored, i.e. biophysical/biochemical changes in the brain in response to external stimuli) (…) The theory attempts to explain associative or Hebbian learning, in which simultaneous activation of cells leads to pronounced increases in synaptic strength between those cells, and it provides a biological basis for errorless learning methods in education and memory rehabilitation. In the study of neural networks in cognitive function, it is often regarded as the neuronal basis of unsupervised learning.
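The coincidence-based strengthening described above can be sketched numerically. This is a minimal toy illustration (all names and values are hypothetical, not from the source): a linear neuron's weights grow in proportion to the product of presynaptic input and postsynaptic activity, so repeatedly presented inputs strengthen exactly the synapses they drive.

```python
import numpy as np

def hebbian_update(w, pre, eta=0.1):
    """One Hebbian step: dw = eta * post * pre (outer product)."""
    post = w @ pre                      # postsynaptic activation (linear neuron)
    return w + eta * np.outer(post, pre)

w = np.full((2, 3), 0.1)                # small initial synaptic weights
x = np.array([1.0, 0.0, 1.0])           # repeated presynaptic pattern
for _ in range(5):                      # persistent stimulation
    w = hebbian_update(w, x)

# Synapses driven by the active inputs (columns 0 and 2) have strengthened;
# the synapse for the silent input (column 1) is unchanged.
print(w)
```

Note the purely local character of the rule: each weight change depends only on the two cells it connects, which is why Hebbian learning is viewed as a basis for unsupervised learning.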
Back-propagation: a method used in artificial neural networks [computing systems inspired by the biological neural networks that constitute animal brains; these systems learn to perform tasks by considering examples] to calculate the error contribution of each neuron after a batch of data (in image recognition, multiple images) is processed (…) Backpropagation is the standard training algorithm for deep learning, a term used to describe neural networks with more than one hidden layer (layers not dedicated to input or output).
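The "error contribution of each neuron" can be made concrete with a tiny two-layer network. This is a hedged sketch under assumed details (sigmoid activations, squared-error loss, random weights; none of this comes from the source): the output error is propagated backwards through the weights to assign each hidden neuron its own error term (delta), from which the weight gradients follow.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))            # input -> hidden weights
W2 = rng.normal(size=(1, 3))            # hidden -> output weights
x = np.array([0.5, -0.2])
target = np.array([1.0])

# Forward pass
h = sigmoid(W1 @ x)                     # hidden activations
y = sigmoid(W2 @ h)                     # network output

# Backward pass: per-neuron error contributions (deltas)
delta_out = (y - target) * y * (1 - y)          # output-layer error
delta_hid = (W2.T @ delta_out) * h * (1 - h)    # hidden error via chain rule

# Gradients of the squared-error loss w.r.t. each weight matrix
grad_W2 = np.outer(delta_out, h)
grad_W1 = np.outer(delta_hid, x)
```

Each delta summarizes how much that neuron's activation contributed to the final error, which is exactly the quantity the definition above refers to.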
Boltzmann machine: a type of stochastic recurrent neural network [a stochastic or random process is a mathematical object usually defined as a collection of random variables] (…) They were one of the first neural networks capable of learning internal representations, and are able to represent and (given sufficient time) solve difficult combinatorial problems (…) Boltzmann machines with unconstrained connectivity have not proven useful for practical problems in machine learning or inference, but if the connectivity is properly constrained, the learning can be made efficient enough to be useful for practical problems.
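The "stochastic" part of the definition can be sketched as follows. In this minimal illustration (sizes, seed, and variable names are hypothetical), each binary unit turns on with probability sigma(Σⱼ wᵢⱼ sⱼ + bᵢ) given the current states of the others; repeating these random updates (Gibbs sampling) lets the network explore states, which is how it can settle toward solutions of hard combinatorial problems given sufficient time.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
W = rng.normal(scale=0.5, size=(n, n))
W = (W + W.T) / 2                       # Boltzmann machines use symmetric weights
np.fill_diagonal(W, 0.0)                # and no self-connections
b = np.zeros(n)                         # unit biases
s = rng.integers(0, 2, size=n).astype(float)   # random initial binary states

def sigma(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(100):                    # Gibbs sweeps over all units
    for i in range(n):
        p_on = sigma(W[i] @ s + b[i])   # P(unit i = 1 | other units)
        s[i] = float(rng.random() < p_on)
```

With unconstrained connectivity this sampling mixes too slowly to be practical, which motivates the constrained variants mentioned above (e.g. restricting connections to a bipartite layout).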