The foundation of the complex neural networks we have today is the perceptron.

A perceptron is a theoretical model of a neural system that explains how sensory and memory information about the environment is stored, influencing behavior and recognition (Rosenblatt, 1958).

A perceptron takes several binary inputs x1,x2,…, and produces a single binary output:

Screenshot 2024-12-23 at 18.09.29.png

Image. Visualization of a perceptron (Nielsen, 2015)

In his 1958 paper, Rosenblatt introduced weights, w1, w2, …, to the perceptron model (Rosenblatt, 1958).

Weights are real numbers that represent the significance of each respective input to the output. The perceptron calculates a weighted sum: the total obtained by multiplying each input value (xj) by its corresponding weight (wj) and adding the products (Nielsen, 2015).

Screenshot 2024-12-23 at 18.21.53.png

Image. Mathematical equation of a weighted sum in a perceptron

Latex code for the equation:

\text{Weighted Sum} = \sum_{j=1}^{n} w_j x_j = w_1 x_1 + w_2 x_2 + w_3 x_3 + \dots + w_n x_n
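The weighted sum can be sketched in a few lines of Python. The inputs and weights below are illustrative values chosen for the example, not taken from the original text:

```python
# Binary inputs x_1, x_2, x_3 and hand-picked example weights w_1, w_2, w_3
inputs = [1, 0, 1]
weights = [0.6, 0.4, 0.3]

# Weighted sum: the total of w_j * x_j over all j
weighted_sum = sum(w * x for w, x in zip(weights, inputs))
print(weighted_sum)
```

Because the inputs are binary, each weight either contributes fully to the sum (when its input is 1) or not at all (when its input is 0).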

A threshold is a real number that determines whether the perceptron's output is 0 or 1.

This behavior can be expressed by the following equation (Nielsen, 2015):

Screenshot 2024-12-23 at 18.27.18.png

Image. Mathematical equation describing how a perceptron determines its output based on whether the weighted sum of inputs exceeds a threshold

Latex code for the equation:

\text{output} =
\begin{cases}
0 & \text{if } \sum_{j} w_j x_j \leq \text{threshold} \\
1 & \text{if } \sum_{j} w_j x_j > \text{threshold}
\end{cases}
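Putting the weighted sum and the threshold together gives the complete perceptron rule. Below is a minimal sketch in Python; the weights and threshold are illustrative choices (not from the original text), picked so that the perceptron behaves like a logical AND gate:

```python
def perceptron(inputs, weights, threshold):
    # Compute the weighted sum of the inputs
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    # Output 1 only if the weighted sum exceeds the threshold, else 0
    return 1 if weighted_sum > threshold else 0

# With weights (1, 1) and threshold 1.5, only the input (1, 1)
# pushes the weighted sum above the threshold, so the unit acts as AND.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", perceptron([x1, x2], [1, 1], 1.5))
```

Changing the weights and threshold changes which input patterns produce a 1, which is exactly how a perceptron weighs evidence.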

By weighing up different pieces of evidence in this way, the perceptron produces its output.