HyperAI

Threshold Logic Unit

The Threshold Logic Unit (TLU) is the basic building block of a neural network. Its schematic structure is described below.

Each input value is multiplied by its corresponding weight, and the products are summed. If the sum exceeds the TLU's threshold, the output is 1; otherwise, the output is 0. A single TLU can only perform simple computations; to form a neural network, many such TLU components must be connected together.
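The weighted-sum-and-threshold rule above can be sketched in a few lines of Python. The specific weights and threshold here are illustrative assumptions chosen so the unit acts as a logical AND on binary inputs; they are not taken from the original text.

```python
def tlu(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs exceeds the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# Illustrative parameters (assumed, not from the text): with weights
# [0.5, 0.5] and threshold 0.7, the unit fires only when both inputs are 1.
print(tlu([1, 1], [0.5, 0.5], 0.7))  # weighted sum 1.0 > 0.7, outputs 1
print(tlu([1, 0], [0.5, 0.5], 0.7))  # weighted sum 0.5 <= 0.7, outputs 0
```

Changing the weights or threshold yields other simple logic functions, which hints at why connecting many such units produces a more expressive network.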

The threshold logic unit can be viewed as a network of two layers of neurons: the input layer receives external inputs and passes the weighted signal to the output layer, where a threshold-based activation function produces the result. It is this thresholding step that gives the "threshold logic unit" its name.