
Normalization

Normalization maps data to a specified range in order to remove the units and scales of data measured in different dimensions, improving the comparability of different data indicators. Common mapping ranges are [0, 1] and [-1, 1].

Normalization algorithm

  • Linear transformation: $y = (x - \min) / (\max - \min)$
  • Logarithmic transformation: $y = \log_{10}(x)$
  • Inverse tangent (arctangent) transformation: $y = \arctan(x) \cdot 2 / \pi$
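
A minimal NumPy sketch of the three transforms listed above (the function names and sample data are illustrative, not from any particular library):

```python
import numpy as np

def min_max_normalize(x):
    """Linear transformation: map values to [0, 1]."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def log_normalize(x):
    """Logarithmic transformation: y = log10(x); assumes x > 0."""
    x = np.asarray(x, dtype=float)
    return np.log10(x)

def arctan_normalize(x):
    """Arctangent transformation: y = arctan(x) * 2 / pi, maps to (-1, 1)."""
    x = np.asarray(x, dtype=float)
    return np.arctan(x) * 2 / np.pi

data = np.array([1.0, 10.0, 100.0, 1000.0])
print(min_max_normalize(data))  # roughly [0, 0.009, 0.099, 1]
print(log_normalize(data))      # [0. 1. 2. 3.]
print(arctan_normalize(data))   # values approach 1 as x grows
```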

Comparison of normalization and other algorithms

The main data processing methods in current use are normalization, standardization, and regularization; a short sketch contrasting the three follows the list below.

  • Normalization removes the units and scales of different data so that they can be compared and processed together;
  • Standardization applies a transformation such as rescaling (for example, to zero mean and unit variance) to prepare the data for the next processing step;
  • Regularization uses prior knowledge to introduce a regularization term during processing, adding a guiding constraint.
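
To make the distinction concrete, here is a small NumPy sketch: min-max normalization rescales to [0, 1], standardization produces zero mean and unit variance, and regularization adds a penalty term to a loss. All values, including the loss and the weight vector, are hypothetical and used only for illustration.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# Normalization: rescale to [0, 1] to remove the effect of units and scales
x_norm = (x - x.min()) / (x.max() - x.min())   # [0, 0.25, 0.5, 0.75, 1]

# Standardization: shift and scale to zero mean and unit variance
x_std = (x - x.mean()) / x.std()

# Regularization: add a penalty (prior constraint) to a loss, e.g. L2 on weights w
w = np.array([0.5, -1.2, 0.3])   # hypothetical model weights
data_loss = 0.42                 # hypothetical loss value from some model
lam = 0.01                       # regularization strength
total_loss = data_loss + lam * np.sum(w ** 2)
```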
Related terms: standardization, regularization