
Floating Point Operations Per Second (FLOPS)

Floating point operations per second (FLOPS) is a measure of computer performance based on the number of floating-point arithmetic calculations a processor can perform in one second. Floating-point arithmetic is the term used in computing for calculations performed on floating-point representations of real numbers. FLOPS is a hardware performance metric; informally, it is the "computing power" of a processor or graphics card, and it is the figure behind the "GPU computing power ranking" published on NVIDIA's official website.

Floating point numbers enable computers to handle very large and very small real numbers at varying degrees of precision. They are usually expressed in base 2 rather than base 10, using a notation similar to scientific notation. A floating point number is typically represented by the following formula (or something similar):

Sign × Mantissa × Radix^Exponent

The sign indicates whether the number is positive or negative; its value is +1 or -1. The mantissa holds the significant digits, usually written as a decimal number such as 6.901. The radix is the number base, which in binary floating-point representation is 2. The exponent is the power to which the radix is raised. For example, the following floating-point expression represents the decimal number -51,287,949.31:

-1 × 3.057 × 2^24
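
As a quick check, the expression above can be evaluated directly, and Python's standard library offers a similar sign-mantissa-exponent decomposition via math.frexp. The sketch below is only illustrative; the variable names are assumptions of the example, and only math.frexp is taken from the standard library.

```python
import math

# Evaluate the example above: sign * mantissa * radix^exponent
sign, mantissa, radix, exponent = -1, 3.057, 2, 24
value = sign * mantissa * radix ** exponent
print(value)  # -51287949.312, i.e. roughly -51,287,949.31

# math.frexp decomposes a float into mantissa * 2^exponent,
# with the mantissa's magnitude normalized to the interval [0.5, 1).
m, e = math.frexp(value)
print(m, e)        # approximately -0.76425 and 26
print(m * 2 ** e)  # reconstructs the original value
```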

Use of FLOPS

Performing workloads such as scientific computing, advanced analytics, or 3D graphics processing often requires floating-point operations. Computers running these workloads are often measured in FLOPS, which provides a way to measure a computer's performance and compare it to other computers.

When FLOPS is used as a performance metric, it is often expressed with a prefix multiplier such as tera (10^12) or peta (10^15), as in teraFLOPS or petaFLOPS.
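
As a rough illustration of how such figures can be obtained in practice, the sketch below times a large matrix multiplication with NumPy and reports the achieved rate in GFLOPS. This is a simplified microbenchmark under assumptions of the example (the matrix size and the 2·n³ operation count), not any vendor's official measurement method.

```python
import time
import numpy as np


def measure_gflops(n=2048, repeats=5):
    """Time an n x n matrix multiplication and report achieved GFLOPS."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    # A dense n x n matrix multiply performs roughly 2 * n^3 floating
    # point operations (n multiplies and about n adds per output element).
    flops_per_matmul = 2 * n ** 3

    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        a @ b
        best = min(best, time.perf_counter() - start)

    return flops_per_matmul / best / 1e9  # giga-FLOPS


if __name__ == "__main__":
    # A teraFLOPS-class device would report on the order of 1000 GFLOPS here.
    print(f"Achieved: {measure_gflops():.1f} GFLOPS")
```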
