A library of parameterized floating-point modules and their use

P. Belanovic and M. Leeser

Nominated by: vaughnbetz@gmail.com

This paper introduced an open-source library of floating-point cores with parameterized precision (mantissa and exponent widths), which much subsequent research has built upon. Custom floating-point representations have become even more important recently with the advent of machine learning (especially CNNs) using reduced-precision formats, such as the ms-fp8 and ms-fp9 representations used in Microsoft Brainwave.
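To make "parameterized precision" concrete, here is a minimal Python sketch (not the library's HDL, and not any particular vendor format) that rounds a value to the nearest number representable with a chosen exponent width and mantissa width. It handles normal numbers only; subnormals, infinities, and NaNs are omitted for brevity, and the function name and signature are illustrative assumptions.

```python
import math

def quantize(x: float, exp_bits: int, man_bits: int) -> float:
    """Round x to the nearest value representable with the given
    exponent and mantissa widths (normals only in this sketch)."""
    if x == 0.0:
        return 0.0
    bias = (1 << (exp_bits - 1)) - 1
    sign = -1.0 if x < 0 else 1.0
    e = math.floor(math.log2(abs(x)))   # unbiased exponent of x
    e = max(min(e, bias), 1 - bias)     # clamp to the normal range
    frac = abs(x) / 2.0 ** e            # significand, in [1, 2)
    # keep man_bits fractional bits of the significand
    frac = round(frac * 2 ** man_bits) / 2 ** man_bits
    return sign * frac * 2.0 ** e
```

With a wide format such as (8 exponent bits, 23 mantissa bits) the result is close to IEEE single precision, while a narrow format such as (4, 3) visibly coarsens values, which is the trade-off these parameterized cores let a designer explore per application.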
