• Jayjader · 2 months ago

    I wonder what other applications this might have outside of machine learning. I don’t know if, for example, intensive 3D games absolutely need 16-bit floats (or larger), or if it would make sense to try using this “additive implementation” for their floating point multiplications as well. Modern desktop gaming PCs can easily slurp up to 800W.
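
    For what it’s worth, here’s a very rough sketch in C of the general “multiply by adding” idea (not necessarily the exact scheme from the article): the bit pattern of a positive IEEE-754 float is approximately a scaled, biased log2 of its value, so adding two bit patterns and subtracting the bit pattern of 1.0f lands close to the bit pattern of the product.

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Approximate a * b for positive, normal floats using only integer
     * add/subtract on the raw bit patterns. The bit pattern acts as a
     * rough log2, so adding patterns multiplies the values. The mantissa
     * approximation costs some accuracy, so this gives a rough product,
     * not a drop-in replacement for a real FP multiply. */
    static float approx_mul(float a, float b) {
        const uint32_t one = 0x3F800000u;   /* bit pattern of 1.0f */
        uint32_t ia, ib, ir;
        memcpy(&ia, &a, sizeof ia);
        memcpy(&ib, &b, sizeof ib);
        ir = ia + ib - one;                 /* one add, one subtract, no FP multiply */
        float r;
        memcpy(&r, &ir, sizeof r);
        return r;
    }

    int main(void) {
        printf("approx: %f   exact: %f\n", approx_mul(3.7f, 2.9f), 3.7f * 2.9f);
        return 0;
    }
    ```

    No idea whether that kind of trick would actually save power on a modern GPU, where the multipliers are already heavily optimized, but that’s the shape of the idea.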

    • DdCno1@beehaw.org · 2 months ago

      I have a vague memory of a 3D engine from the 1990s using an approach like this, but I’m not entirely sure.

      • IrritableOcelot@beehaw.org · 2 months ago

        I think you’re thinking of the famous fast inverse square root algorithm from Quake (sketched at the end of this comment).

        With respect to the top comment, the only reason 3D graphics are possible (even at 800W of power consumption) is that they take a bunch of shortcuts and approximations, like culling of polygons. If it’s a reasonable shortcut, it either has been or will be taken.
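
        For reference, that Quake trick looks roughly like this (paraphrased, with memcpy instead of the original’s pointer cast to avoid undefined behavior; the magic constant and the single Newton-Raphson step are the famous bits):

        ```c
        #include <math.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <string.h>

        /* Fast inverse square root: the float's bit pattern behaves like a
         * scaled log2, so (i >> 1) halves it (square root) and subtracting
         * from the magic constant negates it (reciprocal) while correcting
         * the bias; one Newton-Raphson step then tightens the estimate. */
        static float fast_rsqrt(float x) {
            float half = 0.5f * x;
            uint32_t i;
            memcpy(&i, &x, sizeof i);
            i = 0x5F3759DFu - (i >> 1);       /* initial guess via bit trickery */
            float y;
            memcpy(&y, &i, sizeof y);
            y = y * (1.5f - half * y * y);    /* one refinement step */
            return y;
        }

        int main(void) {
            printf("fast: %f   exact: %f\n", fast_rsqrt(2.0f), 1.0f / sqrtf(2.0f));
            return 0;
        }
        ```

        Same family of shortcut as the “add the bit patterns instead of multiplying” idea above: trade a little accuracy for much cheaper operations.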