Definitions
from The American Heritage® Dictionary of the English Language, 4th Edition
 adj. Of, relating to, or being a method of writing numeric quantities with a mantissa representing the value of the digits and a characteristic indicating the power of the number base, such as 3 × 10⁵.
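The mantissa-and-characteristic decomposition in the definition can be illustrated in code. As a minimal sketch, Python's `math.frexp` splits a float into a mantissa and an exponent, though in base 2 rather than the base 10 of the 3 × 10⁵ example:

```python
import math

# Decompose a float into (mantissa, exponent) such that
# value == mantissa * 2**exponent, with mantissa in [0.5, 1).
mantissa, exponent = math.frexp(300000.0)

assert 0.5 <= mantissa < 1
assert mantissa * 2**exponent == 300000.0
```

This is how floating-point hardware stores numbers internally: a fixed-width significand scaled by a power of the base.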
Examples

Results are frequently expressed in "petaflops," indicating that a quadrillion floating-point operations—a type of scientific calculation—are completed each second.

Based on data submitted so far, Mr. Dongarra said, the Sunway machine has "credible" sustained performance of about 795 teraflops, or trillion floating-point operations per second, a measure of scientific calculations based on standard tests.

Unlike today's supercomputers, with speeds that are measured in petaflops (a quadrillion sustained floating-point operations per second), the next generation will be measured in exaflops (a quintillion, or one million trillion, floating-point operations per second).
Eric D. Isaacs: Why America Must Win the Supercomputing Race

In fact, I've rambled at length on my theories of subgenre boundaries; you see, subgenres are floating-point variables, not binaries.
MIND MELD: What's Your Favorite Sub-Genre of Science Fiction and/or Fantasy?

For example, Excel uses floating-point math, which can sometimes give people disconcerting results.
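The "disconcerting results" the Excel quote alludes to come from binary floating-point rounding. A minimal Python sketch of the classic surprise:

```python
import math

# 0.1 and 0.2 have no exact binary representation, so their sum
# is not exactly 0.3:
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# Comparing with a tolerance avoids the surprise:
print(math.isclose(0.1 + 0.2, 0.3))  # True
```

Spreadsheets mask this with display rounding, which is why the raw result can feel disconcerting when it leaks through.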

The way you add two integers is quite different from the way you add two floating-point numbers, or a float and an integer, or two matrices.

You have to manage your own storage, and it's up to you to preserve type safety; code sequences such as 123 3.0 + (integer addition when there's a floating-point number on top of the stack) have undefined behaviour.

Intel's Pentium chips sported a tiny error in floating-point calculation that led to a product recall.

In economics, the parallel is this: If the unitary cost of technology ("per megabyte" or "per megabit per second" or "per thousand floating-point operations per second") is halving every 18 months, when does it come close enough to zero to say that you've arrived and can safely round down to nothing?
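The halving arithmetic in the quote above can be sketched as simple exponential decay (the `unit_cost` helper here is hypothetical, not from the source): the cost approaches zero but never reaches it, which is the point of the rhetorical question.

```python
# Unit cost after n 18-month halving periods: cost0 * 0.5**n.
def unit_cost(cost0, periods):
    return cost0 * 0.5 ** periods

# Ten years is 10 / 1.5 ≈ 6.67 halvings, so a $1.00 unit cost
# falls below a penny -- but is still strictly positive:
print(unit_cost(1.00, 10 / 1.5))

# Even after 100 halvings the cost is tiny but never exactly zero:
print(unit_cost(1.00, 100) > 0)  # True
```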