Statistics in charts and tables are reported in a variety of units. This article explains the units in general terms. A more technical explanation is available here.
## Flops, GFlops, and TFlops

FLOPS is a standard measure of computing power: "Floating Point Operations Per Second". A GigaFlop (or GFlop) is a billion FLOPS. A TeraFlop (or TFlop) is a trillion FLOPS. So a 1 FLOP machine will do one "operation" in a second, a 1 GFlop machine will do a billion operations in a second, and a 2 GFlop machine will do two billion operations in a second. That is, by this measure, a 2 GFlop machine does twice as much computing work in the same time as a 1 GFlop machine.

## G-hours

A G-hour is the measure of computing work done by a 1 GFLOP machine in an hour (one billion operations per second times 3,600 seconds in an hour, which equals 3.6 trillion floating point operations, which is a lot of computing). So in an hour's work on a volunteer computing project, a machine that operates at 1 GFLOP will do one G-hour of work, while the same one-hour contribution from a machine that operates at 3 GFLOPS (i.e., a faster computer) would complete three G-hours of work.

## CPU Hours and CPU Years

A CPU is a computer's processor ("Central Processing Unit"). As with G-hours, note that a CPU Hour is not a direct measure of time. Because some computers are faster than others, CPU Hours are reported in terms of a 1 GFLOP reference machine. That is, an hour of work on a machine twice as fast as the reference machine is credited as two CPU Hours. A CPU Year is, similarly, a year's worth of such CPU Hours on the reference machine.

## Credits

In place of G-hours, CPU Hours, or CPU Years, you will in some places see the term "Credits". One Credit is 1/100 of a day (14.4 minutes) of computing time on a 1 GFLOP reference machine.
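The unit conversions above can be sketched in a few lines of Python. This is an illustrative sketch only; the function names are made up for this example and are not part of any project's API:

```python
GFLOP = 1e9  # one GigaFlop = a billion floating point operations per second

def g_hours(machine_gflops, hours):
    """Work done, normalized to a 1 GFLOP reference machine."""
    return machine_gflops * hours

def credits_from_g_hours(gh):
    """One Credit = 1/100 of a day (14.4 minutes) on the reference machine."""
    hours_per_credit = 24 / 100  # 0.24 hours = 14.4 minutes
    return gh / hours_per_credit

# One G-hour is 1 GFLOP sustained for 3,600 seconds:
ops_per_g_hour = 1 * GFLOP * 3600   # 3.6 trillion operations

# A 3 GFLOPS machine running for 1 hour does 3 G-hours of work:
work = g_hours(3, 1)                # 3 G-hours

# 0.24 G-hours (14.4 minutes on the reference machine) is one Credit:
c = credits_from_g_hours(0.24)      # 1 Credit
```

The same normalization applies to CPU Hours: an hour on a 2 GFLOPS machine would be `g_hours(2, 1)`, i.e., two CPU Hours against the 1 GFLOP reference.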