
    QueryPerformanceCounter on 5570 versus 5160 CPU

      I am running an identical piece of code on 2 different machines.

      All the code does is call the QueryPerformanceFrequency function and print out the result.

      1st: Dell 1950 with 2 dual-core Xeon 5160s running Server 2003.

      Current Frequency 2993000000

      2nd: Dell R610 with 2 quad-core Xeon 5570s running Server Core 2008 R2.

      Current Frequency 2857480

      Does Server Core 2008 R2 report CPU ticks differently? Is the resolution now down to microseconds instead of nanoseconds?

      Is there a different way to calculate ticks per second, per microsecond, or per nanosecond?
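      Right now the only way I see is to treat whatever QueryPerformanceFrequency reports as counts per second and derive the rest myself. A minimal sketch of that math (variable names are my own, and I am assuming the documented counts-per-second contract holds on both boxes):

      #include <windows.h>
      #include <stdio.h>

      LARGE_INTEGER freq;
      QueryPerformanceFrequency(&freq);  // counts per second

      // Derive the per-microsecond rate and per-tick resolution from counts/second.
      double ticksPerMicrosecond = freq.QuadPart / 1000000.0;
      double microsecondsPerTick = 1000000.0 / (double)freq.QuadPart;
      printf("%.4f ticks/us, %.4f us/tick\n", ticksPerMicrosecond, microsecondsPerTick);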

       

      Does this mean I have to accept that the latest Intel CPU reports ticks at about 2.857 per microsecond (2,857,480 counts per second), giving me a best possible resolution of roughly a third of a microsecond (1 / 2,857,480 s ≈ 0.35 µs), while the older CPUs were giving me 2993 ticks per microsecond?

      So much for improvement.

      Please Help,

      Thanks

      P.S.

      The code is compiled for the 64-bit platform using Visual Studio 2005.

      #include <windows.h>
      #include <stdio.h>

      char bf[64];
      LARGE_INTEGER ticksInSecond;
      QueryPerformanceFrequency(&ticksInSecond);
      // Print QuadPart; passing the whole LARGE_INTEGER union to %I64d is undefined.
      sprintf_s(bf, sizeof(bf), "Current Frequency %I64d", ticksInSecond.QuadPart);
      printf("%s\n", bf);
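      For reference, here is the kind of timing loop I eventually want this for. Only a sketch (Sleep(100) stands in for the real work being timed); I multiply before dividing so the integer math keeps microsecond precision:

      #include <windows.h>
      #include <stdio.h>

      int main(void)
      {
          LARGE_INTEGER freq, start, stop;
          QueryPerformanceFrequency(&freq);   // counts per second
          QueryPerformanceCounter(&start);
          Sleep(100);                         // stand-in for the code being timed
          QueryPerformanceCounter(&stop);
          // Multiply first so the integer division keeps sub-second precision.
          __int64 us = (stop.QuadPart - start.QuadPart) * 1000000LL / freq.QuadPart;
          printf("Elapsed: %I64d us\n", us);
          return 0;
      }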