All, I need to measure how long a function takes for a range of input sizes, e.g. 100, 200, 300, 400. This is my code:

time_t start, end;
double dif;

// start timer
time(&start);

someFun(/* some parameters */);

// end timer
time(&end);
dif = difftime(end, start);

cstrTime.Format(_T("It took you %.6lf seconds to finish prediction.\n"), dif);
AfxMessageBox(cstrTime);

Now, the problem is that no matter how large the input to 'someFun' is, it always reports 1.000000 seconds. So I am wondering whether I should use a more precise timer? Any ideas, guys?

cheers

1 Answer

As you can see from your result, difftime has a resolution of whole seconds.

You should implement a Win32 high-resolution timer for this. There is a sample here.
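
A minimal sketch of such a timer using the Win32 QueryPerformanceCounter API, reusing the MFC variables and the someFun placeholder from your question (this is an assumed drop-in for your snippet, not the linked sample):

#include <windows.h>

LARGE_INTEGER freq, t0, t1;
QueryPerformanceFrequency(&freq);   // ticks per second of the high-resolution counter

QueryPerformanceCounter(&t0);       // start timer
someFun(/* some parameters */);
QueryPerformanceCounter(&t1);       // end timer

// elapsed ticks divided by the tick rate gives seconds as a double,
// typically with sub-microsecond resolution
double dif = static_cast<double>(t1.QuadPart - t0.QuadPart) / freq.QuadPart;

cstrTime.Format(_T("It took you %.6lf seconds to finish prediction.\n"), dif);
AfxMessageBox(cstrTime);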

answered 2013-05-12T20:48:13.393