Hi all, I need to measure how much time a function takes for a range of input sizes, e.g. 100, 200, 300, 400 data points. This is my code:
// declarations (start/end timestamps, elapsed seconds, message string)
time_t start, end;
double dif;
CString cstrTime;

// start timer
time(&start);
someFun(some parameters);
// end timer
time(&end);
dif = difftime(end, start);
cstrTime.Format(_T("It took you %.6lf seconds to finish prediction.\n"), dif);
AfxMessageBox(cstrTime);
Now, the problem is that no matter how large the input to 'someFun' is, it always reports 1.000000 seconds. time() only has one-second resolution, so I am wondering whether I should switch to a more precise timer? Any ideas?
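For example, would something along these lines with std::chrono::steady_clock be the right direction? (Just a sketch, assuming a C++11 compiler; someFun and its parameters are placeholders for my actual call.)

#include <chrono>

// sketch: time someFun with a steady, sub-second clock
auto t0 = std::chrono::steady_clock::now();
someFun(some parameters);                  // placeholder for the real call
auto t1 = std::chrono::steady_clock::now();

// elapsed time in seconds, including the fractional part
double seconds = std::chrono::duration<double>(t1 - t0).count();

CString cstrTime;
cstrTime.Format(_T("It took you %.6lf seconds to finish prediction.\n"), seconds);
AfxMessageBox(cstrTime);

Or is there a Windows-specific timer I should prefer in an MFC app?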
cheers