
I've made a small application that averages the numbers between 1 and 1000000. It's not hard to see (using a very basic algebraic formula) that the average is 500000.5, but this was more of a project in learning C++ than anything else.
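(For the record, the formula: the average of the consecutive integers 1 through n is (1 + n)/2, so here it's (1 + 1000000)/2 = 500000.5.)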

Anyway, I made clock variables that were designed to find the number of clock ticks the application takes to run. When I first ran the program, it said that it took 3770000 clock ticks, but every time I've run it since then, it's taken "0.0" seconds...

I've attached my code at the bottom.

Either (a) it's saved the variables from the first time I ran it and is just jumping straight to the answer, or (b) something is wrong with how I'm declaring the time variables.

Regardless... it doesn't make sense.

Any help would be appreciated.

FYI: I'm running this on a Linux machine, not sure if that matters.

#include <cstdio>
#include <ctime>

double avg (int arr[], int beg, int end)
{
    int nums = end - beg + 1;

    double sum = 0.0;

    for(int i = beg; i <= end; i++)
    {
        sum += arr[i];
    }   

    //for(int p = 0; p < nums*10000; p ++){}

    return sum/nums;

}

int main (int argc, char *argv[]) 
{ 
    int nums = 1000000;//atoi(argv[0]);
    int myarray[nums];

    double timediff;

    //printf("Arg is: %d\n",argv[0]);
    printf("Nums is: %d\n",nums);

    clock_t begin_time = clock();

    for(int i = 0; i < nums; i++)
    {
        myarray[i] = i+1;
    }

    double average = avg(myarray, 0, nums - 1);

    printf("%f\n",average); 

    clock_t end_time = clock();

    timediff = (double) difftime(end_time, begin_time);

    printf("Time to Average: %f\n", timediff);

    return 0;

}    

2 Answers


You're also measuring I/O operations (printf), which depend on external factors and can affect the running time. Additionally, clock() may not be as precise as you need to measure such a small task - look at higher-resolution functions, such as clock_get_time(). Even then, other processes can affect the running time by generating page faults, saturating the memory bus, and so on. So this kind of fluctuation is not unusual at all.
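As a rough illustration of higher-resolution timing (the answer mentions clock_get_time(), which is a Mach/macOS call; on Linux the closest POSIX equivalent is clock_gettime(), which is what this sketch assumes):

#include <cstdio>
#include <ctime>

int main()
{
    timespec start, stop;

    // CLOCK_MONOTONIC is a nanosecond-resolution clock that is not
    // affected by adjustments to the system time.
    clock_gettime(CLOCK_MONOTONIC, &start);

    // ... the work to be timed goes here ...

    clock_gettime(CLOCK_MONOTONIC, &stop);

    double elapsed = (stop.tv_sec - start.tv_sec)
                   + (stop.tv_nsec - start.tv_nsec) / 1e9;

    printf("Elapsed: %.9f seconds\n", elapsed);
    return 0;
}

(On older glibc versions you may need to link with -lrt for clock_gettime.)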

Answered 2013-07-15T20:09:36.837

On the machines I've tested, Linux's clock call is only accurate to 1/100 of a second. If your code runs in less than 0.01 seconds, it will usually report that zero seconds elapsed. Also, I ran your program 50 times in a total of 0.13 seconds, so I'm skeptical of your claim that a single run takes 2 seconds on your computer.

Your code uses difftime incorrectly, so even when the clock says time did pass, it would likely display incorrect output.
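For reference, the usual fix is to subtract the two clock_t values and divide by CLOCKS_PER_SEC (difftime() expects time_t seconds from time(), not clock ticks):

// clock() returns processor ticks, so convert the tick difference to seconds:
double timediff = (double)(end_time - begin_time) / CLOCKS_PER_SEC;
printf("Time to Average: %f\n", timediff);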

My guess is that the first timing you got came from different code than what's posted in this question, because I can't think of any way the code in this question could produce a time of 3770000.

Finally, benchmarking is hard, and your code has several benchmarking mistakes:

  • You're timing (1) filling the array, (2) computing the average, (3) formatting the result string, and (4) making an OS call (slow) to print that string in the right language/font/color/etc., which is especially slow.
  • You're trying to time a task that takes less than a hundredth of a second, which is far too short for any accurate measurement.

Here's my take on your code; it measures the average time on this machine at about 0.001968 seconds.
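The code from this answer wasn't preserved in this copy, so below is a minimal sketch in the same spirit: time only the averaging, repeat it enough times to get a measurable duration, and convert ticks with CLOCKS_PER_SEC. The run count and the use of std::vector are my own choices, not the original answer's:

#include <cstdio>
#include <ctime>
#include <vector>

double avg(const int arr[], int beg, int end)
{
    double sum = 0.0;
    for (int i = beg; i <= end; i++)
        sum += arr[i];
    return sum / (end - beg + 1);
}

int main()
{
    const int nums = 1000000;
    const int runs = 50;                // repeat to get a measurable duration
    std::vector<int> myarray(nums);     // heap allocation avoids a ~4 MB stack array

    // Fill the array outside the timed region.
    for (int i = 0; i < nums; i++)
        myarray[i] = i + 1;

    double average = 0.0;

    clock_t begin_time = clock();       // time only the averaging work
    for (int r = 0; r < runs; r++)
        average = avg(myarray.data(), 0, nums - 1);
    clock_t end_time = clock();

    // clock() returns processor ticks; divide by CLOCKS_PER_SEC for seconds.
    double total = (double)(end_time - begin_time) / CLOCKS_PER_SEC;

    printf("Average: %f\n", average);
    printf("Total: %f s, per run: %f s\n", total, total / runs);
    return 0;
}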

Answered 2013-07-15T20:21:07.517