
So I was trying to use std::chrono::high_resolution_clock to time how long something takes to execute. I figured you could just find the difference between the start time and the end time...

To check that my method works, I wrote the following program:

#include <iostream>
#include <chrono>
#include <vector>

void long_function();

int main()
{
    std::chrono::high_resolution_clock timer;
    auto start_time = timer.now();

    long_function();

    auto end_time = timer.now();
    auto diff_millis = std::chrono::duration_cast<std::chrono::duration<int, std::milli>>(end_time - start_time);

    std::cout << "It took " << diff_millis.count() << "ms" << std::endl;
    return 0;
}

void long_function()
{
    //Should take a while to execute.
    //This is calculating the first 100 million
    //fib numbers and storing them in a vector.
    //Well, it doesn't actually, because it
    //overflows very quickly, but the point is it
    //should take a few seconds to execute.
    std::vector<unsigned long> numbers;
    numbers.push_back(1);
    numbers.push_back(1);
    for(int i = 2; i < 100000000; i++)
    {
        numbers.push_back(numbers[i-2] + numbers[i-1]);
    }
}

The problem is that it outputs exactly 3000 milliseconds, when it clearly wasn't that.

On shorter problems it just outputs 0 ms... What am I doing wrong?

EDIT: In case it's of any use, I'm using the GNU GCC compiler with the -std=c++0x flag


2 Answers


The resolution of high_resolution_clock is platform dependent.

Printing the following will give you an idea of the resolution of the implementation you are using:

    std::cout << "It took " << std::chrono::nanoseconds(end_time - start_time).count() << std::endl;
Answered 2012-12-11T16:48:46.300

I ran into a similar problem with g++ (rev5, Built by MinGW-W64 project) 4.8.1 under Windows 7.

int main()
{
    auto start_time = std::chrono::high_resolution_clock::now();
    int temp(1);
    const int n(1e7);
    for (int i = 0; i < n; i++)
        temp += temp;
    auto end_time = std::chrono::high_resolution_clock::now();
    std::cout << std::chrono::duration_cast<std::chrono::nanoseconds>(end_time - start_time).count() << " ns.";
    return 0;
}

It displays 19999800 ns for n=1e7, but 0 ns for n=1e6.

The precision seems to be quite coarse.

Answered 2013-09-17T08:39:12.740