So I'm trying to use std::chrono::high_resolution_clock to time how long some code takes to execute. I figured you could just take the difference between the start time and the end time...
To check whether my approach works, I wrote the following program:
#include <iostream>
#include <chrono>
#include <vector>
void long_function();
int main()
{
    std::chrono::high_resolution_clock timer;
    auto start_time = timer.now();
    long_function();
    auto end_time = timer.now();
    auto diff_millis = std::chrono::duration_cast<std::chrono::duration<int, std::milli>>(end_time - start_time);
    std::cout << "It took " << diff_millis.count() << "ms" << std::endl;
    return 0;
}

void long_function()
{
    // Should take a while to execute.
    // This is calculating the first 100 million
    // fib numbers and storing them in a vector.
    // Well, it doesn't actually, because it
    // overflows very quickly, but the point is it
    // should take a few seconds to execute.
    std::vector<unsigned long> numbers;
    numbers.push_back(1);
    numbers.push_back(1);
    for(int i = 2; i < 100000000; i++)
    {
        numbers.push_back(numbers[i-2] + numbers[i-1]);
    }
}
The problem is that it prints exactly 3000ms, when it clearly didn't take exactly that long.
On shorter tasks it just prints 0ms... What am I doing wrong?
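In case the int cast is the culprit, here's a minimal variant of just the timing part, with the difference stored as a floating-point millisecond count so sub-millisecond intervals don't truncate to 0 (the sleep_for call is only a stand-in for the real work, and I haven't verified that this changes anything):

#include <chrono>
#include <iostream>
#include <thread>

int main()
{
    auto start_time = std::chrono::high_resolution_clock::now();

    // Stand-in for the real work so the example is self-contained.
    std::this_thread::sleep_for(std::chrono::milliseconds(50));

    auto end_time = std::chrono::high_resolution_clock::now();

    // A floating-point representation converts implicitly (no
    // duration_cast needed) and keeps fractional milliseconds.
    std::chrono::duration<double, std::milli> diff_millis = end_time - start_time;
    std::cout << "It took " << diff_millis.count() << "ms" << std::endl;
    return 0;
}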
Edit: In case it's of any use, I'm using the GNU GCC compiler with the -std=c++0x flag.
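Since the behaviour might depend on how GCC implements the clock, here's a tiny probe that prints the clock's advertised tick period (just a sketch — this reads the nominal resolution from the type's std::ratio, it doesn't measure anything):

#include <chrono>
#include <iostream>

int main()
{
    // period is a std::ratio<num, den> giving the tick length in seconds.
    using period = std::chrono::high_resolution_clock::period;
    std::cout << "Tick period: " << period::num << "/" << period::den
              << " seconds" << std::endl;
    return 0;
}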