I wrote the same program in Node.js and C++ to compare their performance on Mac OS X.
First, in C++:
#include <iostream>
#include <time.h>
using namespace std;

int main() {
    clock_t t1, t2;
    cout << "Initializing\n";
    t1 = clock();
    double m = 0;
    for (double i = 0; i != 10000000000; ++i) {
        m = i * -1 + i;
    }
    t2 = clock();
    // clock() returns CPU time in ticks; CLOCKS_PER_SEC (1000000 on Mac OS X)
    // converts ticks to seconds, then * 1000 gives milliseconds
    float diff = ((float) (t2 - t1) / CLOCKS_PER_SEC) * 1000;
    cout << "Finalizing with " << diff << "ms\n";
}
And second, in Node.js:
console.log("Initializing");
var t1 = Date.now();
var m = 0;
for (var i = 0; i != 10000000000; i++) {
    m = i * -1 + i;
}
var t2 = Date.now();
var diff = t2 - t1;
console.log("Finalizing with %dms", diff);
The result was 50000 ms for C++ and 22000 ms for Node.js.
Why is Node.js faster for this kind of operation?
Thanks.
Update:
Switching the loop counter from double to long int gives me 22000 ms, the same as Node.js.
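The change described in the update can be sketched as follows. The function name `time_loop` and the parameterized iteration count are my own additions for illustration; the count is reduced to 1e8 here (from the 1e10 in the question) so it finishes quickly, so the absolute timings will differ from those quoted above:

    #include <iostream>
    #include <time.h>
    using namespace std;

    // Times the same arithmetic loop, but with an integer counter instead
    // of a double; returns elapsed CPU time in milliseconds.
    double time_loop(long int n) {
        clock_t t1 = clock();
        // volatile keeps the optimizer from deleting the otherwise-unused loop
        volatile double m = 0;
        for (long int i = 0; i != n; ++i) {
            m = i * -1 + i;
        }
        clock_t t2 = clock();
        return (double) (t2 - t1) / CLOCKS_PER_SEC * 1000;
    }

    int main() {
        cout << "Initializing\n";
        double diff = time_loop(100000000L);  // 1e8 iterations, reduced from 1e10
        cout << "Finalizing with " << diff << "ms\n";
        return 0;
    }

With an integer counter, the increment and the `i != n` comparison are plain integer operations rather than floating-point ones, which is consistent with the timing dropping to match Node.js.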