I need to estimate roughly how long an algorithm will take without actually running the code.
I can't actually run the full algorithm, because it would take days or weeks to complete depending on the hardware. The algorithm is logarithmic in nature. Below is an approximation of the algorithm; of course, it contains none of the actual logic.
We start with a large number: 2 raised to the power [n].
using System.Numerics; // BigInteger lives here.

int baseTwo = 2;
BigInteger number = 0;
double exponent = 5000000; // 5,000,000.

while (exponent > 0)
{
    number = BigInteger.Pow(baseTwo, (int)exponent); // [baseTwo]=2 raised to the power [exponent].
    number = this.ProcessNumber(number);             // The returned number is slightly smaller than what went in.
    exponent = BigInteger.Log(number, baseTwo);      // The base-2 log gives the slightly decreased exponent (e.g. if exponent was 38762, the result might be 38761.4234).
}

private BigInteger ProcessNumber(BigInteger number)
{
    // Pick a factor anywhere between 51% and 99% (Random.Shared requires .NET 6+).
    int percent = Random.Shared.Next(51, 100);

    // [result] is always less than [number] but more than half of [number].
    // Integer arithmetic is used because a BigInteger cannot be multiplied by a double directly.
    BigInteger result = (number * percent) / 100;
    return result;
}
Since the exponent iterates down toward zero, the time each iteration takes naturally decreases from one iteration to the next.
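A cheap first step is to estimate how many iterations the loop performs at all. Since ProcessNumber only scales the number by a factor between 0.51 and 0.99, each pass drops the exponent by roughly -log2(factor), so the exponent trajectory can be simulated with plain doubles and no BigInteger work. Below is a minimal sketch of that idea; it assumes the loop shown above is representative of the real algorithm's control flow, and the class name is purely illustrative.

using System;

class IterationCountEstimate
{
    static void Main()
    {
        var rng = new Random();
        double exponent = 5000000;
        long iterations = 0;

        while (exponent > 0)
        {
            // Same 51%..99% factor that ProcessNumber uses.
            double factor = rng.Next(51, 100) / 100D;

            // Pow truncates the exponent to an int, so the next exponent is
            // log2(2^floor(exponent) * factor) = floor(exponent) + log2(factor).
            exponent = Math.Floor(exponent) + Math.Log(factor, 2);
            iterations++;
        }

        Console.WriteLine($"Estimated iterations: {iterations:N0}");
    }
}

Because the simulation never touches BigInteger, it reproduces only the loop structure, not the expensive work, and finishes in a matter of seconds at most.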
- Given the execution times of the first and last iterations on my machine, is there a way to calculate the total time?
- If not, could we sample [exponent] at a set of spaced-out values, e.g. 5,000,000, 4,500,000, 4,000,000, and so on, and calculate from there? (A rough sketch of this sampling idea follows the list.)
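To make the sampling idea concrete, here is a rough sketch, not a definitive implementation: time one pass of the loop body at a few exponent values with Stopwatch, then combine those per-iteration costs with the simulated exponent trajectory above. The helper name TimeOneIteration, the chosen sample points, and the fixed 75% stand-in for ProcessNumber are assumptions for illustration; the real algorithm's body would be substituted for the stand-in when measuring.

using System;
using System.Diagnostics;
using System.Numerics;

class TotalTimeEstimate
{
    // Times a single pass of the loop body at a given exponent.
    static double TimeOneIteration(int exponent)
    {
        var sw = Stopwatch.StartNew();
        BigInteger number = BigInteger.Pow(2, exponent);
        number = (number * 75) / 100;    // Stand-in for ProcessNumber (fixed 75% factor).
        _ = BigInteger.Log(number, 2);   // Include the log step in the measurement.
        sw.Stop();
        return sw.Elapsed.TotalSeconds;
    }

    static void Main()
    {
        int[] samples = { 5000000, 4500000, 4000000, 2000000, 1000000 };
        foreach (int e in samples)
        {
            Console.WriteLine($"exponent {e:N0}: {TimeOneIteration(e):F4} s per iteration");
        }

        // From here, interpolate cost(exponent) between the sampled points and sum
        // that cost along the simulated exponent trajectory from the earlier sketch
        // to approximate the total running time.
    }
}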