When I run the following program:
#include <stdio.h>
#include <math.h>

int main()
{
    double sum, increase;
    long amount, j;

    printf("sum = ");
    scanf("%lf", &sum);
    printf("increase = ");
    scanf("%lf", &increase);
    printf("amount = ");
    scanf("%ld", &amount);

    /* add 'increase' to 'sum', 'amount' times */
    for (j = 1; j <= amount; j++)
    {
        sum += increase;
    }

    printf("%lf\n", sum);
    return 0;
}
I get the following output for three runs with these inputs:
MacBook:c benjamin$ ./test
sum = 234.4
increase = 0.000001
amount = 198038851
432.438851
MacBook:c benjamin$ ./test
sum = 234.4
increase = 0.000001
amount = 198038852
432.438851
MacBook:c benjamin$ ./test
sum = 234.4
increase = 0.000001
amount = 198038853
432.438852
Here I increased the variable 'amount' by 1 from one run to the next:
- In the first run, the summation gives what I expect: 234.4 + 198038851 × 0.000001 = 432.438851.
- In the second, it surprisingly prints the same value.
- In the third, the sum increases again.
Why does this happen?
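For comparison, here is a minimal sketch (with the same three inputs hard-coded rather than read with scanf) that computes the total directly instead of by repeated addition:

#include <stdio.h>

int main(void)
{
    /* the same inputs as above, hard-coded for this comparison */
    double sum = 234.4, increase = 0.000001;
    long amounts[] = {198038851, 198038852, 198038853};
    int i;

    for (i = 0; i < 3; i++)
    {
        /* direct computation: sum + amount * increase */
        printf("%lf\n", sum + (double)amounts[i] * increase);
    }
    return 0;
}

This prints 432.438851, 432.438852 and 432.438853, increasing each time as I expect, so only the repeated-addition version gets stuck.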
Although the code doesn't look very useful on its own, I have extracted only the part in question; I actually want to use it in a larger program.
Thanks!