I have a list of 300 values, each with an associated average.
I have a for-loop that generates a list of ten of these values at random and writes it to Excel if certain conditions, based on their averages, are met.
The code runs fine if I loop 10 million times or fewer, but that is orders of magnitude too small. Even just doubling the loop counter to 20 million makes my computer unusable while it runs.
I want to iterate the loop 100 million or even 1 billion times. I want it to run slowly in the background; I don't care if it takes 24 hours to get the results, I just want to be able to use my computer while it works. Currently, once the loop goes past 10 million iterations, my laptop's memory and disk usage climb to 99%.
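For the background-usability part, one thing I could try (a sketch only, assuming a Windows host and the third-party psutil package, neither of which is shown above) is lowering the process priority so the loop yields the CPU to interactive programs; it wouldn't fix the memory growth, though:

import psutil

me = psutil.Process()                        # handle to the current Python process
me.nice( psutil.BELOW_NORMAL_PRIORITY_CLASS )  # Windows-only priority class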
Using PyScripter and Python 3.3.
Computer specs: Intel Core i7-4700HQ (2.40 GHz), 8 GB memory, 1 TB HDD, NVIDIA GeForce GTX 850M 2 GB GDDR3.
Code snippet:
import time
import datetime

for i in range( 0, cycles ):
    genRandLineups( Red )   # random team gens
    genRandLineups( Blue )
    genRandLineups( Purple )
    genRandLineups( Green )
    if ( sum( teamAve[i] ) <= 600
         and ( sum( teamValues[i] ) > currentHighScore
               or sum( teamValues[i] ) > 1024 ) ):
        teamValuesF.append( teamValues[i] )
        sheetw.write( q, 0, str( teamValues[i] ) )
        ts = time.time()
        workbookw.save( "Data_Log.xls" )
        st = datetime.datetime.fromtimestamp( ts ).strftime( '%Y-%m-%d %H:%M:%S' )
        sheetw.write( q, 3, st )
        q = q + 1
    if sum( teamValues[i] ) > currentHighScore:
        currentHighScore = sum( teamValues[i] )
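For reference, here is a sketch of how the loop could be restructured to bound memory and disk work, assuming the workbook comes from xlwt (whose Workbook.save() rewrites the whole .xls file on every call) and reusing the snippet's own names. The key changes are dropping the ever-growing teamValuesF list and saving only periodically instead of on every hit:

import time
import datetime

SAVE_EVERY = 10000   # rewrite the file every 10,000 hits instead of on each one

for i in range( cycles ):
    genRandLineups( Red )   # random team gens
    genRandLineups( Blue )
    genRandLineups( Purple )
    genRandLineups( Green )
    score = sum( teamValues[i] )   # compute the sum once and reuse it
    if sum( teamAve[i] ) <= 600 and ( score > currentHighScore or score > 1024 ):
        # write the row but keep no copy in Python: an ever-growing list
        # is what exhausts RAM over 100M+ iterations
        sheetw.write( q, 0, str( teamValues[i] ) )
        st = datetime.datetime.fromtimestamp( time.time() ).strftime( '%Y-%m-%d %H:%M:%S' )
        sheetw.write( q, 3, st )
        q = q + 1
        if q % SAVE_EVERY == 0:
            workbookw.save( "Data_Log.xls" )
    if score > currentHighScore:
        currentHighScore = score

workbookw.save( "Data_Log.xls" )   # one final save after the loop

Note also that the legacy .xls format caps a worksheet at 65,536 rows, so with enough hits the writes would eventually fail no matter how the loop is paced.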