Here is my code:
    import time

    def new_badge(badge_classes=None, users=None):
        """
        Utility function for awarding a badge when a new one is added.
        """
        badge_classes = get_badges_classes() if badge_classes is None else badge_classes
        total = users.count()
        for i, user in enumerate(users.iterator()):
            t0 = time.time()
            flight_ids = []
            for flight in user.flight_set.order_by('date').iterator():
                flight_ids.append(flight.id)
                flights_before = Flight.objects.filter(id__in=flight_ids)
                for BadgeClass in badge_classes:
                    badge = BadgeClass(all_flights=flights_before, new_flight=flight)
                    badge.grant_if_eligible()
            print "-- %s" % user.username
            print "-- %.2f s" % (time.time() - t0)
            print "-- %.2f%% done" % (float(i) / total * 100)
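To give a sense of scale: the inner loop builds one filtered queryset and calls `grant_if_eligible()` once per badge class for every single flight, so the query volume grows with flights × badge classes. A rough back-of-envelope calculation using the numbers from this question (the badge-class count is a made-up placeholder, since I haven't stated it here):

    # Rough query-volume estimate for the loop above.
    # n_users and n_flights come from the question; n_badge_classes is
    # a hypothetical placeholder value.
    n_users = 2700
    n_flights = 500000       # flights across all users
    n_badge_classes = 10     # placeholder; actual count not stated

    # Each flight iteration runs grant_if_eligible() once per badge class,
    # so the number of eligibility checks is at least:
    min_badge_checks = n_flights * n_badge_classes
    print(min_badge_checks)  # 5000000

So even before worrying about memory, this loop performs millions of eligibility checks, each of which can issue its own queries (e.g. the `countries.count()` in the traceback below).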
There are 2,700 objects in the `users` queryset, and over 500,000 flights in total. When I run this script, I expect it to take a long time, but the problem is that it uses more and more memory. After a few hours the script stops with the following error:
File "/Users/chris/Documents/flightloggin2/badges/models.py", line 361, in eligible
c = countries.count()
File "/Library/Python/2.7/site-packages/django/db/models/query.py", line 351, in count
return self.query.get_count(using=self.db)
File "/Library/Python/2.7/site-packages/django/db/models/sql/query.py", line 418, in get_count
number = obj.get_aggregation(using=using)[None]
File "/Library/Python/2.7/site-packages/django/contrib/gis/db/models/sql/query.py", line 85, in get_aggregation
return super(GeoQuery, self).get_aggregation(using)
File "/Library/Python/2.7/site-packages/django/db/models/sql/query.py", line 384, in get_aggregation
result = query.get_compiler(using).execute_sql(SINGLE)
File "/Library/Python/2.7/site-packages/django/db/models/sql/compiler.py", line 818, in execute_sql
cursor.execute(sql, params)
File "/Library/Python/2.7/site-packages/django/db/backends/util.py", line 40, in execute
return self.cursor.execute(sql, params)
File "/Library/Python/2.7/site-packages/django/db/backends/postgresql_psycopg2/base.py", line 52, in execute
return self.cursor.execute(query, args)
django.db.utils.DatabaseError: could not create temporary file "base/pgsql_tmp/pgsql_tmp98246.932828": No space left on device
How can I fix this? While the script is running, `top` tells me the python process running it is using 10G of memory, so I assume the problem is something on the Python side rather than in Postgres, but I'm not entirely sure.