I am looking for a faster way to finish my task. I have 40,000 URLs of downloadable files, and I want to download them to my local desktop. What I currently do is put a link in the browser and download the file through a script, one at a time. What I am looking for instead is a way to submit a batch of, say, 10 URLs and fetch all 10 files simultaneously. If that is possible, I hope the overall time will go down.
Sorry for posting the code late; here it is:
import os
import urllib

def _download_file(url, filename):
    """
    Given a URL and a filename, this method will save a file locally to the
    destination_directory path.
    """
    # destination_directory is defined at module level elsewhere in my script.
    if not os.path.exists(destination_directory):
        print 'Directory [%s] does not exist, Creating directory...' % destination_directory
        os.makedirs(destination_directory)
    try:
        urllib.urlretrieve(url, os.path.join(destination_directory, filename))
        print 'Downloading File [%s]' % filename
    except IOError:
        print 'Error Downloading File [%s]' % filename
def _download_all(main_url):
    """
    Given a URL list, this method will download each file in the destination
    directory.
    """
    url_list = _create_url_list(main_url)
    for url in url_list:
        _download_file(url, _get_file_name(url))
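To illustrate the "10 downloads at a time" idea I described above, here is a minimal sketch using a thread pool from Python 3's concurrent.futures module. The function name download_many and the fetch parameter are my own for illustration; fetch stands in for any single-URL download callable such as a wrapper around the _download_file above.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def download_many(urls, fetch, max_workers=10):
    """Run fetch(url) for every URL using up to max_workers threads.

    fetch is any callable that downloads one URL; failures are
    collected per URL instead of aborting the whole batch.
    """
    results, errors = [], []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Submit every URL to the pool; the pool keeps at most
        # max_workers downloads in flight at once.
        futures = {pool.submit(fetch, url): url for url in urls}
        for future in as_completed(futures):
            url = futures[future]
            try:
                results.append((url, future.result()))
            except Exception as exc:
                errors.append((url, exc))
    return results, errors
```

Because downloading is I/O-bound, threads overlap the network waits even under the GIL, so the batch of 10 should finish in roughly the time of the slowest single download.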
Thanks,