
Our storage filer is having SMB connection problems, so for now we are forced to access the files periodically over FTP. Rather than do this in Bash, I'm trying to use Python, but I've run into some problems. The script needs to search the FTP directories recursively, find every file matching "*1700_m30.mp4" that is newer than 24 hours, and then copy all of those files to the local machine.

This is what I've got so far, but I can't seem to get the script to download the files, or to get their stats so I can tell whether they are newer than 24 hours.

#!/usr/bin/env python
# encoding: utf-8

import sys
import os
import ftplib
import ftputil
import fnmatch
import time

dir_dest = '/Volumes/VoigtKampff/Temp/TEST1/' # Directory where the files needs to be downloaded to
pattern = '*1700_m30.mp4' #filename pattern for what the script is looking for 
print 'Looking for this pattern :', pattern # print pattern


print "logging into GSP" # print 
host = ftputil.FTPHost('xxx.xxx','xxx','xxxxx') # ftp host info
recursive = host.walk("/GSPstor/xxxxx/xxx/xxx/xxx/xxxx",topdown=True,onerror=None) # recursive search 
for root,dirs,files in recursive:
    for name in files:
        print 'Files   :', files # print all files it finds
        video_list = fnmatch.filter(files, pattern)
        print 'Files to be moved :', video_list # print list of files to be moved 
        if host.path.isfile(video_list): # check whether the file is valid 
            host.download(video_list, video_list, 'b') # download file list 



host.close  

Here is the modified script based on ottomeister's excellent suggestions (thank you!!). The last remaining issue is that it does download, but it keeps re-downloading the files and overwriting the existing copies:

import sys
import os
import ftplib
import ftputil
import fnmatch
import time
from time import mktime
import datetime
import os.path, time 
from ftplib import FTP


dir_dest = '/Volumes/VoigtKampff/Temp/TEST1/' # Directory where the files needs to be downloaded to
pattern = '*1700_m30.mp4' #filename pattern for what the script is looking for 
print 'Looking for this pattern :', pattern # print pattern
utc_datetime_less24H = datetime.datetime.utcnow()-datetime.timedelta(seconds=86400) #UTC time minus 24 hours in seconds
print 'UTC time less than 24 Hours is: ', utc_datetime_less24H.strftime("%Y-%m-%d %H:%M:%S") # print UTC time minus 24 hours in seconds
print "logging into GSP FTP" # print 


with ftputil.FTPHost('xxxxxxxx','xxxxxx','xxxxxx') as host: # ftp host info
    recursive = host.walk("/GSPstor/xxxx/com/xxxx/xxxx/xxxxxx",topdown=True,onerror=None) # recursive search 
    for root,dirs,files in recursive:
        for name in files:
            print 'Files   :', files # print all files it finds
            video_list = fnmatch.filter(files, pattern) # collect all files that match pattern into variable:video_list
            statinfo = host.stat(root, video_list) # get the stats from files in variable:video_list
            file_mtime = datetime.datetime.utcfromtimestamp(statinfo.st_mtime) 
            print 'Files with pattern: %s and epoch mtime is: %s ' % (video_list, statinfo.st_mtime)
            print 'Last Modified: %s' % datetime.datetime.utcfromtimestamp(statinfo.st_mtime) 
            if file_mtime >= utc_datetime_less24H: 
                for fname in video_list:
                    fpath = host.path.join(root, fname)
                    if host.path.isfile(fpath):
                        host.download_if_newer(fpath, os.path.join(dir_dest, fname), 'b') 

host.close()
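
One likely cause of the repeated downloads is that host.download_if_newer() compares the remote file's modification time against the local copy's, so a clock offset between the FTP server and the local machine can make every remote file look newer on each run. A sketch of two possible workarounds (host.synchronize_times() needs a writable directory on the server; the os.path.exists() check is a simpler assumption that anything already downloaded can just be skipped):

# Option 1: let ftputil measure the client/server clock offset so that
# download_if_newer() compares timestamps correctly (requires write access
# to the server's current directory).
host.synchronize_times()

# Option 2: inside the download loop, skip files that already exist locally.
local_path = os.path.join(dir_dest, fname)
if not os.path.exists(local_path):
    host.download(fpath, local_path, 'b')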

1 Answer


This line:

    video_list = fnmatch.filter(files, pattern)

gives you a list of the file names that match your glob pattern. But this line:

    if host.path.isfile(video_list): # check whether the file is valid 

is bogus, because host.path.isfile() does not want a list of file names as its argument. It wants a single pathname. So you need to iterate through video_list, constructing one pathname at a time, pass each of those pathnames to host.path.isfile(), and then possibly download that particular file. Something like this:

    import os.path

    for fname in video_list:
        fpath = host.path.join(root, fname)
        if host.path.isfile(fpath):
            host.download(fpath, os.path.join(dir_dest, fname), 'b')

Note that I'm using host.path.join() to manage the remote pathnames and os.path.join() to manage the local pathnames. Also note that this puts all of the downloaded files into a single directory. If you want to put them into a directory hierarchy that mirrors the remote layout (you'll have to do something like that if filenames in different remote directories can clash), then you'll need to construct a different destination path, and you'll probably also have to create the local destination directory hierarchy.
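
If you do want to mirror the remote layout, a minimal sketch might look like this (reusing host, pattern and dir_dest from the script above; remote_root is assumed to be the same path that was passed to host.walk(), and posixpath is used for the remote side because FTP pathnames are POSIX-style):

    import posixpath

    remote_root = "/GSPstor/xxxx/com/xxxx/xxxx/xxxxxx"  # assumed: same path given to host.walk()
    for root, dirs, files in host.walk(remote_root, topdown=True, onerror=None):
        for fname in fnmatch.filter(files, pattern):
            rel_dir = posixpath.relpath(root, remote_root)   # remote sub-directory relative to the walk root
            local_dir = os.path.normpath(os.path.join(dir_dest, rel_dir))
            if not os.path.isdir(local_dir):
                os.makedirs(local_dir)                       # create the mirrored local hierarchy as needed
            fpath = host.path.join(root, fname)
            if host.path.isfile(fpath):
                host.download(fpath, os.path.join(local_dir, fname), 'b')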

To get the timestamp information, use host.lstat() or host.stat(), depending on how you want symbolic links to be handled.
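
For example, a minimal sketch of the 24-hour check using host.stat() on each matching file (this reuses the cutoff idea and the variables already present in the revised script above, and assumes the server reports modification times in UTC):

    cutoff = datetime.datetime.utcnow() - datetime.timedelta(hours=24)

    for fname in video_list:
        fpath = host.path.join(root, fname)
        statinfo = host.stat(fpath)    # stat one remote path at a time, not the whole list
        file_mtime = datetime.datetime.utcfromtimestamp(statinfo.st_mtime)
        if file_mtime >= cutoff:       # modified within the last 24 hours
            host.download(fpath, os.path.join(dir_dest, fname), 'b')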

And yes, that should be host.close(). Without it the connection will be closed after the host variable goes out of scope and is garbage-collected, but it's better to close it explicitly. Even better, use a with clause to make sure the connection gets closed even if an exception causes this code to be abandoned before it ever reaches the host.close() call, like this:

    with ftputil.FTPHost('xxx.xxx','xxx','xxxxx') as host: # ftp host info
        recursive = host.walk(...)
        ...
Answered 2012-06-26T17:34:54.000