
I'm trying to write a Python script that walks a directory tree, finds all duplicate files, and reports the duplicates. What's the best way to approach this?

import os, sys

def crawlDirectories(directoryToCrawl):
    crawledDirectory = [os.path.join(path, subname) for path, dirnames, filenames in os.walk(directoryToCrawl) for subname in dirnames + filenames]
    return crawledDirectory

#print 'Files crawled',crawlDirectories(sys.argv[1])

directoriesWithSize = {}
def getByteSize(crawledDirectory):
    for eachFile in crawledDirectory:
        size = os.path.getsize(eachFile)
        directoriesWithSize[eachFile] = size
    return directoriesWithSize

getByteSize(crawlDirectories(sys.argv[1]))

#print directoriesWithSize.values()

duplicateItems = {}

def compareSizes(dictionaryDirectoryWithSizes):
    for key,value in dictionaryDirectoryWithSizes.items():
        if directoriesWithSize.values().count(value) > 1:
            duplicateItems[key] = value

compareSizes(directoriesWithSize)

#print directoriesWithSize.values().count(27085)

compareSizes(directoriesWithSize)

print duplicateItems

Why does this throw the following error?

Traceback (most recent call last):
  File "main.py", line 16, in <module>
    getByteSize(crawlDirectories(sys.argv[1]))
  File "main.py", line 12, in getByteSize
    size = os.path.getsize(eachFile)
  File     "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/genericpath.py", line 49, in getsize
OSError: [Errno 2] No such file or directory:        '../Library/Containers/com.apple.ImageKit.RecentPictureService/Data/Documents/iChats'

2 Answers


It looks to me like your crawlDirectories function is too complicated; it also mixes directory names in with the file names. I'd keep it to plain files:

def crawlDirectories(directoryToCrawl):
    output = []
    for path, dirnames, filenames in os.walk(directoryToCrawl):
        for fname in filenames:
            output.append(os.path.join(path, fname))
    return output
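
As a minimal usage sketch (assuming the error comes from a broken symlink like the iChats entry in the traceback, which os.walk reports among the filenames even though it cannot be stat'ed), an os.path.isfile guard skips anything getsize would choke on:

import os, sys

sizes = {}
for f in crawlDirectories(sys.argv[1]):
    if os.path.isfile(f):  # False for broken symlinks, so getsize is safe here
        sizes[f] = os.path.getsize(f)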
Answered 2012-09-24T13:03:03.987

I'd suggest trying this instead:

def crawlDirectories(directoryToCrawl):
    crawledDirectory = [os.path.realpath(os.path.join(p, name))
                        for (p, d, filenames) in os.walk(directoryToCrawl)
                        for name in filenames]
    return crawledDirectory

That is, use canonical (absolute) paths in the crawl rather than relative ones.
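
For illustration, a minimal sketch of the difference (the sample path here is hypothetical, not taken from the question):

import os

rel = '../Documents/report.txt'    # relative: only meaningful from the current cwd
canonical = os.path.realpath(rel)  # absolute, with '..' and symlinks resolved
print canonical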

Answered 2012-09-24T13:06:40.003