I am using a Klepto archive to index file specs for a folder tree. After scanning the tree, I want to quickly remove references to files that have since been deleted. But deleting items one at a time from the file archive is very slow. Is there a way to sync deletions to the archive, or to remove multiple keys at once? (The "sync" method appears to only add new items.)
@Mike Mckerns' helpful answer to this question only covers deleting a single item: Python Saving and Editing with Klepto
Using files.sync() or files.dump() seems to only append data from the cache, not sync the deletions. Is there a way to delete keys from the cache and then sync those changes in one pass? Individual deletions are too slow.
Here is a working example:
from klepto.archives import *
import os

class PathIndex:
    def __init__(self, folder):
        self.folder_path = folder
        self.files = file_archive(self.folder_path + '/.filespecs', cache=False)
        self.files.load()  # load memory cache

    def list_directory(self):
        self.filelist = []
        for folder, subdirs, filelist in os.walk(self.folder_path):  # go through every subfolder in a folder
            for filename in filelist:  # now through every file in the folder/subfolder
                self.filelist.append(os.path.join(folder, filename))

    def scan(self):
        self.list_directory()
        for path in self.filelist:
            self.update_record(path)
        self.files.dump()  # save to file archive

    def rescan(self):
        self.list_directory()  # rescan the original disk
        deletedfiles = []
        # code to check for modified files etc.
        # check for deleted files
        for path in self.files:
            try:
                self.filelist.remove(path)  # removing files found on disk leaves the list of new files
            except ValueError:
                deletedfiles.append(path)
        # code to add new files, i.e. the files left in self.filelist
        for path in deletedfiles:
            self.delete_record(path)
        # looking to sync the modified index to disk here, in one step

    def update_record(self, path):
        self.files[path] = {'size': os.path.getsize(path), 'modified': os.path.getmtime(path)}
        # add other specs - hash of contents etc.

    def delete_record(self, path):
        del self.files[path]  # delete from the memory cache
        # this next line slows it all down
        del self.files.archive[path]  # delete from the disk cache

# usage
_index = PathIndex('/path/to/root')
_index.scan()
# delete, modify some files
_index.rescan()
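Absent a confirmed klepto API for bulk removal, the general pattern being asked for is "delete the keys from the in-memory dict, then rewrite the archive to disk once" instead of one disk operation per key. A minimal sketch of that idea, simulating the file archive with a plain pickle file (the file name and the `batch_delete` helper are hypothetical, not part of klepto):

```python
import os
import pickle
import tempfile

def batch_delete(archive_path, keys_to_delete):
    """Load the archive once, drop the keys in memory, rewrite once."""
    with open(archive_path, 'rb') as f:
        data = pickle.load(f)
    for key in keys_to_delete:
        data.pop(key, None)          # in-memory deletes are cheap
    with open(archive_path, 'wb') as f:
        pickle.dump(data, f)         # a single disk write covers all deletions

# usage: build a toy archive, then remove two keys in one pass
tmp = os.path.join(tempfile.mkdtemp(), 'filespecs.pkl')
with open(tmp, 'wb') as f:
    pickle.dump({'a.txt': 1, 'b.txt': 2, 'c.txt': 3}, f)
batch_delete(tmp, ['a.txt', 'c.txt'])
with open(tmp, 'rb') as f:
    print(pickle.load(f))  # {'b.txt': 2}
```

The point is that the cost becomes one load plus one dump regardless of how many keys are removed, which is the behavior the per-key `del self.files.archive[path]` loop lacks.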