Amazon does not offer this functionality through its API. We ran into the same problem and solved it by running a daily cron job that re-uploads the files to Glacier.
Here is a snippet of code you can run with Python and boto to copy a file to a Glacier vault. Note that with the code below, you would have to download the file from S3 to your local machine first (you could use s3cmd, for instance); the code itself handles uploading a local file to Glacier.
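If you would rather stay in Python for that first step too, here is a minimal sketch of the S3 download using boto's S3 API; the bucket name here is an assumption, purely for illustration:

from boto.s3.connection import S3Connection

aws_key = "AKIA1234"
aws_secret = "ABC123"
# Hypothetical bucket name, just for illustration
s3_bucket = "someBucket"
fileName = "localfile.tgz"

# Connect to S3 and fetch the object down to a local file
conn = S3Connection(aws_access_key_id=aws_key, aws_secret_access_key=aws_secret)
bucket = conn.get_bucket(s3_bucket)
key = bucket.get_key(fileName)
key.get_contents_to_filename(fileName)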
import boto.glacier.layer2
# Set up your AWS key and secret, and vault name
aws_key = "AKIA1234"
aws_secret = "ABC123"
glacierVault = "someName"
# Assumption is that this file has been downloaded from S3
fileName = "localfile.tgz"
try:
    # Connect to Glacier through boto's Layer2 interface
    l = boto.glacier.layer2.Layer2(aws_access_key_id=aws_key, aws_secret_access_key=aws_secret)
    # Get your Glacier vault
    v = l.get_vault(glacierVault)
    # Upload the file using a concurrent (multipart) upload, so large files are OK
    archiveID = v.concurrent_create_archive_from_file(fileName)
    # Append this archiveID to a local file, that way you remember what file
    # in Glacier corresponds to a local file. Glacier has no concept of files,
    # only archives identified by ID.
    with open("glacier.txt", "a") as f:
        f.write(fileName + " " + archiveID + "\n")
except Exception as e:
    print "Could not upload gzipped file to Glacier: %s" % e