I'm trying to set up a Hadoop cluster on Google Compute Engine, and I've been following these instructions. Everything seemed to be going fine until I ran:
./compute_cluster_for_hadoop.py setup <project ID> <bucket name>
with the project ID and bucket name I created. The script apparently can't access something and dies with a 403; here is the end of the output, including the error messages:
Uploading ...kages/ca-certificates-java_20121112+nmu2_all.deb: 14.57 KB/14.57 KB
Uploading ...duce/tmp/deb_packages/libnspr4_4.9.2-1_amd64.deb: 316 B/316 B
Uploading ...e/tmp/deb_packages/libnss3-1d_3.14.3-1_amd64.deb: 318 B/318 B
Uploading ...dk-6-jre-headless_6b27-1.12.6-1~deb7u1_amd64.deb: 366 B/366 B
Uploading ...duce/tmp/deb_packages/libnss3_3.14.3-1_amd64.deb: 315 B/315 B
ResumableUploadAbortException: 403 Forbidden
AccessDeniedException: 403 Forbidden
AccessDeniedException: 403 Forbidden
AccessDeniedException: 403 Forbidden
AccessDeniedException: 403 Forbidden
ResumableUploadAbortException: 403 Forbidden
AccessDeniedException: 403 Forbidden
CommandException: 7 files/objects could not be transferred.
########## ERROR ##########
Failed to copy Hadoop and Java packages to Cloud Storage gs://<bucket name>/mapreduce/tmp/
###########################
Traceback (most recent call last):
  File "./compute_cluster_for_hadoop.py", line 230, in <module>
    main()
  File "./compute_cluster_for_hadoop.py", line 226, in main
    ComputeClusterForHadoop().ParseArgumentsAndExecute(sys.argv[1:])
  File "./compute_cluster_for_hadoop.py", line 222, in ParseArgumentsAndExecute
    params.handler(params)
  File "./compute_cluster_for_hadoop.py", line 36, in SetUp
    gce_cluster.GceCluster(flags).EnvironmentSetUp()
  File "/Path/To/solutions-google-compute-engine-cluster-for-hadoop/gce_cluster.py", line 149, in EnvironmentSetUp
    raise EnvironmentSetUpError('Environment set up failed.')
gce_cluster.EnvironmentSetUpError: Environment set up failed.
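In case it helps, this is roughly how I've been sanity-checking that the credentials gsutil uses can write to the bucket at all (the probe file name is just something I made up for the test, and I'm not certain the script goes through the same credentials as a standalone gsutil call):

# which account gcloud/gsutil is currently authenticated as
gcloud auth list
# can I list the bucket?
gsutil ls gs://<bucket name>
# can I write into the same prefix the script uploads to?
echo test > /tmp/probe.txt
gsutil cp /tmp/probe.txt gs://<bucket name>/mapreduce/tmp/probe.txt

If that single-file copy also comes back with 403, I'd assume the problem is with the bucket permissions or the authenticated account rather than the script itself, but I'm not sure how to narrow it down further.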