Hi, I have been successfully setting up a Spark cluster on AWS EC2 for the past 2 months, but recently the creation script started failing with the following error. It fails while setting up the Scala packages because it cannot reach the source S3 endpoint:

--2017-02-28 17:51:30--  (try: 6)  http://s3.amazonaws.com/spark-related-packages/scala-2.10.3.tgz
Connecting to s3.amazonaws.com (s3.amazonaws.com)|52.216.0.83|:80... failed: Connection timed out.
Retrying.

This is my source spark-ec2 version on GitHub:

https://github.com/amplab/spark-ec2/archive/branch-2.0.zip

The Scala download above is attempted by the init.sh in:

spark-ec2/scala/init.sh

Can someone fix that S3 endpoint in the GitHub repository, or is it no longer supported by the open-source Spark community?
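As an aside, if the bucket stays unreachable, one workaround is to point init.sh at a different host serving the same tarball. This is only a sketch: `example-mirror.example.com` is a placeholder hostname, not a real mirror, and the `sed` expression assumes init.sh contains the literal URL shown in the error above.

```shell
# Placeholder mirror host -- replace with a mirror you actually trust.
MIRROR="example-mirror.example.com"

# The URL that init.sh is failing to fetch (from the wget error above).
URL="http://s3.amazonaws.com/spark-related-packages/scala-2.10.3.tgz"

# Rewrite the S3 hostname to the mirror; to patch the script itself you
# would run the same sed in-place against spark-ec2/scala/init.sh.
echo "$URL" | sed "s|s3\.amazonaws\.com|$MIRROR|"
# prints http://example-mirror.example.com/spark-related-packages/scala-2.10.3.tgz
```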

1 Answer

Amazon AWS is experiencing a major outage today, mainly affecting the S3 service. Check https://status.aws.amazon.com/ for updates.

answered 2017-02-28T20:15:34.583