amazon ec2 - How do I upload my uber-jar to a spark cluster created using the spark-ec2 scripts? -
I create an EC2 Spark cluster in one line of code using spark_home/ec2/spark-ec2, with the --copy-aws-credentials flag (Spark 1.2.0). Reading data with sc.textFile("s3...") works well.

My issue is a silly one: how do I copy my uber-jar to the master?
- The AWS CLI doesn't seem to be set up correctly on the cluster, so I can't use S3 directly.
- I could make the jar public and wget it, but that's bad practice.
- I tried connecting to the master from outside, but the port is blocked.
What is the best practice for submitting an uber-jar to a Spark standalone cluster on Amazon EC2 launched via ec2/spark-ec2?
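For context, one common workaround is to scp the jar to the master using the same keypair that spark-ec2 was launched with (the spark-ec2 AMIs allow SSH as root with that key), then run spark-submit on the master. The sketch below prints the commands rather than running them, since every name in it (key path, master DNS, jar path, main class) is a placeholder you would substitute for your own; the master DNS can be obtained with `spark-ec2 get-master <cluster-name>`.

```shell
#!/usr/bin/env bash
# Dry-run sketch: print the commands to copy an uber-jar to a spark-ec2
# master and submit it there. All values below are placeholders.
KEY="$HOME/.ssh/my-spark-key.pem"                        # keypair passed to spark-ec2 -i/-k
MASTER="root@ec2-xx-xx-xx-xx.compute-1.amazonaws.com"    # from: spark-ec2 get-master <cluster-name>
JAR="target/my-app-assembly-1.0.jar"                     # the locally built uber-jar

# 1. Copy the jar to the master over SSH (no S3 or public URL needed):
echo scp -i "$KEY" "$JAR" "$MASTER:~/"

# 2. SSH in and submit against the standalone cluster manager (port 7077):
echo ssh -i "$KEY" "$MASTER" \
  "~/spark/bin/spark-submit --master spark://<master-hostname>:7077 \
     --class com.example.Main ~/$(basename "$JAR")"
```

This avoids making the jar public and does not require the AWS CLI on the cluster; it only relies on the SSH access that spark-ec2 itself uses.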