amazon ec2 - How do I upload my uber-jar to a Spark cluster created using the spark-ec2 scripts?


I create an EC2 Spark cluster in one line using SPARK_HOME/ec2/spark-ec2.

I'm using --copy-aws-credentials (Spark 1.2.0), and sc.textFile("s3...") works well.
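For reference, a launch command of the kind described might look like this (a sketch; the key-pair name, identity file, slave count, and cluster name are placeholders, not values from the question):

```shell
# Placeholders: "mykey", mykey.pem, and "my-cluster" are hypothetical names.
./spark-ec2 --key-pair=mykey \
            --identity-file=mykey.pem \
            --copy-aws-credentials \
            --slaves=2 \
            launch my-cluster
```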

My issue may sound silly, but how do I copy my jar to the master?

  1. The AWS CLI doesn't seem to be set up correctly, so I can't use S3 directly.
  2. I could make the jar public and wget it, but that's bad practice.
  3. I tried to connect to the master from outside, but it's blocked.

What is the best practice for submitting an uber jar to a Spark standalone cluster on Amazon EC2 launched via ec2/spark-ec2?
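One approach that fits how spark-ec2 sets clusters up is to scp the jar to the master with the same key pair used at launch, then run spark-submit there (a sketch; my-app.jar, mykey.pem, com.example.Main, and the cluster/master names are placeholder assumptions):

```shell
# Find the master's public DNS name ("my-cluster" is a placeholder).
./spark-ec2 --key-pair=mykey --identity-file=mykey.pem get-master my-cluster

# Copy the uber jar to the master; spark-ec2 clusters use the root account.
scp -i mykey.pem my-app.jar root@<master-public-dns>:/root/

# Log in to the master, then submit against the standalone master URL.
./spark-ec2 --key-pair=mykey --identity-file=mykey.pem login my-cluster
/root/spark/bin/spark-submit --class com.example.Main \
    --master spark://<master-public-dns>:7077 /root/my-app.jar
```

This avoids both the public-jar workaround and the need for a correctly configured AWS CLI on your local machine, since the scp goes over the same SSH channel that spark-ec2 itself uses.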

