If you are struggling with the error message in the title of this post, check whether the ports Spark needs are blocked. In my experience, if the ports Spark is using cannot be reached, YARN terminates the application with that error. It is best to pin Spark to known ports and open them so that the YARN application can go through. More on Spark and networking here.
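Before touching the Spark configuration, it can help to verify that a given port is actually reachable from the node that launches the driver. A minimal sketch, assuming a hypothetical host name and the port values used later in this post:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical example: check the port we plan to pin spark.driver.port to
# port_open("spark-master.example.com", 38002)
```

If this returns False for a port Spark is configured to use, a firewall rule or security group is the likely culprit.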
Spark chooses random ports by default, and unless all of them are open you might run into the seemingly endless
INFO Client: Application report for application_1470560331181_0013 (state: ACCEPTED)
which eventually fails
INFO Client: Application report for application_1470560331181_0013 (state: FAILED)
and the error message returned would be
ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
Adding something like this in spark-defaults.conf
spark.blockManager.port    38000
spark.broadcast.port       38001
spark.driver.port          38002
spark.executor.port        38003
spark.fileserver.port      38004
spark.replClassServer.port 38005
could solve this issue.
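If you would rather not change spark-defaults.conf cluster-wide, the same properties can be passed per job with spark-submit's --conf flags. A sketch, assuming the port values above and a hypothetical application jar:

```shell
spark-submit \
  --master yarn \
  --conf spark.blockManager.port=38000 \
  --conf spark.broadcast.port=38001 \
  --conf spark.driver.port=38002 \
  --conf spark.executor.port=38003 \
  --conf spark.fileserver.port=38004 \
  --conf spark.replClassServer.port=38005 \
  my-app.jar
```

Values given with --conf override spark-defaults.conf for that submission only.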
My notes on installing Spark 2.0 are here, and installing Spark 1.6 is described here.