Submitting Spark Jobs to YARN and Viewing Job Logs

For a more complete look at Spark internals and real-world applications, see my book:

图解Spark 大数据快速分析实战 (Illustrated Spark: Big Data Analysis in Practice, by Wang Lei)

  1. Submit the job
spark-submit --master yarn --deploy-mode cluster --class org.apache.spark.examples.SparkPi /usr/local/service/spark/examples/jars/spark-examples_2.12-3.0.2.jar 100
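
In practice you will usually also size the job explicitly. A minimal sketch with commonly used resource flags (the memory, core, and executor values below are illustrative, not from the original run):

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 2g \
  --executor-memory 4g \
  --executor-cores 2 \
  --num-executors 4 \
  --class org.apache.spark.examples.SparkPi \
  /usr/local/service/spark/examples/jars/spark-examples_2.12-3.0.2.jar 100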

  2. List all applications
yarn application -list -appStates ALL
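
On a busy cluster the full list is long; the -appTypes and -appStates options accept filters, so a narrower query is often more useful (a sketch; SPARK is the application type YARN records for spark-submit jobs):

yarn application -list -appTypes SPARK -appStates RUNNING,FINISHED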

  3. Find the target application in the listing and note its Application-Id.
Total number of applications (application-types: [], states: [NEW, NEW_SAVING, SUBMITTED, ACCEPTED, RUNNING, FINISHED, FAILED, KILLED] and tags: []):150
                Application-Id	    Application-Name	    Application-Type	      User	     Queue	             State	       Final-State	       Progress	                       Tracking-URL
application_1653303170860_0049	LOAD DATA LOCAL INPATH '/data/use...customer (Stage-1)	           MAPREDUCE	    hadoop	root.default	          FINISHED	            FAILED	           100%	http://xxxx:5024/jobhistory/job/job_1653303170860_0049
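
With 150 applications in the list, piping the output through grep is a quick way to locate the job you just submitted (a sketch; SparkPi is the example class name from step 1):

yarn application -list -appStates ALL | grep SparkPi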

  4. Run the following command to view the logs for the given applicationId.

yarn logs -applicationId application_1653303170860_0049
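
Note that yarn logs only returns output once log aggregation (yarn.log-aggregation-enable) has collected the container logs, and the result can be very large. Two common variations (a sketch; the -log_files option is available on recent Hadoop releases):

yarn logs -applicationId application_1653303170860_0049 > app_0049.log    # save for offline searching
yarn logs -applicationId application_1653303170860_0049 -log_files stderr # fetch only stderr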
