Handling the Spark exception java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]

Today, while writing a Spark job, everything ran fine locally, but running it on YARN threw the exception below. It turned out the cause was that I had not commented out the setMaster call in my code:
java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
	at scala.concurrent.Await$.result(package.scala:107)
	at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:354)
	at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:197)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:695)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:69)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
	at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
	at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:693)
	at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Commenting it out fixed the problem. The stack trace points at ApplicationMaster.runDriver, where the YARN ApplicationMaster waits for the driver's SparkContext to initialize; with the master hardcoded to local in the code, the context never registers with YARN, so the wait gives up after spark.yarn.am.waitTime (100 s by default).
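For reference, here is a minimal sketch of what the fixed driver looks like. The object name MyApp and the job body are placeholders; the point is only that setMaster stays commented out so the master supplied by spark-submit wins:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch; "MyApp" and the job body are placeholders.
object MyApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("MyApp")
      // .setMaster("local[*]")  // only for local debugging; remove or comment
      //                         // this out before submitting to YARN, otherwise
      //                         // the ApplicationMaster never sees the context
      //                         // register and times out after 100 s

    val sc = new SparkContext(conf)

    // ... job logic ...

    sc.stop()
  }
}
```

Then let spark-submit decide where the job runs, e.g. `spark-submit --master yarn --deploy-mode cluster --class MyApp my-app.jar`; to run the same jar locally, pass `--master local[*]` on the command line instead of hardcoding it.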

