'Exception while deleting Spark temp dir' logged when executing a Talend Spark Job in local mode

Talend Version: 6.3.1
Product: Big Data
Component: Data Processing

Problem Description

When executing a Talend 6.3.1 Spark Job consisting of tFixedFlowInput and tFileOutputDelimited components in local mode, the following exception is logged:

[ERROR]: org.apache.spark.util.ShutdownHookManager - Exception while deleting Spark temp dir:
C:\tmp\spark\spark-8392ff21-b1a9-48a2-9849-110240d1138c\userFiles-29c597b8-3108-4191-a633-bb7999850fdd
java.io.IOException: Failed to delete: C:\tmp\spark\spark-8392ff21-b1a9-48a2-9849-110240d1138c\userFiles-29c597b8-3108-4191-a633-bb7999850fdd
	at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:884)
	at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:63)
	at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:60)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
	at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:60)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:264)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:234)

Note: despite this error, the Talend Spark Job executed successfully.
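
This behavior is not specific to the Talend components: a hand-written Spark job that does roughly what the generated code does (emit a few fixed rows and write them as a delimited file in local mode) can log the same shutdown-hook message on Windows. The following is a minimal sketch using the Java RDD API; the class name and output path are illustrative, not taken from the generated Job code:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Minimal stand-in for a tFixedFlowInput -> tFileOutputDelimited Job in local mode.
// On Windows, the "Exception while deleting Spark temp dir" message can be logged
// by the shutdown hook at the end of the run even though the output is written correctly.
public class LocalModeRepro {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("LocalModeRepro")
                .setMaster("local[*]");      // local mode, as in the Talend Job

        JavaSparkContext sc = new JavaSparkContext(conf);

        // Fixed input rows, analogous to tFixedFlowInput.
        sc.parallelize(Arrays.asList("1;alice", "2;bob"))
          .saveAsTextFile("C:/tmp/out");     // delimited output, analogous to tFileOutputDelimited

        sc.stop();                           // the shutdown hook then tries to delete the Spark temp dir
    }
}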

Problem root cause

A bug was filed against this behavior, but it was closed because the root cause is a Spark bug rather than a Talend one: on Windows, Spark's shutdown hook attempts to delete its temporary directory while the JVM still holds open handles on files inside it (such as the files under the userFiles directory), so the recursive delete fails with the IOException shown above. See https://issues.apache.org/jira/browse/SPARK-8333 for details.

Solution or Workaround

Resolution of this issue depends on a fix for Spark bug SPARK-8333. From Talend's perspective, the logged exception can be ignored, as it has no impact on the execution of the Talend Job.
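
If the message clutters the Job's console output, one option (a workaround sketch, not an official fix) is to raise the log threshold for the logger that emits it; the logger name comes from the stack trace above. This assumes the Spark driver picks up a log4j.properties file, which is how Spark's log4j 1.x logging is typically configured:

# Hide the harmless shutdown-hook error. Note that the Spark temp dir under
# C:\tmp\spark may still be left behind and can be cleaned up manually.
log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF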