
Exception in component tSparkConnection_1 java.io.FileNotFoundException

Hi,
I am getting an error when trying to establish a Spark connection. Here is the job log:
Starting job SparkDemoJob at 11:42 01/09/2015.
connecting to socket on port 4010
connected
: org.apache.spark.SecurityManager - Changing view acls to: GRM5KOR
: org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(GRM5KOR)
: akka.event.slf4j.Slf4jLogger - Slf4jLogger started
: Remoting - Starting remoting
: Remoting - Remoting started; listening on addresses :
: Remoting - Remoting now listens on addresses:
: org.apache.spark.SparkEnv - Registering MapOutputTracker
: org.apache.spark.SparkEnv - Registering BlockManagerMaster
: org.apache.spark.storage.DiskBlockManager - Created local directory at C:\Users\GRM5KOR\AppData\Local\Temp\spark-local-20150901114228-6989
: org.apache.spark.storage.MemoryStore - MemoryStore started with capacity 546.3 MB.
: org.apache.spark.network.ConnectionManager - Bound socket to port 60526 with id = ConnectionManagerId(BMHE1056048.BMH.APAC.BOSCH.COM,60526)
: org.apache.spark.storage.BlockManagerMaster - Trying to register BlockManager
: org.apache.spark.storage.BlockManagerInfo - Registering block manager BMHE1056048.BMH.APAC.BOSCH.COM:60526 with 546.3 MB RAM
: org.apache.spark.storage.BlockManagerMaster - Registered BlockManager
: org.apache.spark.HttpServer - Starting HTTP Server
: org.eclipse.jetty.server.Server - jetty-8.y.z-SNAPSHOT
: org.eclipse.jetty.server.AbstractConnector - Started SocketConnector@0.0.0.0:60527
: org.apache.spark.broadcast.HttpBroadcast - Broadcast server started at
: org.apache.spark.HttpFileServer - HTTP File server directory is C:\Users\GRM5KOR\AppData\Local\Temp\spark-8266001c-03e7-40c4-9c5f-eb4895fcd977
: org.apache.spark.HttpServer - Starting HTTP Server
: org.eclipse.jetty.server.Server - jetty-8.y.z-SNAPSHOT
: org.eclipse.jetty.server.AbstractConnector - Started SocketConnector@0.0.0.0:60528
: org.eclipse.jetty.server.Server - jetty-8.y.z-SNAPSHOT
: org.eclipse.jetty.server.AbstractConnector - Started SelectChannelConnector@0.0.0.0:4040
: org.apache.spark.ui.SparkUI - Started SparkUI at
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
 at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:324)
 at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:339)
 at org.apache.hadoop.util.Shell.<clinit>(Shell.java:332)
 at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
 at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
 at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
 at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
 at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:256)
 at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:284)
 at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:36)
 at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:109)
 at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
 at org.apache.spark.SparkContext.<init>(SparkContext.scala:228)
 at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:549)
 at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
 at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:130)
 at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.tSparkConnection_1Process(SparkDemoJob.java:666)
 at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.runJobInTOS(SparkDemoJob.java:1023)
 at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.main(SparkDemoJob.java:880)
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087951886
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087953230
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087953754
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087954321
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087954818
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087954979
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087955401
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087955966
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087956631
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087957092
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087957256
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087957623
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958014
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958280
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958434
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958445
: org.apache.spark.SparkContext - Added JAR at with timestamp 1441087958656
Exception in component tSparkConnection_1
java.io.FileNotFoundException: \Users\GRM5KOR\AppData\Local\Temp\routines_SPARKPROJECTTALEND_SparkDemoJob_1.jar (The system cannot find the path specified)
 at java.io.FileInputStream.open(Native Method)
 at java.io.FileInputStream.<init>(Unknown Source)
 at com.google.common.io.Files$FileByteSource.openStream(Files.java:124)
 at com.google.common.io.Files$FileByteSource.openStream(Files.java:114)
 at com.google.common.io.ByteSource.copyTo(ByteSource.java:202)
 at com.google.common.io.Files.copy(Files.java:436)
 at org.apache.spark.HttpFileServer.addFileToDir(HttpFileServer.scala:62)
 at org.apache.spark.HttpFileServer.addJar(HttpFileServer.scala:57)
 at org.apache.spark.SparkContext.addJar(SparkContext.scala:944)
 at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:265)
 at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:265)
 at scala.collection.immutable.List.foreach(List.scala:318)
 at org.apache.spark.SparkContext.<init>(SparkContext.scala:265)
 at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:549)
 at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
 at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:130)
 at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.tSparkConnection_1Process(SparkDemoJob.java:666)
 at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.runJobInTOS(SparkDemoJob.java:1023)
 at sparkprojecttalend.sparkdemojob_0_1.SparkDemoJob.main(SparkDemoJob.java:880)
disconnected
Job SparkDemoJob ended at 11:42 01/09/2015.
In short, the job fails with a file-not-found error:
Exception in component tSparkConnection_1
java.io.FileNotFoundException: \Users\GRM5KOR\AppData\Local\Temp\routines_SPARKPROJECTTALEND_SparkDemoJob_1.jar
Please help me resolve this.
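(Side note on the winutils IOException earlier in the log: on Windows, Hadoop's Shell class looks for %HADOOP_HOME%\bin\winutils.exe, and that warning is usually non-fatal for local Spark jobs. A minimal sketch of the commonly suggested workaround, assuming winutils.exe has been placed under a hypothetical C:\hadoop\bin, is:)

```java
// Sketch of the common winutils workaround (assumption: winutils.exe was
// downloaded to C:\hadoop\bin - substitute your own path).
// Hadoop's Shell class reads hadoop.home.dir (or HADOOP_HOME) the first time
// it is class-loaded, so the property must be set before any Spark/Hadoop
// class initializes - e.g. in a tJava at the very start of the job.
public class WinutilsFix {
    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "C:\\hadoop"); // hypothetical install dir
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```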
Re: Exception in component tSparkConnection_1 java.io.FileNotFoundException

I think I figured it out myself.
I had to change the JVM temp location to a different directory using a tJava component, and after that I was able to connect to Spark.
Re: Exception in component tSparkConnection_1 java.io.FileNotFoundException

Code used in the tJava component:
System.setProperty("java.io.tmpdir", "d:/temp");
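For reference, the property has to be changed before the Spark connection component runs (e.g. in a tJava placed ahead of tSparkConnection in the subjob), because that is when Spark copies the routines jar out of the JVM temp directory. A minimal standalone sketch of the same idea, with the target directory as an assumption (any writable path works):

```java
import java.io.File;
import java.io.IOException;

public class TmpDirDemo {
    public static void main(String[] args) throws IOException {
        // Use any writable directory; "build/tmp" is just for this demo.
        File dir = new File("build/tmp");
        dir.mkdirs();
        System.setProperty("java.io.tmpdir", dir.getAbsolutePath());

        // Temp files created afterwards resolve under the new location,
        // which is where Spark later looks for the copied routines jar.
        File probe = File.createTempFile("routines_", ".jar");
        System.out.println(probe.getParent().equals(dir.getAbsolutePath()));
        probe.delete();
    }
}
```

Note that some JVMs cache the temp location the first time it is used, so it is safest to set java.io.tmpdir as early in the job as possible.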