Five Stars

The function POSIX.open() is not supported on Windows

Hi,

I've been trying to run a Spark job on Talend Data Fabric for the past few days, but it keeps failing with an error.

I originally set out to build a simple classification Spark job, but it failed to run. I tried simplifying the job down to just a tInputFileDelimited and a tLogRow component, but it still fails with the same error.
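
For reference, the simplified job boils down to something like this plain Spark (Java) sketch; the input path and local master here are placeholders, not the actual job settings:

// Hypothetical plain-Spark equivalent of tInputFileDelimited -> tLogRow.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class DelimitedToLog {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("delimited-to-log")
                .setMaster("local[*]"); // placeholder; the real job targets a MapR cluster
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // tInputFileDelimited: read a delimited text file into an RDD
            JavaRDD<String> lines = sc.textFile("C:/data/input.csv"); // placeholder path
            // tLogRow: print each row to the console
            lines.collect().forEach(System.out::println);
        }
    }
}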

[statistics] connecting to socket on port 3551
[statistics] connected
[WARN ]: org.apache.spark.SparkConf - In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
java.lang.RuntimeException: /C:/tmp/spark-daaf35e8-507b-4e6c-abea-822492ee2b51/userFiles-85ae0f6c-7ab6-4339-abce-de14b8542220/Default.properties
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadNativePermissionInfo(RawLocalFileSystem.java:717)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:654)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:630)
	at org.apache.hadoop.fs.permission.ChmodParser.applyNewPermission(ChmodParser.java:49)
	at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:912)
	at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:889)
	at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:863)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:406)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1386)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1340)
	at org.apache.spark.api.java.JavaSparkContext.addFile(JavaSparkContext.scala:662)
[ERROR]: org.apache.hadoop.fs.FileSystem - Failed to fstat on: /C:/tmp/spark-daaf35e8-507b-4e6c-abea-822492ee2b51/userFiles-85ae0f6c-7ab6-4339-abce-de14b8542220/Default.properties
java.io.IOException: The function POSIX.open() is not supported on Windows
	at org.apache.hadoop.io.nativeio.NativeIO$POSIX.open(Native Method)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadNativePermissionInfo(RawLocalFileSystem.java:712)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:654)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:630)
	at org.apache.hadoop.fs.permission.ChmodParser.applyNewPermission(ChmodParser.java:49)
	at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:912)
	at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:889)
	at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:863)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:406)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1386)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1340)
	at org.apache.spark.api.java.JavaSparkContext.addFile(JavaSparkContext.scala:662)
	at neww.awa_0_1.awa.setContext(awa.java:1452)
	at neww.awa_0_1.awa.run(awa.java:1148)
	at neww.awa_0_1.awa.runJobInTOS(awa.java:1122)
	at neww.awa_0_1.awa.main(awa.java:1007)
Caused by: java.io.IOException: The function POSIX.open() is not supported on Windows
	at org.apache.hadoop.io.nativeio.NativeIO$POSIX.open(Native Method)
	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadNativePermissionInfo(RawLocalFileSystem.java:712)
	... 14 more
Exception in thread "main" java.lang.RuntimeException: TalendJob: 'awa' - Failed with exit code: 1.
	at neww.awa_0_1.awa.main(awa.java:1017)
[ERROR]: neww.awa_0_1.awa - TalendJob: 'awa' - Failed with exit code: 1.
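
Reading the trace, the failure happens inside SparkContext.addFile(): Spark fetches the job's Default.properties into the spark.local.dir userFiles directory and then chmods the copy through Hadoop's RawLocalFileSystem, which on Windows drops into NativeIO$POSIX.open(). If it helps, here is a minimal sketch of that same call path (the file path is a placeholder; any small local file should hit the same code if that's what's going on):

// Minimal sketch exercising addFile -> fetchFile -> FileUtil.chmod -> NativeIO$POSIX.open.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class AddFileRepro {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("addfile-repro")
                .setMaster("local[*]"); // placeholder master
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // On Windows this chmods the fetched copy under spark.local.dir,
            // which is where the POSIX.open() IOException surfaces above.
            sc.addFile("C:/tmp/Default.properties"); // placeholder path
        }
    }
}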

I'm using a MapR 5.2.0 cluster with multiple nodes, running the Scala 2.11 build of Spark.

Any ideas on what's causing the problem? Any input would be appreciated.

2 REPLIES
Moderator

Re: The function POSIX.open() is not supported on Windows

Hello,

Would you mind posting screenshots of your current Spark job design to the forum? They would help us address your issue.

Best regards

Sabrina

--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Five Stars

Re: The function POSIX.open() is not supported on Windows

Hi Sabrina,

Here's a screenshot of the Spark job I was trying to run:

[Attachment: sparkjob.png]