One Star

Connecting from Talend to Spark 1.6.0 Cluster

I have created a job in Talend to connect to a Spark 1.6.0 cluster.
When running the job from Talend, the following error is thrown.
Spark 1.6, Spark mode: Standalone, distribution: Custom.
Any help on this would be highly appreciated, since this issue is a real showstopper!
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/scheduler/SparkListener
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(…)
	at java.lang.Class.privateGetMethodRecursive(…)
	at java.lang.Class.getMethod0(…)
	at java.lang.Class.getMethod(…)
	at sun.launcher.LauncherHelper.validateMainClass(…)
	at sun.launcher.LauncherHelper.checkAndLoadMain(…)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.scheduler.SparkListener
	at java.lang.ClassLoader.loadClass(…)
	at sun.misc.Launcher$AppClassLoader.loadClass(…)
	at java.lang.ClassLoader.loadClass(…)
	... 7 more
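The NoClassDefFoundError above means the JVM launching the job cannot see org.apache.spark.scheduler.SparkListener, a class shipped in spark-core (bundled in the spark-assembly jar in Spark 1.6). As a minimal diagnostic sketch (not from the thread — the class and helper names are illustrative), you can check whether a class is visible on the current classpath:

```java
// Minimal sketch: check whether a class is visible to the current class
// loader. If the Spark classes are not found here, the job's classpath is
// missing the Spark 1.6 jars.
public class ClasspathCheck {
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The class the Talend job fails to find at launch time.
        String target = "org.apache.spark.scheduler.SparkListener";
        System.out.println(target + (isOnClasspath(target)
                ? " is on the classpath"
                : " is NOT on the classpath -- add the Spark jars to the job"));
    }
}
```

If this reports the class as missing, the fix is on the classpath side (the Spark jars supplied to the job), not in the job's own logic.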

Re: Connecting from Talend to Spark 1.6.0 Cluster

Could you please indicate the build version you are using? Are you using JDK 1.8? Have you executed your other jobs in your Talend Studio successfully?
Screenshots of the job settings would be preferred.
Best regards
Don't forget to give kudos when a reply is helpful, and click Accept as Solution when you think you're good with it.
One Star

Re: Connecting from Talend to Spark 1.6.0 Cluster

I think you just need to configure the job to use Spark.
Go to the "Run" tab -> Spark configuration -> load the properties from the repository (which I presume you created under Metadata as a Hadoop cluster).
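For reference, loading the repository properties into the Spark configuration essentially fills in standalone-mode settings along these lines (the values below are placeholders I am assuming, not taken from the thread):

```
# Hypothetical standalone-mode settings; replace the host with your
# cluster's master address (Spark's default standalone master port is 7077)
spark.master=spark://your-master-host:7077
spark.executor.memory=1g
```

The key point is that the master URL and the Spark jars the job ships must match the 1.6.0 standalone cluster you are connecting to.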