Executing Spark Job 2.0 fails on local machine

One Star

Executing Spark Job 2.0 fails on local machine

I have a Job with a Cassandra connection that I converted to a Spark Job.
While the Job runs successfully with Spark 1.6 (the default), it fails with Spark 2.0 with the error message below.
Could you please let me know which extra libraries might be missing here?
Environment:
Talend Version : 6.2.1
Spark version : 2.x
Error Message
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/api/java/JavaSparkContext
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.api.java.JavaSparkContext
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more
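
For context, the NoClassDefFoundError above is thrown before the Job's own code even runs: the JVM cannot resolve JavaSparkContext while validating the main class, which means the spark-core JAR for the selected Spark version is not on the launch classpath. Below is a minimal sketch of the kind of call the generated code makes; the SparkConf/JavaSparkContext usage is standard Spark API, but the class and names here are purely illustrative, not the Talend-generated code.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class LocalContextSketch {
    public static void main(String[] args) {
        // If spark-core is missing from the classpath, this class fails to load
        // with NoClassDefFoundError, exactly as in the trace above.
        SparkConf conf = new SparkConf()
                .setAppName("LocalContextSketch")
                .setMaster("local[*]");
        try (JavaSparkContext jsc = new JavaSparkContext(conf)) {
            System.out.println("Spark " + jsc.version() + " context started");
        }
    }
}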
Moderator

Re: Executing Spark Job 2.0 fails on local machine

Hi,
So far, there is no support for Spark 2.0 in v6.2.1.
You need to provide the JARs yourself and load them in the Spark configuration using a custom distribution for Spark 2.0.
Best regards
Sabrina
Four Stars

Re: Executing Spark Job 2.0 fails on local machine

Hi,
I am getting the error below when running a Spark Job.
Can you tell me which JARs need to be added?
Environment:
Talend Version : 6.2.1
Spark version : 1.6
java.lang.NoClassDefFoundError: org/apache/spark/api/java/JavaSparkContext
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2693)
at java.lang.Class.privateGetMethodRecursive(Class.java:3040)
at java.lang.Class.getMethod0(Class.java:3010)
at java.lang.Class.getMethod(Class.java:1776)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.api.java.JavaSparkContext
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
Exception in thread "main" 
Seven Stars

Re: Executing Spark Job 2.0 fails on local machine

Are you seeing this with a local run? Go to Help -> Installation Details -> Plug-ins and make sure you have all the Spark libraries needed for that version.
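
If the plug-in list looks complete, a quick programmatic check can also confirm whether the Spark classes are actually visible on the Job's runtime classpath. This is an illustrative sketch, not part of the Talend-generated code; the class names checked are the standard Spark API, everything else is hypothetical.

public class SparkClasspathCheck {
    public static void main(String[] args) {
        String[] required = {
                "org.apache.spark.SparkConf",
                "org.apache.spark.api.java.JavaSparkContext"
        };
        for (String name : required) {
            try {
                // Succeeds only if the matching spark-core JAR is on the classpath.
                Class.forName(name);
                System.out.println("OK      " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("MISSING " + name + " (add the spark-core JAR for your Spark version)");
            }
        }
    }
}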
