Four Stars

Sandbox installed but demo won't run

When trying to run step_1b, I get the following errors:

 

Starting job Step_1b_Recommendation_DemoSetup at 05:02 25/06/2018.

[statistics] connecting to socket on port 3755
[statistics] connected
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in component tCassandraConnection_1
java.lang.IllegalArgumentException: cassandra0.weave.local: Name or service not known
	at com.datastax.driver.core.Cluster$Builder.addContactPoint(Cluster.java:839)
	at com.datastax.driver.core.Cluster$Builder.addContactPoints(Cluster.java:859)
	at base_project.step_1b_recommendation_demosetup_0_1.Step_1b_Recommendation_DemoSetup.tCassandraConnection_1Process(Step_1b_Recommendation_DemoSetup.java:574)
	at base_project.step_1b_recommendation_demosetup_0_1.Step_1b_Recommendation_DemoSetup.runJobInTOS(Step_1b_Recommendation_DemoSetup.java:29250)
	at base_project.step_1b_recommendation_demosetup_0_1.Step_1b_Recommendation_DemoSetup.main(Step_1b_Recommendation_DemoSetup.java:29084)
Exception in component tCassandraOutput_1
java.lang.NullPointerException
	at base_project.step_1b_recommendation_demosetup_0_1.Step_1b_Recommendation_DemoSetup.tFileInputDelimited_1Process(Step_1b_Recommendation_DemoSetup.java:11281)
	at base_project.step_1b_recommendation_demosetup_0_1.Step_1b_Recommendation_DemoSetup.runJobInTOS(Step_1b_Recommendation_DemoSetup.java:29262)
	at base_project.step_1b_recommendation_demosetup_0_1.Step_1b_Recommendation_DemoSetup.main(Step_1b_Recommendation_DemoSetup.java:29084)
[FATAL]: base_project.step_1b_recommendation_demosetup_0_1.Step_1b_Recommendation_DemoSetup - tCassandraConnection_1 cassandra0.weave.local: Name or service not known
[FATAL]: base_project.step_1b_recommendation_demosetup_0_1.Step_1b_Recommendation_DemoSetup - tCassandraOutput_1 null
[statistics] disconnected
Job Step_1b_Recommendation_DemoSetup ended at 05:02 25/06/2018. [exit code=1]

Please help
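For anyone hitting the same thing: the `IllegalArgumentException: cassandra0.weave.local: Name or service not known` means the Cassandra driver could not resolve the hostname, and the `NullPointerException` in tCassandraOutput_1 is just a follow-on failure because no connection was ever made. A minimal sketch to check the lookup outside Talend, assuming Python 3 is available on the sandbox host (the hostname is taken from the log; adjust it to your setup):

```python
import socket

def resolve(hostname):
    """Return the resolved IP address, or None if the name cannot be resolved."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# Hostname taken from the tCassandraConnection_1 error above.
# None here would mean the same DNS failure the job is reporting.
print(resolve("cassandra0.weave.local"))
```

If this returns None, the problem is container/DNS setup rather than the Talend job itself.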

4 REPLIES
Four Stars

Re: Sandbox installed but demo won't run

I followed the instructions here: Talend-Sandbox-examples-stop-working-continues-delivery-issue

 

Nearly all of the errors are now resolved:

 

Starting job Step_1b_Recommendation_DemoSetup at 23:52 25/06/2018.
[statistics] connecting to socket on port 3709
[statistics] connected
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[statistics] disconnected
Job Step_1b_Recommendation_DemoSetup ended at 23:53 25/06/2018. [exit code=0]

 

Is this something I need to be concerned about?

 

Four Stars

Re: Sandbox installed but demo won't run

Now step 3 fails:

Starting job Step_3_Recommendation_Build_Model_Spark at 04:11 26/06/2018.

[statistics] connecting to socket on port 3827
[statistics] connected
[WARN ]: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[WARN ]: org.apache.spark.SparkConf - In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
[WARN ]: org.apache.spark.util.Utils - Your hostname, talend resolves to a loopback address: 127.0.1.1; using 192.168.150.128 instead (on interface ens32)
[WARN ]: org.apache.spark.util.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
java.io.IOException: Couldn't find demo.UserLookupInfo or any similarly named keyspace and table pairs
	at com.datastax.spark.connector.cql.Schema$.tableFromCassandra(Schema.scala:348)
	at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.tableDef(CassandraTableRowReaderProvider.scala:50)
	at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef$lzycompute(CassandraTableScanRDD.scala:61)
	at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef(CassandraTableScanRDD.scala:61)
	at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.verify(CassandraTableRowReaderProvider.scala:137)
	at com.datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:61)
	at com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:231)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
	at scala.Option.getOrElse(Option.scala:120)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
	at scala.Option.getOrElse(Option.scala:120)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
	at scala.Option.getOrElse(Option.scala:120)
	at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
	at org.apache.spark.Partitioner$$anonfun$2.apply(Partitioner.scala:58)
	at org.apache.spark.Partitioner$$anonfun$2.apply(Partitioner.scala:58)
	at scala.math.Ordering$$anon$5.compare(Ordering.scala:122)
	at java.util.TimSort.countRunAndMakeAscending(TimSort.java:355)
	at java.util.TimSort.sort(TimSort.java:220)
	at java.util.Arrays.sort(Arrays.java:1438)
	at scala.collection.SeqLike$class.sorted(SeqLike.scala:615)
	at scala.collection.AbstractSeq.sorted(Seq.scala:40)
	at scala.collection.SeqLike$class.sortBy(SeqLike.scala:594)
	at scala.collection.AbstractSeq.sortBy(Seq.scala:40)
	at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:58)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$cogroup$5.apply(PairRDDFunctions.scala:841)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$cogroup$5.apply(PairRDDFunctions.scala:841)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.PairRDDFunctions.cogroup(PairRDDFunctions.scala:840)
	at org.apache.spark.api.java.JavaPairRDD.cogroup(JavaPairRDD.scala:693)
	at org.talend.bigdata.dataflow.spark.batch.hmap.SparkHMapJoin.buildBatchJoin(SparkHMapJoin.java:71)
	at org.talend.bigdata.dataflow.spark.batch.hmap.SparkHMapJoin.build(SparkHMapJoin.java:207)
	at org.talend.bigdata.dataflow.spark.batch.hmap.SparkHMapJoin.build(SparkHMapJoin.java:41)
	at org.talend.bigdata.dataflow.DataFlowPipelineBuilder$PipelineFactory.buildPipelines(DataFlowPipelineBuilder.java:100)
	at org.talend.bigdata.dataflow.hmap.HMapDataFlowBuilder.build(HMapDataFlowBuilder.java:111)
	at base_project.step_3_recommendation_build_model_spark_0_1.Step_3_Recommendation_Build_Model_Spark.tFileInputDelimited_1_HDFSInputFormatProcess(Step_3_Recommendation_Build_Model_Spark.java:3133)
	at base_project.step_3_recommendation_build_model_spark_0_1.Step_3_Recommendation_Build_Model_Spark.run(Step_3_Recommendation_Build_Model_Spark.java:3680)
	at base_project.step_3_recommendation_build_model_spark_0_1.Step_3_Recommendation_Build_Model_Spark.runJobInTOS(Step_3_Recommendation_Build_Model_Spark.java:3556)
	at base_project.step_3_recommendation_build_model_spark_0_1.Step_3_Recommendation_Build_Model_Spark.main(Step_3_Recommendation_Build_Model_Spark.java:3444)
[statistics] disconnected
[ERROR]: base_project.step_3_recommendation_build_model_spark_0_1.Step_3_Recommendation_Build_Model_Spark - TalendJob: 'Step_3_Recommendation_Build_Model_Spark' - Failed with exit code: 1.
Exception in thread "main" java.lang.RuntimeException: TalendJob: 'Step_3_Recommendation_Build_Model_Spark' - Failed with exit code: 1.
	at base_project.step_3_recommendation_build_model_spark_0_1.Step_3_Recommendation_Build_Model_Spark.main(Step_3_Recommendation_Build_Model_Spark.java:3454)
Job Step_3_Recommendation_Build_Model_Spark ended at 04:12 26/06/2018. [exit code=1]

Suggestions?
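The step-3 job reads the table that step_1b is supposed to create, so the `Couldn't find demo.UserLookupInfo` IOException usually just means the table does not exist yet. One way to confirm, assuming cqlsh is reachable inside the sandbox's Cassandra container (the container name `cassandra0` is a guess from the `cassandra0.weave.local` hostname in the earlier log; the keyspace and table names come from the exception, and the exact schema is whatever step_1b defines):

```
$ docker exec -it cassandra0 cqlsh -e "DESCRIBE KEYSPACE demo;"
-- the output should include a userlookupinfo table; if the keyspace
-- or table is missing, re-run step_1a and step_1b before step 3
```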

Employee

Re: Sandbox installed but demo won't run

Hello Kenton,

 

I tried on my end and it's working fine.

 

However, looking at the error, it looks like the Cassandra table was not created. My guess is that when you followed the instructions to fix the issue with the MySQL container, you forgot to re-run the first steps of the demo (step_1a and step_1b). So make sure the Cassandra table is actually created (in step_1b).

 

Hope it helps

Four Stars

Re: Sandbox installed but demo won't run

Oops, I didn't realize I needed to run it from the start again.