globalMap.get() is not working

Seven Stars

globalMap.get() is not working

I am using a global map variable in a tJavaRow component, but it's giving an error message:

"cannot make a static reference to the non-static field".

I am using a Big Data Spark job.

How do I overcome this issue?



Ten Stars

Re: globalMap.get() is not working

Can you show your component code instead of the Code tab?
Seven Stars

Re: globalMap.get() is not working

I have created a simple mapping to show this error.





The error comes from the tJavaRow component, which I have shown in the first post.


Here is the code of the tJavaRow component:


 String[] values = (String[])globalMap.get("values");
 row1.a = row2.a;
 row1.b = row2.b;

Here is the code in the tJava component:

String[] values = { "This", "is", "test" };

globalMap.put("values", values);
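Put together, the two snippets follow the usual standard-DI-job pattern, where globalMap behaves as a shared java.util.Map<String, Object>. A minimal sketch of that pattern, assuming a plain HashMap stands in for globalMap (this mirrors standard-job behaviour only; as discussed below, Spark batch jobs do not expose globalMap):

```java
import java.util.HashMap;
import java.util.Map;

public class GlobalMapSketch {
    public static void main(String[] args) {
        // Stand-in for Talend's globalMap (a shared Map<String, Object>).
        Map<String, Object> globalMap = new HashMap<>();

        // tJava side: store the array under a key.
        String[] values = { "This", "is", "test" };
        globalMap.put("values", values);

        // tJavaRow side: fetch it back and cast to the original type.
        String[] fetched = (String[]) globalMap.get("values");
        System.out.println(String.join(" ", fetched)); // prints "This is test"
    }
}
```

Because the map stores values as Object, the cast back to String[] is required on the reading side, exactly as in the tJavaRow snippet above.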


Try reproducing the error on your system.


Requesting your help on this.



Six Stars

Re: globalMap.get() is not working

@Tom4sz, I tried the way you explained. For me it is working fine; I was able to get the values from globalMap in the tJavaRow.

I've exported my job; you can download it and have a look.


Hope this helps.

- JG
Seven Stars

Re: globalMap.get() is not working


I am not able to open your job in Talend.


What kind of job is that? Is it a standard or Big Data job?


I am specifically talking about a Big Data Spark job.


In a Talend standard job, I am able to get the output, but the same code in a Talend Big Data job gives an error.


I am using Talend Big Data Platform 6.2.



Six Stars

Re: globalMap.get() is not working

Sorry, it was a standard job, not a Big Data Spark job. I will give it a try with a Big Data Spark job.

- JG
Seven Stars

Re: globalMap.get() is not working

I am trying an alternative.


I have created a context variable of type Object.



I save my string array to this variable in a tJava component:

String[] cols = { "This","is","test" };
context.SrcCols = cols;

I access this context variable in tJavaRow:

String[] Cols = (String[])context.SrcCols;

But at runtime it gives the following error:

java.lang.ClassCastException: java.lang.String cannot be cast to [Ljava.lang.String;
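That exception suggests the context value reaches the Spark executor as a plain String rather than as the original String[] object, so the cast fails. A hedged workaround sketch (an assumption based on the exception above, not something confirmed by Talend documentation): keep the context variable as a String, store the columns as one delimited string in tJava, and split it back into an array wherever it is consumed:

```java
public class ContextWorkaroundSketch {
    public static void main(String[] args) {
        // tJava side: serialise the array into a single delimited String,
        // a type that a context variable can carry without casting issues.
        String srcCols = String.join(";", "This", "is", "test");

        // Consumer side: rebuild the array by splitting on the delimiter.
        String[] cols = srcCols.split(";");
        System.out.println(cols.length + " columns"); // prints "3 columns"
    }
}
```

The delimiter (";" here) is an arbitrary choice for illustration; pick one that cannot occur inside the values themselves.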





Five Stars

Re: globalMap.get() is not working

Hello Tom,

Did you get any solution to make it work in a Big Data Spark job? We tried to implement the same and found this article, which shows the same error we are getting.

Please suggest.



Re: globalMap.get() is not working



Currently, there is no access to "globalMap" in Spark Batch mode. This is due to a different implementation of Spark Batch compared to DI. You will only be able to accomplish this with standard DI jobs.

Please don't forget to kudo and accept as resolution if this resolves the issue.
Seven Stars

Re: globalMap.get() is not working

This feature is not available in Talend Spark jobs.

We raised a ticket with Talend support and they confirmed that it is a limitation of Talend Big Data jobs.

I don't know if this feature is available in the latest Talend version.



Four Stars

Re: globalMap.get() is not working

I need globalMap or an alternative to it in Spark jobs. Any ideas?

Five Stars

Re: globalMap.get() is not working

Hello guys, I opened a ticket with Talend, and Jack from Talend has already responded to this thread. So we went and tried something different:


--> Create the context variable solution TOM suggested.

--> Instead of using tJavaRow, use tJava; then you will not get the casting error.

--> Your job then looks somewhat like: source data component --> Row (Main) --> tJava.

--> I know tJava should not be used in the main flow; however, Spark DataFrame processing is different from the usual main row flow in other Talend jobs.

--> With this solution we were able to make the Spark job work, but we couldn't take it to production as we were not convinced about using tJava like this.

