
unable to build the job


I have removed a few columns from the existing mapping and propagated the attributes to the target. I am able to run the job successfully; however, I am unable to build it. While building the job I do not get any errors, but when I go to the folder I specified for the build, I don't see any .ZIP file being created. Please let me know why I am unable to build the job. Please find attached a screenshot of the build options I am using.





Durga Devi


Re: unable to build the job

I checked the log file at C:\Jaspersoft\6.0.1\studio\workspace\.metadata and saw that no error messages were logged.

However, when I ran the job in debug mode, it gave me the error "code limit is exceeding the 65535 bytes limit. There may be some other errors caused by JVM compatibility. Make sure your JVM setup is similar to the studio". I have the following queries:

1. I am able to execute the job successfully, so why do I not get this error while executing?

2. While building the job, no message is shown in the log file. A screenshot is given below.

3. In my mapping I simply read data from the source and write it to the target. What factors determine the code limit? Do the number of columns and their data types also affect it?




Re: unable to build the job



A subjob generally generates at least one function in Talend. If you inspect the code Talend generates, you will see that each subjob produces one or more functions.

If you have a subjob that has:

1) Input/output/tMap components with many columns

2) Many different components chained together such that the schema keeps changing from one component to the next, e.g. five tMaps, each with a different schema

3) tJava, tJavaFlex, or tJavaRow components with lots of code

4) tMaps with huge Java expressions


Then any combination of the above may cause the single function generated for the subjob to reach the 65535-byte limit that the JVM imposes on each method's bytecode.
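To see that this ceiling is a JVM constraint rather than a Talend setting, here is a small sketch (class and method names are invented for illustration, not taken from Talend's actual generated code) that compiles an in-memory class whose single method mimics an oversized generated subjob. javac rejects any one method whose bytecode exceeds 65,535 bytes with a "code too large" error:

```java
import javax.tools.*;
import java.net.URI;
import java.util.List;

public class CodeTooLargeDemo {
    // In-memory Java source, standing in for a generated job class.
    static class StringSource extends SimpleJavaFileObject {
        final String code;
        StringSource(String className, String code) {
            super(URI.create("string:///" + className + ".java"), Kind.SOURCE);
            this.code = code;
        }
        @Override public CharSequence getCharContent(boolean ignore) { return code; }
    }

    // Try to compile a class whose single method contains n statements.
    static boolean compiles(int n) {
        StringBuilder sb = new StringBuilder("class BigSubjob { void run() { int x = 0;\n");
        for (int i = 0; i < n; i++) sb.append("x = x + 1;\n"); // a few bytes of bytecode each
        sb.append("} }");
        JavaCompiler jc = ToolProvider.getSystemJavaCompiler();
        return jc.getTask(null, null, null,
                List.of("-d", System.getProperty("java.io.tmpdir")), null,
                List.of(new StringSource("BigSubjob", sb.toString()))).call();
    }

    public static void main(String[] args) {
        System.out.println("small method compiles: " + compiles(10));    // true
        System.out.println("huge method compiles:  " + compiles(30000)); // false: "code too large"
    }
}
```

The small method compiles fine, while the 30,000-statement method fails with javac's "code too large" error, which is the same limit the one big function generated for a bloated subjob runs into.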


When you run a job in debug mode, Talend may generate additional code for debugging. If you switch log4j from Info to Debug, Talend adds extra code to log the relevant debug information. Thus, in debug mode you may be generating more code per column, per tMap, etc., and so exceed the 65535-byte limit even though a normal run does not.