One Star

Getting error while running the job exceeding the 65535 bytes limit

Hi Team,
Does anyone know why I am getting the error below when running my job? Please share the solution if you have faced and solved the same issue in your project.
Exception in thread "main" java.lang.Error: Unresolved compilation problem:
The code of method tFileList_1Process(Map<String,Object>) is exceeding the 65535 bytes limit
Seventeen Stars

Re: Getting error while running the job exceeding the 65535 bytes limit

You have two problems:
First, you are not familiar with the Talend help pages (there is an article about this issue).
Second, you need to redesign your job into smaller subjobs (a subjob is what is highlighted with a blue background). One of the subjobs in your job is simply too large.
Talend is a Java code generator, and each subjob is translated into one Java method. The JVM limits the compiled bytecode of a single method to 65535 bytes (note: this is bytecode, not the size of the source code!). Normally no developer would ever reach that limit with hand-written code, but in Talend the job designer lets you build very complex and large flows, and that is what causes this problem.
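To picture this, here is a minimal sketch of the shape of the generated class (illustrative only, not real generated code; everything except the tFileList_1Process name is made up):

import java.util.Map;

// Simplified, illustrative shape of a Talend-generated job class.
public class GeneratedJobSketch {

    // One subjob compiles into exactly one method. The JVM rejects any
    // method whose compiled bytecode exceeds 65535 bytes - the limit
    // your tFileList_1Process method is hitting.
    public void tFileList_1Process(Map<String, Object> globalMap) {
        // ...the code for every component of the subjob is generated
        // inline here, so a huge subjob produces a huge method...
    }

    // Splitting the design into two subjobs yields two smaller methods,
    // each comfortably under the per-method bytecode limit.
    public void subjobA_Process(Map<String, Object> globalMap) { /* ... */ }

    public void subjobB_Process(Map<String, Object> globalMap) { /* ... */ }
}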
Try to remove tLogRow components and everything that is not absolutely necessary.
Could you post a screenshot of your job? Then we could start making suggestions on how to split it up.
One Star

Re: Getting error while running the job exceeding the 65535 bytes limit

Sorry jlolling, I am new to Talend and don't know my way around it yet.
Could you please guide me on how to use the Help resources in Talend?
Regarding my issue, could you please suggest how to redesign my current job?
My requirement as of now: I have multiple similar XML files, and I need to load the data into 15 tables in Oracle.
There is a parent-child relationship in my scenario.
I have attached my current job design to this post. It would be really great if you could suggest how to redesign it into subjobs.
Regarding data storage: if a record already exists I need to update it; otherwise I need to insert it.
So in the Basic settings tab of the tOracleOutput components, I have selected "Default" for "Action on table" and "Update or insert" for "Action on data".
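In other words, what I expect "Update or insert" to do per row is roughly the following (a sketch of my understanding only; the table and column names are placeholders, not from my actual job):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Illustrative upsert: try the UPDATE first and fall back to an
// INSERT when no existing row matched the key.
public class UpsertSketch {
    static void upsert(Connection conn, long id, String col1) throws SQLException {
        try (PreparedStatement upd = conn.prepareStatement(
                "UPDATE my_table SET col1 = ? WHERE id = ?")) {
            upd.setString(1, col1);
            upd.setLong(2, id);
            if (upd.executeUpdate() == 0) { // no row updated -> insert
                try (PreparedStatement ins = conn.prepareStatement(
                        "INSERT INTO my_table (id, col1) VALUES (?, ?)")) {
                    ins.setLong(1, id);
                    ins.setString(2, col1);
                    ins.executeUpdate();
                }
            }
        }
    }
}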
Let me know if you need any more details.
This issue is currently blocking my work.
Seventeen Stars

Re: Getting error while running the job exceeding the 65535 bytes limit

First, I would suggest you check that you read only the columns from your lookup tables that are actually needed.
Leave out everything that is not necessary.
If this does not solve your problem, remove half of your lookups and save the intermediate result to a temporary file.
After that, read the temporary file back in, join the other half of the lookups, and only then write to your target, as sketched below.
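For example (a rough sketch only; the exact components depend on your actual sources):
Subjob 1: tFileList -> tFileInputXML -> tMap (first half of the lookups) -> tFileOutputDelimited (temporary file)
Subjob 2: tFileInputDelimited (temporary file) -> tMap (second half of the lookups) -> tOracleOutput (target)
Connect the two subjobs with an OnSubjobOk trigger so the second one only starts after the temporary file is complete.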
Here is the entry to the help pages:
https://help.talend.com?content-lang=en
One Star

Re: Getting error while running the job exceeding the 65535 bytes limit

I tried removing the unwanted columns, and it helped a little.
But going forward I might add more columns to my project, so I think this would cause a problem again at some point.
It is not only the input fields; the output columns may also keep growing.
I have around 15 to 18 output tables.
So I would like to break my job into multiple subjobs, as you suggested earlier.
Could you please tell me how to break it up and link the pieces together? Do you have any sample examples you have shown to other developers, so that I can look at them and try a small POC?
This is currently blocking our design, and we are not able to move forward.
Six Stars

Re: Getting error while running the job exceeding the 65535 bytes limit

Hi,
We had a similar scenario and the identical error.
What we did:
- First, load all the relevant data from the XML files (iteratively if possible) into a single staging table, or into several staging tables if one is not enough.
- Use the staging table(s) as the source for the subsequent subjobs that load the respective Oracle tables; in your case that might be 15-18 tables (= subjobs). See the sketch below.
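Roughly (a sketch only; the component names are illustrative, not copied from our actual job):
Subjob 0: tFileList -> tFileInputXML -> tMap -> tOracleOutput (staging table)
Subjob 1..n: tOracleInput (staging table) -> tMap -> tOracleOutput (target table n)
One such subjob per target table, chained with OnSubjobOk triggers so they run in dependency order.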
Thanks,
Jagadish.
Six Stars

Re: Getting error while running the job exceeding the 65535 bytes limit

Hi,
I have attached some screenshots for your guidance, just to give you an overview of how to proceed.
Don't try to read the images closely, as they are not very legible.
Thanks,
Jagadish.
One Star

Re: Getting error while running the job exceeding the 65535 bytes limit

Thanks, Jagadish, for sharing the details.
I cannot make out much from your diagram due to the image quality.
Can you describe the data flow, i.e. which components you used and how you designed it? Something like:
tFileList -> XMLFile -> tMap -> <Source Table>
I have 50-60 XML elements in my file. So you want me to store all the data in one table first and then load the target tables from there?
And based on the data, I will be creating sequence keys in each table.
Can you provide some details about the subjobs in your case?
Do I need to create one subjob for each target table?
If possible, can you upload a better-quality picture?
One Star

Re: Getting error while running the job exceeding the 65535 bytes limit

Hello mcgovardhan, take a look at the suggestion I posted at http://www.talendforge.org/forum/viewtopic.php?pid=134368#p134368,
which involves breaking the job up into multiple steps, one for each table, in order to handle your sequence-based key fields. A rough outline follows.
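In outline (illustrative only; the table names are placeholders, not from your job):
Step 1: staging -> tMap (generate the sequence key) -> tOracleOutput (PARENT_TABLE)
Step 2: staging joined back to PARENT_TABLE (to pick up the generated keys) -> tMap -> tOracleOutput (CHILD_TABLE)
...and so on, one step/subjob per table, chained with OnSubjobOk, so each child load can look up the keys its parent load just created.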
One Star

Re: Getting error while running the job exceeding the 65535 bytes limit

Willm, can you provide the data flow?
I mean something like a high-level design showing where I need to break it into multiple jobs.