[resolved] The code of method exceeding the 65535 bytes limit

One Star

I am a new user of Talend, and I'm writing some jobs to load legacy data into our data warehouse.
I have a tab-delimited file with 140 columns and 886,720 rows that I need to process and load into one MySQL table using MySQL Bulk Output. I've attached a screenshot of my jobs for reference.
When I run this job, I get the following error:
Exception in thread "main" java.lang.Error: Unresolved compilation problem:
The code of method tFileInputDelimited_1Process(Map<String,Object>) is exceeding the 65535 bytes limit
I've already tried splitting the file in two to reduce the number of columns in each file, then using a tJoin to merge the data back together before processing it. This produces the same error message.
What can I do to get the size of that method down so this job will run?

We are using the Java version on a Windows Server 2003 VM.
Thanks!

Accepted Solutions
One Star

Re: [resolved] The code of method exceeding the 65535 bytes limit

I tried splitting the job like you explained and it seems to work great!
Thanks for all your help!

All Replies
One Star

Re: [resolved] The code of method exceeding the 65535 bytes limit

Hi,
This is a Java limitation; you can read an explanation here: http://www.talendforge.org/forum/viewtopic.php?id=2184
Regards
One Star

Re: [resolved] The code of method exceeding the 65535 bytes limit

I realize this is a Java limitation, but I was asking for suggestions and hoping for a workaround.
That thread doesn't seem to address the first poster's question, only a subsequent poster who has too many tJobs. All that was said was to submit a bug to the bugtracker. Please correct me if I'm wrong...
Was there a resolution to the first poster's question regarding the tFileInputDelimited having too many columns? I can't think of any way to split this job into multiple jobs, since I am doing a bulk insert into MySQL from the 'massaged' data file.
Thanks!
Valerie
One Star

Re: [resolved] The code of method exceeding the 65535 bytes limit

Hi,
You can delete the tLogCatcher and the hdb_emp_basic_log and replace them with the native stats & logs option.
In your job you have options to catch stats & logs in the Job tab: "Stats & Logs". You can save the logs to a database or to files.
You can also deactivate (or delete) tLogRow_1 for better performance.
Do you actually use all 140 columns?
One Star

Re: [resolved] The code of method exceeding the 65535 bytes limit

You can also try to split your job into two jobs:
- First: replace values and test schema compliance
- Second: add transformations and load the data
job1.png : HDB_OK_STEP2 = job2.png
Tell me if it resolves your problem.
Regards
One Star

Re: [resolved] The code of method exceeding the 65535 bytes limit

It doesn't work great. I split a job into two jobs after getting this error on tFileInputDelimited. I have about 450k rows with fewer than 15 columns. This fixed nothing; I get the same error:
the code of method tfileinputdelimited is exceeding the 65535 bytes limit
I even ran each job separately and still got the error. It seems to be an issue with the number of rows; however, this is not a lot of rows for a parser.

Re: [resolved] The code of method exceeding the 65535 bytes limit

Sorry, but if you are having problems parsing too many rows through the parser in one run, it should work if you split the rows into multiple runs.
So it would seem that you have to split your input:
- into chunks of about 10K or 1K rows (whatever works)
- run each chunk through your schema check, generating OK and Rejects files
- once all rows are checked, proceed by looping over your input/OK files
Just a thought.
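The chunking idea above could be done outside of Talend as a small pre-step that splits the input file before each run. This is a minimal, hypothetical sketch (the class and file names are my own, not part of any Talend component), assuming a plain tab-delimited file with no header row:

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

// Hypothetical pre-processing step: split a large delimited file into
// chunks of at most chunkSize rows, so each chunk can be fed through
// the job (schema check, load) in its own run.
public class ChunkSplitter {

    public static List<Path> split(Path input, int chunkSize, Path outDir) throws IOException {
        List<Path> chunks = new ArrayList<>();
        try (BufferedReader in = Files.newBufferedReader(input)) {
            BufferedWriter out = null;
            String line;
            int row = 0, part = 0;
            while ((line = in.readLine()) != null) {
                if (row % chunkSize == 0) {            // start a new chunk file
                    if (out != null) out.close();
                    Path chunk = outDir.resolve("chunk_" + (part++) + ".txt");
                    chunks.add(chunk);
                    out = Files.newBufferedWriter(chunk);
                }
                out.write(line);
                out.newLine();
                row++;
            }
            if (out != null) out.close();
        }
        return chunks;
    }

    public static void main(String[] args) throws IOException {
        // Demo: 25 rows split into chunks of 10 -> 3 chunk files.
        Path dir = Files.createTempDirectory("chunks");
        Path input = dir.resolve("input.txt");
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < 25; i++) rows.add("row" + i + "\tvalue" + i);
        Files.write(input, rows);
        List<Path> chunks = split(input, 10, dir);
        System.out.println(chunks.size() + " chunks"); // 3 chunks
    }
}
```

Each chunk file could then be passed to the job (for example via a context variable) and the OK/Reject outputs concatenated afterwards.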
One Star

Re: [resolved] The code of method exceeding the 65535 bytes limit

Thanks ~
but that defeats the purpose of using Talend for automation (the only Talend snag so far; otherwise good stuff).
Shouldn't there be a plugin that loops per row slice (e.g. 10K)? And since this is just read-in data, why is it counted as part of the method size? Splitting one or two files into multiple files (though still an issue) may be feasible; splitting 10+ files into multiple files is counterproductive.
Seventeen Stars

Re: [resolved] The code of method exceeding the 65535 bytes limit

450k rows and 15 columns does not sound like a design that is too large for a Java method!
450k columns, yes, or 1000 columns, OK, but 15?
If your problem really is caused by the number of columns (which is hard to believe), read on:
One of the problems is that the tFileInputDelimited component does not use inner classes to reduce the bytecode footprint of the method.
Alternatively, you could use the tFileInputTextFlat component (Talend Exchange).
http://www.talendforge.org/exchange/index.php?eid=745&product=tos&action=view&nav=1,1,1
This component uses inner classes to keep the bytecode size as small as possible. Another advantage is that you do not have to read all fields (unlike with tFileInputDelimited), because the parser can skip unwanted fields without mapping them to schema columns. If you read a line with many fields only to use a few of them near the end, this is also a good choice.
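To illustrate why this matters: the JVM caps the bytecode of a single method at 65535 bytes, but places no comparable limit on the number of methods or classes, so generated code that inlines every column's parsing into one huge method hits the cap long before code split into small helpers does. Below is a minimal, hypothetical sketch of the helper-per-column idea (this is not Talend's generated code; RowParser and FieldParser are invented names):

```java
// Instead of one giant process() method with inline parsing code for
// every column (which all counts against the 64 KB per-method limit),
// each column's parser is a tiny separate piece of code. No single
// method grows with the number of columns.
public class RowParser {

    interface FieldParser { Object parse(String raw); }

    // One small lambda per column instead of inline code in one method.
    static final FieldParser[] PARSERS = {
        raw -> raw,                      // col 0: string, kept as-is
        raw -> Integer.valueOf(raw),     // col 1: int
        raw -> Double.valueOf(raw),      // col 2: double
    };

    public static Object[] parseRow(String line) {
        String[] fields = line.split("\t", -1);   // keep trailing empties
        Object[] out = new Object[PARSERS.length];
        for (int i = 0; i < PARSERS.length; i++) {
            out[i] = PARSERS[i].parse(fields[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        Object[] row = parseRow("abc\t42\t3.5");
        System.out.println(java.util.Arrays.toString(row)); // prints [abc, 42, 3.5]
    }
}
```

With this shape, adding more columns adds more small parsers rather than growing one method, which is the same effect the inner-class approach described above achieves.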
Six Stars

Re: [resolved] The code of method exceeding the 65535 bytes limit

@jlolling

We cannot find the tFileInputTextFlat component in the Talend Open Source Big Data Exchange.

Seventeen Stars

Re: [resolved] The code of method exceeding the 65535 bytes limit

Hi, the component is available in Talend Exchange:

https://exchange.talend.com/#marketplaceproductoverview:marketplace=marketplace%252F1&p=marketplace%...

Best regards

Jan Lolling