I have created a very simple job to reproduce the scenario.
tFileInputExcel --> tLogRow
1.) tFileInputExcel: The input file has 1000 rows. I am trying to read the xlsx file with Event mode to reduce memory usage, and I tried to limit the rows read from the file to 50.
When I run the job, it does not limit the output to 50 rows. Instead it reads all rows.
This is a very simple job.
The original Job (which I am working on) reads an input file containing multiple spreadsheets, in User mode, across multiple jobs. The file is about 5 MB, and the job throws an out-of-memory error because of the file size.
I thought I would use Event mode instead, but Event mode does not work with the Limit clause, as described in my example job above.
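To illustrate the behavior being reported: in a streaming (Event-mode) reader, the Limit setting should stop consumption after N rows rather than reading the whole file. Below is a minimal Python sketch of those expected semantics, using a stand-in row generator instead of a real xlsx parser (the function name and parameters are hypothetical, not Talend's API):

```python
def read_limited(row_stream, limit=None):
    """Yield at most `limit` rows from a streaming row source.

    `row_stream` stands in for an event-mode xlsx parser; the point is
    that the limit check happens while streaming, so rows past the
    limit are never pulled into memory.
    """
    count = 0
    for row in row_stream:
        if limit is not None and count >= limit:
            break
        yield row
        count += 1

# 1000-row source, limit of 50: only 50 rows should come out.
rows = list(read_limited(range(1000), limit=50))
print(len(rows))  # 50, not 1000
```

This is the behavior the bug report says is missing: with Event mode enabled, the component behaves as if `limit` were `None`.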
I am assuming you are referring to the generation mode when you talk about User mode or Event mode. Regardless of what this mode is set to, the Limit clause should work.
Please find attached a very simple job.
Build id: V5.6.1_20141207_1530
I have also found one more issue:
I have installed Talend 6.4.0 and tried to run the job in event mode.
Now it works only if you provide the sheet name in the Sheet list. But if you provide the sheet position (e.g. 0), it fails.
My job needs to read the Excel sheets by position.
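As a possible workaround until positions are accepted directly, one could resolve a 0-based sheet position to its name first and pass the name to the Sheet list. A hypothetical Python sketch (the sheet names and helper are made up for illustration):

```python
def resolve_sheet(sheet_names, ref):
    """Accept either a sheet name or a 0-based position.

    `sheet_names` is the workbook's sheet list in order; `ref` may be
    an int, a digit string like "0", or an actual sheet name.
    """
    text = str(ref)
    if text.isdigit():                 # looks like a position, e.g. 0
        return sheet_names[int(text)]
    return text                        # already a name

names = ["Sheet1", "Data", "Summary"]
print(resolve_sheet(names, 0))        # Sheet1
print(resolve_sheet(names, "2"))      # Summary
print(resolve_sheet(names, "Data"))   # Data
```

The idea is simply to do the position-to-name lookup outside the component, so the component only ever sees a name.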
It works if you read all lines. If you set the Limit to only 10 rows, it does not.
It is clearly not working in my environment with the job I have provided.
Also, can you please check by putting 0 in the Sheet list under the Basic settings of tFileInputExcel?
It throws an error.
I have seen that you have raised the bug in the 6.4.0 beta.
Now, how can I find out which release is the stable one and where I can download it? It is a bit confusing to find on the Talend site.
Can you please help with that?