One Star

tFileExcelWorkbookOpen

I want to load data from a table into two sheets of the same Excel file, using a template file. How can I do this with the components below?
tFileExcelWorkbookOpen
tFileExcelSheetOutput
tFileExcelWorkbookSave

Thanks.
9 REPLIES
Seventeen Stars

Re: tFileExcelWorkbookOpen

First, open your template Excel file with tFileExcelWorkbookOpen (by the way, "template" here means a normal Excel file, not a template in the MS Office sense).
Go with OnSubjobOk to the next sub job:
DBInput_1 ---- (tMap_1) ----- tFileExcelSheetOutput_1 (use your first sheet name here)
Go with OnSubjobOk to the next sub job:
DBInput_2 ---- (tMap_2) ----- tFileExcelSheetOutput_2 (use your second sheet name here)
Go with OnSubjobOk to tFileExcelWorkbookSave and set the target file there to save your Excel file under a new name (probably adding a timestamp etc. to the name).
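To make the three steps concrete: the tFileExcel* components are built on Apache POI, and a minimal POI sketch of what the sub jobs do under the hood looks roughly like the following. File names, sheet names, and cell values are invented for illustration; the real job gets its rows from the database flow.

```java
import java.io.File;
import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class TwoSheetExample {
    public static void main(String[] args) throws Exception {
        // tFileExcelWorkbookOpen: open the template (a normal .xlsx file).
        // In the real job the template exists; here we fall back to a new
        // workbook so the sketch runs standalone.
        File template = new File("template.xlsx");
        Workbook wb = template.exists() ? WorkbookFactory.create(template) : new XSSFWorkbook();

        // tFileExcelSheetOutput_1: write rows into the first sheet
        Sheet s1 = wb.getSheet("Sheet1") != null ? wb.getSheet("Sheet1") : wb.createSheet("Sheet1");
        s1.createRow(0).createCell(0).setCellValue("data for sheet 1");

        // tFileExcelSheetOutput_2: write rows into the second sheet
        Sheet s2 = wb.getSheet("Sheet2") != null ? wb.getSheet("Sheet2") : wb.createSheet("Sheet2");
        s2.createRow(0).createCell(0).setCellValue("data for sheet 2");

        // tFileExcelWorkbookSave: save under a new target name
        // (in practice you would build a timestamped name here)
        try (FileOutputStream out = new FileOutputStream("report_out.xlsx")) {
            wb.write(out);
        }
        wb.close();
    }
}
```

Note that the workbook is opened once, both sheets are written against the same open workbook object, and the file is saved once at the end — which is exactly why the open/output/save steps live in separate sub jobs connected by OnSubjobOk.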
Is this post helpful for you? If you have more questions or it is not clear enough, don't hesitate to contact me.
One Star

Re: tFileExcelWorkbookOpen

Hi jlolling,
Thank you so much for your reply. Now I'm able to get the output as per my requirement. Thanks again.
One Star

Re: tFileExcelWorkbookOpen

Hi All,
I want to create dynamic sheets in a single Excel file with the help of the tFlowToIterate component.
I'm using the tFileExcelWorkbookOpen, tFileExcelSheetOutput, and tFileExcelWorkbookSave
components in my job.
The first sheet is created successfully, but the job throws the error "workbook cannot be null" when the iteration runs for the second sheet.
Below are the screenshots. Kindly help me solve this.

Thanks,
Satya
One Star

Re: tFileExcelWorkbookOpen

Hi Lolling,
I'm getting data with a tGoogleAnalyticsInput instead of a DBInput. Do I have to create two tGoogleAnalyticsInput components in order to create two sheets in one Excel file with this method?
If yes, I can't seem to go with OnSubjobOk from tFileExcelSheetOutput_1 to the next sub job (which would start with tGoogleAnalyticsInput in this case).
Thank you!
Seventeen Stars

Re: tFileExcelWorkbookOpen

@csc.satya: please provide a screenshot of your job. Creating new sheets in an iteration is a normal use case and should work for you as well.
@MedAchraf: I guess you want to store your Google Analytics data in different sheets for different profiles or web properties. I suggest you read the profiles of interest from a file or from a database table, turn that flow into an iteration (with one profile per data set), and call the tGoogleAnalyticsInput component for each one:
tFileExcelWorkbookOpen --OnSubjobOk--> tFileInputDelimited (list of profiles) --main--> tFlowToIterate --iterate--> tGoogleAnalyticsInput --> tFileExcelSheetOutput
From tFileInputDelimited --OnSubjobOk--> tFileExcelWorkbookSave (saving the Excel file).
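The key point of this iteration design (and the likely cause of the "workbook cannot be null" error reported above) is that the workbook must be opened once, stay open across all iterations, and be saved once at the end — not reopened or saved inside the loop. A hypothetical Apache POI sketch of that shape, with invented profile names standing in for the delimited file and the Google Analytics fetch:

```java
import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class SheetPerProfile {
    public static void main(String[] args) throws Exception {
        // Opened once, like tFileExcelWorkbookOpen at the start of the job
        Workbook wb = new XSSFWorkbook();

        // tFlowToIterate: one iteration per profile read from the list
        String[] profiles = {"profile_a", "profile_b", "profile_c"}; // invented names
        for (String profile : profiles) {
            // In the real job, tGoogleAnalyticsInput fetches this profile's rows here
            Sheet sheet = wb.createSheet(profile); // one new sheet per iteration
            sheet.createRow(0).createCell(0).setCellValue("rows for " + profile);
        }

        // Saved once at the end, like tFileExcelWorkbookSave
        try (FileOutputStream out = new FileOutputStream("profiles.xlsx")) {
            wb.write(out);
        }
        wb.close();
    }
}
```

If the save (or a second open) happened inside the loop, the workbook reference would be gone for the next iteration — which matches the symptom of the first sheet succeeding and the second failing.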
I hope this makes it a bit clearer what I mean.
One Star

Re: tFileExcelWorkbookOpen

I see, I'm going to try that. However, I already have tSetProxy --OnSubjobOk--> tGoogleAnalyticsInput,
so I can't iterate the tGoogleAnalyticsInput component at the same time.
Seventeen Stars

Re: tFileExcelWorkbookOpen

Of course you can. You do not need the tSetProxy component for every iteration. You can do the proxy setup only once, at the very start of your job. It is a setting for the JVM, and it does not make sense to do it twice.
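The reason one call is enough: setting a proxy this way just writes JVM-wide system properties, so every HTTP connection opened later in the same job run (including the Google Analytics calls) picks them up automatically. A minimal sketch, with a placeholder host and port:

```java
public class ProxyOnce {
    public static void main(String[] args) {
        // JVM-wide proxy settings: every subsequent HTTP(S) connection in
        // this JVM will route through the proxy. Host/port are placeholders.
        System.setProperty("http.proxyHost", "proxy.example.com");
        System.setProperty("http.proxyPort", "8080");
        System.setProperty("https.proxyHost", "proxy.example.com");
        System.setProperty("https.proxyPort", "8080");

        // Any component running after this point sees the same settings,
        // which is why setting them once at job start is sufficient.
        System.out.println(System.getProperty("http.proxyHost"));
    }
}
```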
One Star

Re: tFileExcelWorkbookOpen

@jlolling hi,
I am facing an issue loading data into an Excel file. My job is simple; please find a screenshot below.


I am using the tHash components because there is a huge amount of data to be loaded into the Excel file, some 650K+ records.
I am running this job on a QA server, remotely from Studio; find a screenshot of that below as well.


The target execution is highlighted in yellow. Even if I run my job on the server, I get the error below.


I am unsure how to load such big data into a file. Is there any other component for this?
Please help.
Nine Stars

Re: tFileExcelWorkbookOpen

Why are you using tHashOutput and tHashInput?
Why not just go from tJDBCInput straight to tFileOutputExcel?
You could allocate more RAM to Talend, or you need to limit the number of rows being processed at a single time.
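Besides raising the JVM heap, another option for row counts like 650K+ is a streaming writer: Apache POI's SXSSFWorkbook keeps only a sliding window of rows in memory and flushes earlier rows to a temporary file, so memory use stays flat regardless of row count. A hedged sketch (row count reduced for illustration; file and sheet names invented):

```java
import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

public class StreamingExcelWrite {
    public static void main(String[] args) throws Exception {
        // Keep at most 100 rows in memory; older rows are flushed to disk
        SXSSFWorkbook wb = new SXSSFWorkbook(100);
        Sheet sheet = wb.createSheet("data");

        for (int i = 0; i < 10_000; i++) { // 650K+ rows work the same way
            Row row = sheet.createRow(i);
            row.createCell(0).setCellValue(i);
            row.createCell(1).setCellValue("record " + i);
        }

        try (FileOutputStream out = new FileOutputStream("big.xlsx")) {
            wb.write(out);
        }
        wb.dispose(); // delete the temporary backing files
        wb.close();
    }
}
```

As a sanity check on the target format: an .xlsx sheet holds up to 1,048,576 rows, so 650K records do fit on a single sheet; the limitation being hit here is JVM memory, not the file format.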