One Star

How to load all the tables into CSV files dynamically?

Hi Experts,
I am using TOS 6.1 and MSSQL 2008 R2.
I would like to implement the following scenario:
Load all the tables from MSSQL into CSV file(s) dynamically.
I read the table names one by one from a .txt file and prepare a query like "Select * from " + context.CurrentTableName.
If I have 100 tables, I will generate 100 queries dynamically and fetch the data of all 100 tables into CSV file(s).
I have tried this, but with no luck.
Could anyone help me with how to load the table data without the schema being specified?
Note: I don't have the option to select the Dynamic type.
Thank you.
8 REPLIES
Moderator

Re: How to load all the tables into CSV files dynamically?

Hi,
There is a dynamic schema feature, which is only available in the Talend subscription solutions. It allows you to design a schema with an unknown column structure (unknown names and number of columns).
Here is a related KB article about it: TalendHelpCenter: How to process changing data structure?

Talend provides community users with a free trial of Talend Data Integration.
You can download it freely from:https://info.talend.com/prodevaltedi.html?_ga=1.40458978.1751069238.1426242863
Best regards

Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Fifteen Stars

Re: How to load all the tables into CSV files dynamically?

You can do this without the dynamic schema so long as you know the max number of columns you will have to work with. You can use the query......
"Select * from " + context.tablename

.... and configure your db component with 100 String columns (for example, assuming no more than 100 columns). This will mean that all of your columns are of type String, but then that doesn't matter if you are dumping the data to a CSV.
Rilhia Solutions
One Star

Re: How to load all the tables into CSV files dynamically?

Hi,
Thanks for the quick reply.
Actually, I am looking for a solution without using the Dynamic type.
How can I achieve it?
Fifteen Stars

Re: How to load all the tables into CSV files dynamically?

Hi Thiru.gdm,
The solution you mention in your first post, combined with what I suggested, should work. The only difficulties will be ensuring you have enough columns (I am assuming you can find the number of columns in your widest table) and getting the column names. For the column names (to use as headers in your CSV files) you will have to query your database separately. Something like .....
SELECT COLUMN_NAME,* 
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'YourTableName' AND TABLE_SCHEMA='YourSchemaName'

But other than that, it should be pretty straightforward.
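If you then want those column names as a single header line, something like this in a tJavaRow could work (just a sketch; the globalMap key "headerLine" is my own choice, and COLUMN_NAME is the column returned by the query above):

// Concatenate the incoming column names into one comma-separated header line.
// "headerLine" is an arbitrary globalMap key; COLUMN_NAME comes from the query above.
String header = (String) globalMap.get("headerLine");
if (header == null || header.isEmpty()) {
    header = input_row.COLUMN_NAME;
} else {
    header = header + "," + input_row.COLUMN_NAME;
}
globalMap.put("headerLine", header);

// Pass the row through unchanged.
output_row.COLUMN_NAME = input_row.COLUMN_NAME;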
Rilhia Solutions
One Star

Re: How to load all the tables into CSV files dynamically?

Hi,
I have used the same query to get the headers. How can we store that result in the output file?
Thanks in advance.
Fifteen Stars

Re: How to load all the tables into CSV files dynamically?

You can use a tUnite component so long as the schema of your header data matches that of the table data. Basically, if you are using the idea of 100 columns for your table data, the header data must also have a schema with 100 columns, and the columns must be named the same.
Rilhia Solutions
One Star

Re: How to load all the tables into CSV files dynamically?

Hi,
Your idea worked for me. I took 50 columns and trimmed the unwanted columns based on the column count.
If anyone needs the same kind of scenario, please follow the steps below.
You need to create 3 context variables: Current_TableName, Column_Count, Column_List.
1. Get the list of all table names (tMSSqlInput: SELECT dbo.sysobjects.ID, dbo.sysobjects.Name FROM dbo.sysobjects WHERE xtype='U').
2. Pass each table name to a child job (context.Current_TableName).
3. Receive the table name and get the column list (tMSSqlTableList --> tMSSqlColumnList --> tFixedFlowInput).
4. Capture the list of columns and the column count coming from tFixedFlowInput (see the sketch after these steps).
output_row.ColumnName = input_row.ColumnName;
5. Write the headers to a file (build the file name dynamically, e.g. context.Current_TableName + ".csv").
6. Get the table data using tMSSqlInput (define a generic schema like Column1, Column2, Column3, etc. up to 50 columns, because we don't know the exact columns since the table name is dynamic).
7. Append the data to the same file (context.Current_TableName + ".csv").
At this point the file data looks like the following:
ID,Name,City,email
1,nani,MN,testTalend.gdm@gmail.com,,,,,,,,,,,,,,,,,,,,,,,,,,
(the data rows carry 50 comma-separated fields because the schema has 50 columns, so we have to remove the extra commas)
8. Using tFileInputFullRow, read each line and pass it to tJavaRow.
9. Trim the unwanted columns using the Column_Count context variable:
// Keep only the first Column_Count fields and drop the empty trailing ones.
String val = input_row.line;
String[] parts = val.split(",", -1); // -1 keeps empty fields so positions stay aligned
StringBuilder outputValue = new StringBuilder();
for (int i = 0; i < parts.length; i++) {
    if (i < context.Column_Count) {
        if (i > 0) {
            outputValue.append(",");
        }
        outputValue.append(parts[i]);
    }
}
output_row.line = outputValue.toString();
The output looks like:
ID,Name,City,email
1,nani,MN,testTalend.gdm@gmail.com
Likewise, it will create files with all the tables' data dynamically.
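For step 4, the tJavaRow code could look something like this (just a sketch; it assumes Column_List is defined as a String context variable and Column_Count as an Integer one):

// Sketch for step 4: pass the column name through and accumulate the list and count
// in the context variables (types assumed: Column_List String, Column_Count Integer).
output_row.ColumnName = input_row.ColumnName;
if (context.Column_List == null || context.Column_List.isEmpty()) {
    context.Column_List = input_row.ColumnName;
    context.Column_Count = 1;
} else {
    context.Column_List = context.Column_List + "," + input_row.ColumnName;
    context.Column_Count = context.Column_Count + 1;
}

One thing to watch: the two variables need resetting at the start of each child job run, otherwise the column lists of successive tables get appended together.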
Thanks,
Fifteen Stars

Re: How to load all the tables into CSV files dynamically?

Glad you got it working, Thiru.gdm.
Rilhia Solutions