I am extracting data from Snowflake; there are 150K rows to be extracted and loaded into an Oracle DB, but the job takes nearly 3 hours to complete. Can anybody suggest an optimized solution to improve the speed?
Thanks in advance.
Are you facing the issue while extracting data from Snowflake or while writing data to Oracle?
Could you write the data to a file instead of the Oracle DB and check the speed? That will tell you which side is the bottleneck.
The native Snowflake components don't provide many options for tuning read and write behavior. You could try the generic JDBC components and measure the performance.
tJDBCInput gives you the option to set the cursor (fetch) size, and the tJDBCRow component can be used to run specific SQL statements against a JDBC Snowflake connection.
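To illustrate what the cursor size and batch size settings control, here is a minimal Python sketch of the read-in-chunks / write-in-batches pattern. It uses sqlite3 purely as a stand-in for the real Snowflake and Oracle connections (the table name, column names, and row counts below are made up for the example); in Talend the same knobs live on tJDBCInput (cursor size) and the output component (batch size / commit interval).

```python
import sqlite3

# Stand-ins for the real Snowflake (source) and Oracle (target) connections.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

# Fake source data (150 rows standing in for the 150K in the question).
src.execute("CREATE TABLE t (id INTEGER, val TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)",
                [(i, f"row{i}") for i in range(150)])
src.commit()
dst.execute("CREATE TABLE t (id INTEGER, val TEXT)")

BATCH = 50  # tune this, like the cursor/batch size in the JDBC components
cur = src.execute("SELECT id, val FROM t")
while True:
    rows = cur.fetchmany(BATCH)       # chunked read ~ cursor (fetch) size
    if not rows:
        break
    # Batched write with a single commit at the end, instead of
    # one round-trip and one commit per row.
    dst.executemany("INSERT INTO t VALUES (?, ?)", rows)
dst.commit()

print(dst.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 150
```

Row-by-row inserts with per-row commits are the usual reason a 150K-row load takes hours; batching both sides is typically the first thing to try.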
Or, if you are on a higher version (7.x) of Talend, you could use the bulk-load feature via tSnowflakeOutputBulk, which writes a file of data to internal Snowflake storage or to other storage, including S3 and Azure.
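For reference, the bulk path on the Snowflake side comes down to a `COPY INTO <stage>` unload, which moves data in compressed files rather than row by row. A sketch, assuming a hypothetical table `MY_TABLE` and internal stage `@MY_STAGE` (both names are placeholders, not from the thread):

```sql
-- Unload MY_TABLE to gzipped CSV files on an internal stage.
COPY INTO @MY_STAGE/extract/
FROM MY_TABLE
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
OVERWRITE = TRUE;
```

The staged files can then be fetched and bulk-loaded into Oracle (e.g. with SQL*Loader), which is usually far faster than streaming rows through individual inserts.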