I am extracting data from Snowflake — about 150K rows — and loading them into an Oracle DB, but the job takes nearly 3 hours to complete. Can anybody suggest an optimized solution to improve the speed?
Thanks in advance
Are you facing the issue while extracting data from Snowflake or while writing data to Oracle?
Could you try writing the data to a file instead of the Oracle DB and compare the speed? That will show you which side of the job is the bottleneck.
The native Snowflake components don't provide many options for tuning reads and writes. You could try the generic JDBC components instead and measure the performance:
tJDBCInput gives you the option to set a cursor (fetch) size, and tJDBCRow can be used to run specific SQL statements against a JDBC Snowflake connection.
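The gain from a cursor size comes from fetching rows in pages and inserting them in batches with a single commit, instead of one round trip and one commit per row. A minimal sketch of that pattern in plain Python, using sqlite3 in-memory databases as stand-ins for the Snowflake source and Oracle target (the table `rows_tbl` and its columns are made up for illustration):

```python
import sqlite3

def copy_in_batches(src_conn, dst_conn, batch_size=5000):
    # Read with a bounded cursor — analogous to tJDBCInput's cursor size.
    src = src_conn.cursor()
    src.arraysize = batch_size          # rows fetched per round trip
    src.execute("SELECT id, name FROM rows_tbl")

    dst = dst_conn.cursor()
    copied = 0
    while True:
        batch = src.fetchmany(batch_size)
        if not batch:
            break
        # Batched insert — analogous to a large batch size on the output component.
        dst.executemany("INSERT INTO rows_tbl (id, name) VALUES (?, ?)", batch)
        copied += len(batch)
    dst_conn.commit()                   # one commit at the end, not one per row
    return copied
```

With a real Oracle target the same idea applies: a batch size in the thousands and a single commit per batch usually cuts runtime dramatically compared to row-by-row inserts.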
Or, if you are on a higher version of Talend (7.x), you could use the bulk-load feature via tSnowflakeOutputBulk, which writes the data to a file in internal Snowflake storage or other storage, including S3 and Azure.