tSnowflakeInput

Nine Stars

Hello all

 

I am extracting data from Snowflake: there are 150K rows to be extracted and loaded into an Oracle DB, but the job takes nearly 3 hours to complete. Can anybody suggest an optimized solution to improve the speed?

 

Thanks in advance,

Manish

Employee

Re: tSnowflakeInput

Hi Manish,

 

    Are you facing the issue while extracting data from Snowflake or while writing data to Oracle?

 

    Could you please write the data to a file instead of the Oracle DB and check the speed? That will show whether the bottleneck is on the Snowflake read side or the Oracle write side.

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time on your query. If your query is answered, please mark the topic as resolved.

Nine Stars

Re: tSnowflakeInput

The native Snowflake components don't provide many options for tuning read and write behavior. You could instead try the generic JDBC components and measure the performance:
tJDBCInput - this gives you the option to set the cursor (fetch) size. The tJDBCRow component can be used to run specific SQL statements against a JDBC Snowflake connection. A sketch of both ideas in plain JDBC follows below.
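For illustration, here is a minimal sketch of what that setup boils down to in plain JDBC, assuming the Snowflake and Oracle JDBC drivers are on the classpath; the URLs, credentials, table and column names are all placeholders, not values from this thread:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class SnowflakeToOracle {
    public static void main(String[] args) throws Exception {
        try (Connection sf = DriverManager.getConnection(
                 "jdbc:snowflake://<account>.snowflakecomputing.com/?db=MYDB&schema=PUBLIC",
                 "<sf_user>", "<sf_password>");
             Connection ora = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//<host>:1521/<service>",
                 "<ora_user>", "<ora_password>")) {

            ora.setAutoCommit(false);          // commit per batch, not per row

            Statement read = sf.createStatement();
            read.setFetchSize(10000);          // the "cursor size" knob on tJDBCInput
                                               // (whether it is honored depends on the driver)

            PreparedStatement write = ora.prepareStatement(
                "INSERT INTO TARGET_TABLE (ID, NAME) VALUES (?, ?)");

            int rows = 0;
            try (ResultSet rs = read.executeQuery("SELECT ID, NAME FROM SOURCE_TABLE")) {
                while (rs.next()) {
                    write.setLong(1, rs.getLong(1));
                    write.setString(2, rs.getString(2));
                    write.addBatch();
                    if (++rows % 10000 == 0) { // flush every 10K rows
                        write.executeBatch();
                        ora.commit();
                    }
                }
            }
            write.executeBatch();              // flush the remainder
            ora.commit();
        }
    }
}

With 150K rows, batched inserts with a periodic commit on the Oracle side are usually what move the needle; row-by-row inserts with autocommit are the most common cause of multi-hour loads at this size.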
Alternatively, if you are on a higher version of Talend (7.x), you could use the bulk-load feature via tSnowflakeOutputBulk, which writes a file with the data to internal Snowflake storage or to other storage such as S3 or Azure; a sketch of the staging SQL this relies on follows below.
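Under the hood, the bulk components stage data with Snowflake's COPY INTO command. As a hedged illustration (the stage path, table name and file format here are assumptions, not taken from this thread), the unload variant can also be issued yourself through a tJDBCRow-style statement, producing files you could then load into Oracle with a bulk tool such as SQL*Loader:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class SnowflakeUnload {
    public static void main(String[] args) throws Exception {
        try (Connection sf = DriverManager.getConnection(
                 "jdbc:snowflake://<account>.snowflakecomputing.com/",
                 "<sf_user>", "<sf_password>");
             Statement st = sf.createStatement()) {
            // Unload the source table as gzipped CSV into the user stage (@~).
            // The staged files can then be downloaded and bulk-loaded into
            // Oracle instead of being inserted row by row.
            st.execute(
                "COPY INTO @~/export/source_table_"
                + " FROM SOURCE_TABLE"
                + " FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)"
                + " OVERWRITE = TRUE");
        }
    }
}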
