Hi, I need help loading TSV files into Snowflake. The files are stored in S3.
As of Talend 7.1.1, there is a component (tSnowflakeBulkExec) that can load data directly from S3 into Snowflake. This component mimics Snowflake's COPY command, so no on-premise files are involved.
For Storage, select S3 and choose the Amazon S3 bucket to load data from, then fill in the remaining properties for your configuration: Region, Access Key, Secret Key, Bucket, and Folder.
If you are on an earlier version, you can use a tJDBCRow component to connect to Snowflake via JDBC and execute a COPY command to load the data into a Snowflake table. This leverages Snowflake's bulk-loading capability without pulling the data through Talend.
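For reference, the COPY command executed through tJDBCRow would look roughly like this. This is only a sketch: the table name, bucket path, and credentials are placeholders you would replace with your own, and the FILE_FORMAT options assume tab-delimited files with a header row.

```sql
-- Hypothetical example: table, bucket, and keys are placeholders.
COPY INTO my_table
FROM 's3://my-bucket/my-folder/'
CREDENTIALS = (AWS_KEY_ID = '<access_key>' AWS_SECRET_KEY = '<secret_key>')
FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = '\t' SKIP_HEADER = 1);
```

Since the files are TSV, the key setting is FIELD_DELIMITER = '\t'; Snowflake's CSV file format type handles any single-character delimiter.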
Alternatively, first fetch the file from S3 using tS3Get. Once the file is local, read it with tFileInputDelimited (set the field separator to tab for TSV) and write to Snowflake with tDBOutput (Snowflake).
If the file is large, I would advise using the Snowflake bulk components instead.
The approach shown in the previous post is better; I didn't see that reply while I was typing mine.