Can anybody help me? While I am running a Talend job with the COPY command, it throws the error
"insufficient data left in message" during my COPY from S3 to Redshift.
It is not a Talend error message;
it is from Redshift (Postgres).
You can check what exactly happens by selecting data from stl_load_errors:
select d.query,
       substring(d.filename, 14, 20),
       d.line_number as line,
       substring(d.value, 1, 16) as value,
       substring(le.err_reason, 1, 48) as err_reason
from stl_loaderror_detail d, stl_load_errors le
where d.query = le.query
  and d.query = pg_last_copy_id();
Hey Vapukov, thank you for your reply.
However, I am able to run the same COPY command directly in Redshift and load the data from S3 successfully. It is only through Talend that the load fails, with the error: [Amazon](500310) Invalid operation: insufficient data left in message;
Could you suggest what might cause this?
Same == prepared by Talend, transferred to S3, and then imported by Redshift? Or is it "similar" data? (which you trust looks the same ... but we don't trust anybody)
Did you run the command from the links? What did it show?
I used the same data. The COPY command runs successfully in Redshift directly, but when I try to load the same data through Talend, it somehow throws the error in Talend.
The source file delimiter is '~'.
Also, please find below the COPY options I used for the Redshift load:
ESCAPE ACCEPTANYDATE ACCEPTINVCHARS EMPTYASNULL FILLRECORD IGNOREBLANKLINES IGNOREHEADER 1 TRIMBLANKS DATEFORMAT 'auto' NULL AS '\0' DELIMITER '~'
Please help me run this job through Talend.
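For reference, those options would assemble into a full COPY statement along these lines. The table name, bucket path, and IAM role below are placeholders for illustration, not values from the original post:

```sql
-- Sketch of the full COPY statement using the options listed above;
-- replace table, S3 path, and credentials with your own values.
COPY my_schema.my_table
FROM 's3://my-bucket/path/to/file.txt'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
ESCAPE
ACCEPTANYDATE
ACCEPTINVCHARS
EMPTYASNULL
FILLRECORD
IGNOREBLANKLINES
IGNOREHEADER 1
TRIMBLANKS
DATEFORMAT 'auto'
NULL AS '\0'
DELIMITER '~';
```

If this statement succeeds when run directly but fails from Talend, it is worth comparing the exact COPY text Talend generates (visible in the component's generated code or the Redshift console's query history) against this one.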
May I ask, is this error specific to the Talend AWS Quickstart, or is it a generic error that you are encountering with Redshift and Talend? If the latter, please post the question to the Design and Development forum.
While I am loading data from an S3 bucket to Redshift using the COPY command, it throws the error: Extra column(s) found.
Could you please help me with this?
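An "Extra column(s) found" error usually means a file row has more delimited fields than the target table has columns, often because the delimiter character appears inside the data itself. One way to see the offending rows is to query stl_load_errors for the most recent failures; this is a generic diagnostic sketch, not taken from the original posts:

```sql
-- Show the most recent COPY failures, including the raw input line
-- and the reason Redshift rejected it.
select query,
       substring(filename, 1, 40) as filename,
       line_number,
       substring(raw_line, 1, 60) as raw_line,
       substring(err_reason, 1, 45) as err_reason
from stl_load_errors
order by starttime desc
limit 10;
```

The raw_line column shows the actual text Redshift tried to parse, which makes it easy to spot a stray delimiter or a column-count mismatch.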