We have a cron job on our server that runs a Talend job which downloads a document from our FTP server and processes it. It worked for years, but we have now migrated to AWS (Amazon Web Services) and are having trouble connecting to the FTP server. So the problem is not in the code but on the server side. We don't have much information on why it fails, but our log says:
Exception in component tFTPGet_2
2: No such file
    at com.jcraft.jsch.ChannelSftp.throwStatusError(ChannelSftp.java:2297)
    at com.jcraft.jsch.ChannelSftp._realpath(ChannelSftp.java:1831)
    at com.jcraft.jsch.ChannelSftp.cd(ChannelSftp.java:268)
    at datatransaction.datatransaction_0_1.DataTransaction.tFTPGet_2Process(DataTransaction.java:6664)
    at datatransaction.datatransaction_0_1.DataTransaction.tMysqlInput_1Process(DataTransaction.java:4174)
    at datatransaction.datatransaction_0_1.DataTransaction.tJavaFlex_3Process(DataTransaction.java:3666)
    at datatransaction.datatransaction_0_1.DataTransaction.tMysqlConnection_1Process(DataTransaction.java:3576)
    at datatransaction.datatransaction_0_1.DataTransaction.tFileInputXML_1Process(DataTransaction.java:3447)
    at datatransaction.datatransaction_0_1.DataTransaction$17.run(DataTransaction.java:19543)
Build step 'Execute shell' marked build as failure
Finished: FAILURE
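The trace shows the job dying inside com.jcraft.jsch.ChannelSftp.cd(), i.e. the SFTP server rejects the remote directory itself before any file transfer starts, which points at the path as seen by this user (a different home directory or a chrooted SFTP account on the new server) rather than at cron or file permissions. For reference, the failing call can be reproduced outside Talend with a minimal program using the same JSch library; host, user, password, and remotePath below are placeholders to replace with the values configured in tFTPGet_2:

import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;
import com.jcraft.jsch.SftpException;

public class SftpPathCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- replace with the values from tFTPGet_2.
        String host = "ftp.example.com";
        String user = "ftpuser";
        String password = "secret";
        String remotePath = "/data";   // the directory the job tries to cd into

        JSch jsch = new JSch();
        Session session = jsch.getSession(user, host, 22);
        session.setPassword(password);
        // One-off diagnostic only: skip host-key verification.
        session.setConfig("StrictHostKeyChecking", "no");
        session.connect(10000);

        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect(10000);
        try {
            // Where does the server put us after login? A chrooted account or a
            // different home directory can invalidate a path that used to work.
            System.out.println("login directory: " + sftp.pwd());
            sftp.cd(remotePath);   // the call that throws "No such file" in the job
            System.out.println("cd OK, contents of " + remotePath + ":");
            for (Object entry : sftp.ls(".")) {
                System.out.println("  " + entry);
            }
        } catch (SftpException e) {
            System.out.println("SFTP error " + e.id + ": " + e.getMessage());
        } finally {
            sftp.disconnect();
            session.disconnect();
        }
    }
}

If pwd() prints a different directory than it did on the old server, the usual fix is to adjust the remote path in tFTPGet_2 (make it absolute, or relative to the new login directory) rather than to change anything in AWS networking.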
What could the problem be? The files are in the path we specify, and the user is correctly created and has permissions. It is not a cron problem either: when we run the job manually over SSH, the result is the same.
We have checked the log from the old server, and its output is identical except for the IP address. On the old server, the step that now fails logged:
file [price_2018.xml] downloaded successfully. 1 files have been downloaded.
The file is in the same place, and no code or folders have been changed.
Any ideas about what the problem could be, or how we can get more detailed feedback from the job?
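On the feedback question: JSch (the library the stack trace shows tFTPGet using) accepts a global logger, so one way to get verbose protocol output is to register one before the component runs, for example in a tJava placed ahead of tFTPGet_2 in the job. A sketch, using fully qualified names so nothing extra needs to be imported:

// Log every JSch protocol step (key exchange, authentication, channel setup)
// to the job console; remove once the job works again.
com.jcraft.jsch.JSch.setLogger(new com.jcraft.jsch.Logger() {
    public boolean isEnabled(int level) {
        return true;   // log everything, DEBUG through FATAL
    }
    public void log(int level, String message) {
        System.err.println("JSch[" + level + "] " + message);
    }
});

With this in place, the console should show exactly how far the session gets (connection, authentication, path resolution) before the "No such file" error, which narrows down whether it is a network, account, or path problem.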