We are currently using Talend Open Studio for Big Data in our POC. While the cluster was not yet available, we used tFileInputDelimited to read our input CSV files. In that component, we ticked the CSV options to enable the text enclosure and escape character, so that Talend would treat a column containing a comma as a single field.
But once the Hadoop cluster became available, we needed to read the files from HDFS, which we did using the tHDFSInput component. Unfortunately, tHDFSInput appears to have no CSV options, so a column that contains a comma gets split into two columns.
Is there any way to read a file from HDFS so that columns containing a comma are treated as one column?
The 'Text Enclosure' option is handled by the tFileInputDelimited component, not by tHDFSInput.
Here is a JIRA issue tracking this: https://jira.talendforge.org/browse/TDI-33068
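Until tHDFSInput gains CSV options, one possible workaround (a sketch, not an official Talend feature) is to read each HDFS line as a single raw column by setting the field separator to a character that never appears in the data, and then split the line yourself in a tJavaRow or a custom routine with quote-aware logic. The class and method names below are illustrative, not part of Talend:

```java
import java.util.ArrayList;
import java.util.List;

public class CsvLineParser {
    // Minimal RFC 4180-style splitter: commas inside double quotes are
    // kept as part of the field, and "" inside a quoted field is an
    // escaped quote. Hypothetical helper, not a Talend component.
    public static List<String> split(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder field = new StringBuilder();
        boolean inQuotes = false;
        for (int i = 0; i < line.length(); i++) {
            char c = line.charAt(i);
            if (inQuotes) {
                if (c == '"') {
                    if (i + 1 < line.length() && line.charAt(i + 1) == '"') {
                        field.append('"'); // escaped quote inside a field
                        i++;
                    } else {
                        inQuotes = false;  // closing quote
                    }
                } else {
                    field.append(c);       // comma here stays in the field
                }
            } else if (c == '"') {
                inQuotes = true;           // opening quote
            } else if (c == ',') {
                fields.add(field.toString());
                field.setLength(0);        // start next field
            } else {
                field.append(c);
            }
        }
        fields.add(field.toString());      // last field
        return fields;
    }

    public static void main(String[] args) {
        List<String> fields = CsvLineParser.split("1,\"Doe, John\",NY");
        System.out.println(fields.size());   // 3 fields, not 4
        System.out.println(fields.get(1));   // Doe, John
    }
}
```

In a Talend job this logic would go into a routine called from a tJavaRow, mapping the single raw input column to the individual output columns of your schema.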