I have a requirement where I need to use the Parquet components, but I can see that Parquet can only be used in Big Data Batch jobs. Is there any way I can use these components in standard jobs?
Or are there any other components that can replace the Parquet components?
You can create a Hive table with the Parquet SerDe at the required location and insert data into it from the Hive table built on the previous file.
In other words: Hive tmp table (any file format) to Hive table (Parquet file format). This is one approach I can think of.
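The tmp-table-to-Parquet-table approach above could be sketched in HiveQL roughly like this (table names, columns, and the location path are hypothetical placeholders, not from the original post):

```sql
-- Hypothetical staging table holding the source file (e.g. delimited text)
CREATE TABLE sales_staging (
  id INT,
  amount DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

-- Target table using the Parquet SerDe, at the required location (path is a placeholder)
CREATE TABLE sales_parquet (
  id INT,
  amount DOUBLE
)
STORED AS PARQUET
LOCATION '/user/hive/warehouse/sales_parquet';

-- Copy the data; Hive writes it out as Parquet files
INSERT OVERWRITE TABLE sales_parquet
SELECT id, amount FROM sales_staging;
```

In a standard Talend job you could run statements like these through a Hive connection component (e.g. tHiveRow), so the Parquet conversion happens inside Hive rather than in a Big Data Batch job.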
Can you explain your use case ?
Please feel free to vote for this new feature Jira issue: https://jira.talendforge.org/browse/TBD-4349