I have a requirement where I need to use the Parquet components, but I can see that Parquet can only be used in Big Data Batch jobs. Is there any way I can use these components in standard jobs?
Or are there any other components that can replace the Parquet components?
You can create a Hive table with the Parquet SerDe on the required location and insert the data from the Hive table built on the previous file.
In other words, load from a Hive tmp table (any file format) into a Hive table (Parquet file format). That is one approach I can think of.
Can you explain your use case?
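The workaround above can be sketched in HiveQL; the table names, column names, and location below are hypothetical, so adjust them to your schema:

```sql
-- Staging table over the existing file (e.g. delimited text); names are illustrative.
CREATE TABLE staging_orders (
  order_id INT,
  customer STRING,
  amount   DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

-- Target table stored as Parquet at the required location.
CREATE TABLE orders_parquet (
  order_id INT,
  customer STRING,
  amount   DOUBLE
)
STORED AS PARQUET
LOCATION '/warehouse/orders_parquet';

-- Copy the data; Hive rewrites it in Parquet format on insert.
INSERT OVERWRITE TABLE orders_parquet
SELECT order_id, customer, amount FROM staging_orders;
```

In a standard Talend job, statements like these could be driven through Hive components such as tHiveConnection and tHiveRow, which do not require a Big Data Batch job.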
@lmit, the Parquet components are only available if you create a Big Data Batch job in the licensed Talend Big Data product.
Please feel free to vote for this feature request in Jira: https://jira.talendforge.org/browse/TBD-4349