I have a requirement where I need to use the Parquet components, but I can see that Parquet is only available in Big Data Batch jobs. Is there any way to use these components in standard jobs?
Or are there any other components that can replace the Parquet components?
You can create a Hive table with the Parquet SerDE at the required location and insert data into it from the Hive table built on the previous file.
Loading from a Hive tmp table (any file format) into a Hive table (Parquet file format) is one approach I can think of.
Can you explain your use case?
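The tmp-table-to-Parquet-table step above can be sketched in HiveQL as follows; the table and column names are hypothetical, and the statements assume the staging table has already been loaded (e.g. from a standard job, which could run them through a component such as tHiveRow):

```sql
-- Minimal sketch, assuming a staging table `tmp_events` (any file format)
-- already exists and is loaded. Names and location are illustrative only.
CREATE TABLE IF NOT EXISTS events_parquet (
  event_id   BIGINT,
  event_name STRING,
  event_ts   TIMESTAMP
)
STORED AS PARQUET
LOCATION '/warehouse/events_parquet';

-- Copy the staging data into the Parquet-backed table;
-- Hive rewrites the rows in Parquet format on insert.
INSERT OVERWRITE TABLE events_parquet
SELECT event_id, event_name, event_ts
FROM tmp_events;
```

This way the standard job only deals with Hive SQL, and Hive itself handles the Parquet serialization.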
@lmit, the Parquet components are only available if you create a Big Data Batch job in the licensed Talend Big Data product.
Please feel free to vote for this new feature in the Jira issue: https://jira.talendforge.org/browse/TBD-4349