tHiveLoad: set the partition to the system date

One Star

tHiveLoad: set the partition to the system date

Hi everyone:
    I am using tHiveLoad to load data into a Hive table. My table is defined as follows:
    hive> show create table jigou;
OK
CREATE TABLE `jigou`(
  `id` string, 
  `column1` string, 
  `column2` string, 
  `column3` string, 
  `column4` string, 
  `column5` string, 
  `column6` string, 
  `column7` string, 
  `column8` string, 
  `column9` string, 
  `column10` string, 
  `column11` string, 
  `column12` string)
PARTITIONED BY ( 
  `dt` string)
ROW FORMAT DELIMITED 
  FIELDS TERMINATED BY '\u0001' 
STORED AS INPUTFORMAT 
  'org.apache.hadoop.mapred.TextInputFormat' 
OUTPUTFORMAT 
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://h1:8020/apps/hive/warehouse/zhangchao.db/jigou'
TBLPROPERTIES (
  'transient_lastDdlTime'='1435650326')
Time taken: 0.474 seconds, Fetched: 26 row(s)
My Talend workflow is as follows. I want to make the dt partition dynamic, so that each time the job is scheduled it reads the system date automatically.

Any help would be appreciated.
Five Stars

Re: tHiveLoad: set the partition to the system date

Change it to "dt='2015-06-30'".
The above should work for you. I tested this with a Sqoop load, but you can test it with tHiveLoad; as far as I know, the partition value must be a string enclosed in single quotes.
For a dynamic date, you can use the expression below:
"dt='" + TalendDate.getDate("yyyy-MM-dd") + "'"
One Star

Re: tHiveLoad: set the partition to the system date

Thanks! I have used context variables to achieve my goal.
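
For anyone landing on this thread later, a rough sketch of that context-variable approach (the variable name context.dt and the tJava step are illustrative assumptions, not necessarily the exact setup used): declare a String context variable, set it in a tJava component that runs before tHiveLoad, and reference it in the Partition field.

// tJava (runs before tHiveLoad): store today's date in a context variable.
// Assumes a String context variable named "dt" is declared in the job.
context.dt = new java.text.SimpleDateFormat("yyyy-MM-dd").format(new java.util.Date());

// The tHiveLoad Partition field would then be set to:
//   "dt='" + context.dt + "'"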
