Six Stars

Storing Data from Kafka Input in smaller files

Hi,

 

In my use case, I receive data from Kafka 24*7 in EBCDIC. I read the data as a byte array from tKafkaInput and parse it with tHMap. The tHMap output goes to tJavaRow as a byte array, and the tJavaRow output is stored via tHDFSOutput. An initial tJava component creates the file name with a date-time stamp.
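For context, the initial tJava step that builds the time-stamped file name might look roughly like this. This is a standalone sketch, not the actual job code: the path, the `fileName` key, and the hour-level bucket are my assumptions, and a plain `HashMap` stands in for Talend's `globalMap`:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.TimeZone;

public class FileNameDemo {
    // Stand-in for Talend's globalMap; in a real tJava you would call
    // globalMap.put("fileName", ...) directly.
    static Map<String, Object> globalMap = new HashMap<>();

    // Build an hour-bucketed HDFS file name for the given timestamp.
    static String bucketFileName(Date ts) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyyMMdd_HH");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC")); // deterministic naming
        return "/data/kafka/ebcdic_" + fmt.format(ts) + ".dat";
    }

    public static void main(String[] args) {
        globalMap.put("fileName", bucketFileName(new Date()));
        System.out.println(globalMap.get("fileName"));
    }
}
```

Because this runs once at job start, every record of a 24*7 job ends up in that single file.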

 

Here are my challenges

1) My data is being stored as a single file in HDFS. I need it split into smaller files, because I may need to look at the data from, say, morning to afternoon (or up to any point in time). With everything in one file, I am unable to fetch just that slice of data.

2) I tried updating the file name in the tJavaRow (that is why I inserted the component there), but the file name does not change.

 

I need some suggestions on how to get this done.

 

Below is a flow, recreated for understanding.

desktop.PNG


Tags: Big Data, Data Integration
4 REPLIES
Nine Stars

Re: Storing Data from Kafka Input in smaller files

Just a question: if the only reason for the split is access to data for a given time range, why not query over the HDFS files directly with one of:

- SQL

- Hive

- Drill


-----------
Six Stars

Re: Storing Data from Kafka Input in smaller files

I expect to receive around 100 GB of data per day, so I am storing the data in HDFS with a partition. But only the file created initially is used to store the data, despite the date partition being created at run time in the tJavaRow component.

Six Stars

Re: Storing Data from Kafka Input in smaller files

Any suggestions?

Nine Stars

Re: Storing Data from Kafka Input in smaller files

Looking at your original screenshot, something is wrong in two of the components.

 

My example is not for HDFS, but it works like this:

Screen Shot 2017-06-07 at 11.47.10 PM.png

 

You must define the variable before the component starts work (in my case, in tJavaFlex3), and the result will be:

Screen Shot 2017-06-07 at 11.47.22 PM.png
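In plain Java, the "define the variable before the component starts work" idea might be sketched like this. My assumptions: an hourly bucket, a `fileName` key, and a plain `HashMap` standing in for Talend's real `globalMap`; in the job this would sit in the start code of a tJavaFlex, with tHDFSOutput reading `globalMap.get("fileName")` and running in append mode:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.TimeZone;

public class RollingName {
    // Stand-in for Talend's globalMap.
    static Map<String, Object> globalMap = new HashMap<>();

    // Refresh the target file name whenever the hour bucket changes,
    // BEFORE the output component reads it.
    static String refresh(long epochMillis) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyyMMdd_HH");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC")); // deterministic naming
        String bucket = fmt.format(new Date(epochMillis));
        if (!bucket.equals(globalMap.get("bucket"))) {
            globalMap.put("bucket", bucket);
            globalMap.put("fileName", "/data/kafka/part_" + bucket + ".dat");
        }
        return (String) globalMap.get("fileName");
    }

    public static void main(String[] args) {
        System.out.println(refresh(0L));                  // first hour: new file name
        System.out.println(refresh(30L * 60 * 1000));     // same hour: same file name
        System.out.println(refresh(2L * 3600 * 1000));    // new hour: file name rolls over
    }
}
```

This way a new HDFS file starts whenever the hour rolls over, giving the smaller, time-addressable files the original post asks for.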

-----------