How to write to multiple files within same big data sparkstreaming job

One Star

My use case is to stream from either Kafka or Kinesis (we are on AWS), buffer the data in one-minute windows, and then write the results of each one-minute buffer to either HDFS or S3. The catch is that each buffer must be written to a separate file.
Is this possible with a Talend Spark Big Data streaming job?
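For reference, outside of Talend, the raw Spark Streaming pattern I'm after would look roughly like the sketch below: a 60-second batch interval provides the one-minute buffering, and foreachRDD with the batch time in the output path produces a separate output per minute. The broker address, topic name, and bucket path are hypothetical placeholders.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._

object MinuteBufferWriter {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MinuteBufferWriter")
    // A 60-second batch interval means each micro-batch holds one minute of data.
    val ssc = new StreamingContext(conf, Seconds(60))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker:9092", // hypothetical broker address
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "minute-buffer-writer"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("events"), kafkaParams) // hypothetical topic
    )

    // Write each one-minute batch to its own path, keyed by the batch time.
    stream.map(_.value).foreachRDD { (rdd, time) =>
      if (!rdd.isEmpty()) {
        // coalesce(1) yields a single part file per minute.
        rdd.coalesce(1).saveAsTextFile(s"s3a://my-bucket/minute-buffers/${time.milliseconds}")
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that coalesce(1) forces one part file per minute, which is fine for modest per-minute volumes but would become a bottleneck at scale. What I'd like to know is whether a Talend streaming job can express this same per-batch output path.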
Moderator

Re: How to write to multiple files within same big data sparkstreaming job

Hi,
Sorry for the delay!
We have forwarded your issue to the Talend Big Data experts and will come back to you as soon as we can.
Best regards
Sabrina