Hadoop HDFS block size


I am unloading data from an Oracle database into Hadoop.

I am landing the data as an HDFS file, but the unloaded output is a single file, 750 MB in size.

Can anyone please help me understand why it is behaving this way? I would like each part file to be 128 MB in size.

Where do I have to update the settings?


Found 2 items
-rw-r--r-- 3 hdfs supergroup 0 2019-02-07 07:37 /var/lib/hadoop-hdfs/lake/itineraryitem2/_SUCCESS
-rw-r--r-- 3 hdfs supergroup 730087184 2019-02-07 07:37 /var/lib/hadoop-hdfs/lake/itineraryitem2/part-00000

 


Re: Hadoop HDFS block size

@badri-nair, the link below will help you understand:

https://www.talend.com/blog/2018/04/12/apache-spark-performance-and-tuning-blog/
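In case it helps, here is a minimal Spark/Scala sketch of the idea: the number of part files is driven by the number of partitions of the DataFrame at write time, not by the HDFS block size. The JDBC options, table name, and paths below are placeholders, assuming your unload runs as a Spark job.

    import org.apache.spark.sql.SparkSession

    object OracleToHdfs {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("oracle-unload")
          .getOrCreate()

        // Read the Oracle table over JDBC (connection details are placeholders).
        val df = spark.read
          .format("jdbc")
          .option("url", "jdbc:oracle:thin:@//db-host:1521/ORCL")
          .option("dbtable", "SCHEMA.ITINERARYITEM")
          .option("user", "user")
          .option("password", "password")
          .load()

        // One part file is written per partition, so a single-partition read
        // produces a single part-00000 regardless of the HDFS block size.
        // Repartition to roughly totalSize / 128 MB pieces before writing.
        val totalBytes  = 730087184L            // size reported by hdfs dfs -ls
        val targetBytes = 128L * 1024 * 1024    // desired part-file size
        val numParts    = math.max(1, math.ceil(totalBytes.toDouble / targetBytes).toInt)

        df.repartition(numParts)
          .write
          .mode("overwrite")
          .save("/var/lib/hadoop-hdfs/lake/itineraryitem2")
      }
    }

With the ~730 MB shown in your listing, this works out to 6 partitions, so the write should produce part files of roughly 128 MB each (exact sizes depend on the output format and compression).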

Manohar B
Don't forget to give kudos/accept the solution when a reply is helpful.
