Hadoop hdfs block size


I am unloading data from an Oracle database into Hadoop, landing it as an HDFS file. But the unloaded output is a single file, 750 MB in size.

Can anyone please help me understand why it behaves this way? I would like each part file to be 128 MB in size.

Where do I have to update the settings?


Found 2 items
-rw-r--r-- 3 hdfs supergroup 0 2019-02-07 07:37 /var/lib/hadoop-hdfs/lake/itineraryitem2/_SUCCESS
-rw-r--r-- 3 hdfs supergroup 730087184 2019-02-07 07:37 /var/lib/hadoop-hdfs/lake/itineraryitem2/part-00000

Re: Hadoop hdfs block size

@badri-nair, the link below will help you understand this.

https://www.talend.com/blog/2018/04/12/apache-spark-performance-and-tuning-blog/
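To add to the link above: the HDFS block size controls how a file is stored across DataNodes, not how many part files a write produces. The number of part files equals the number of partitions in the dataset at write time, so to get roughly 128 MB part files you repartition before writing. A minimal sketch, assuming the unload runs through Spark's DataFrame API (the `df` write call in the comment is illustrative, not taken from your Job):

```python
import math

def target_partitions(total_bytes, part_bytes=128 * 1024 * 1024):
    """Hypothetical helper: number of partitions needed so that each
    part file comes out at roughly `part_bytes` (default 128 MB,
    matching the common HDFS default block size)."""
    return max(1, math.ceil(total_bytes / part_bytes))

# For the 730,087,184-byte part-00000 shown in your listing:
n = target_partitions(730087184)
print(n)  # 6 partitions of ~122 MB each

# In a Spark-based Job, that count would then drive the write, e.g.:
#   df.repartition(n).write.parquet("/var/lib/hadoop-hdfs/lake/itineraryitem2")
```

A single 750 MB output file usually means the whole dataset ended up in one partition (for example, a JDBC read with no partitioning column), so the repartition has to happen in the Job itself rather than in an HDFS setting.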

Manohar B
Don't forget to give kudos/accept the solution when a reply is helpful.
