Import CSV to MySQL with Dynamic Schema (error when column length is large or text type)

Eight Stars

I want to import CSV files into a MySQL database using a dynamic schema, since I have to deal with a large number of files.

However, errors occur when a column's length exceeds 200 or when the column type is text.

Is there a way to fix or work around this while still using a dynamic schema?

I can import data from another database (MSSQL -> MySQL); however, when I change the source to CSV it does not work the same way.

 

Here is the job design

Import CSV's Job Design

 

Error message ([Table: tbl_inq], column type TEXT):

Exception in component tDBOutput_1 (Import_CSV)
java.sql.BatchUpdateException: Data truncation: Data too long for column 'biko' at row 14

Error message ([Table: tbl_mem], column length 1000):

Exception in component tDBOutput_1 (Import_CSV)
java.sql.BatchUpdateException: Data truncation: Data too long for column 'comment' at row 14

Here's each table schema

tbl_inq's schema / tbl_mem's schema

Moderator

Re: Import CSV to MySQL with Dynamic Schema (error when column length is large or text type)

Increase the column length for 'comment', since the data in that column is longer than the current length.
Manohar B
Don't forget to give kudos/accept the solution when a reply is helpful.
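If the target table already exists, the same fix can also be applied on the MySQL side. A minimal sketch, assuming the table and column names from the error message above; the new length of 2000 is an arbitrary example value:

```java
// Sketch: widen a VARCHAR column so rows no longer hit "Data too long".
public class WidenColumn {
    // Builds the ALTER statement; run it via your MySQL client or JDBC.
    static String buildAlterSql(String table, String column, int newLen) {
        return "ALTER TABLE " + table + " MODIFY " + column + " VARCHAR(" + newLen + ")";
    }

    public static void main(String[] args) {
        // 'comment' in tbl_mem had length 1000 in the error above.
        System.out.println(buildAlterSql("tbl_mem", "comment", 2000));
        // -> ALTER TABLE tbl_mem MODIFY comment VARCHAR(2000)
    }
}
```

Note that with a dynamic schema the Talend-side column length must be widened as well, as discussed below.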
Eight Stars

Re: Import CSV to MySQL with Dynamic Schema (error when column length is large or text type)

@manodwhb 

Here I tried increasing the column length to 500.

However, the overall row size became too large and the following error occurred.

Job Design (increase column length)

tJavaRow code:

 

// tJavaRow: widen every column in the dynamic schema to 500 characters
Dynamic dyn = row1.dynamic_column;

for (int i = 0; i < dyn.getColumnCount(); i++) {
    DynamicMetadata meta = dyn.getColumnMetadata(i);
    meta.setLength(500);
}

row3.dynamic_column = dyn;

 

 

Moderator

Re: Import CSV to MySQL with Dynamic Schema (error when column length is large or text type)

Set the length to 1000
Manohar B
Don't forget to give kudos/accept the solution when a reply is helpful.
Eight Stars

Re: Import CSV to MySQL with Dynamic Schema (error when column length is large or text type)

@manodwhb
I set the length in tJavaRow and the column length in tDBOutput to 1000, but an error still occurred:

Exception in component tDBOutput_1 (Import_CSV)
java.sql.SQLSyntaxErrorException: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs

Eight Stars

Re: Import CSV to MySQL with Dynamic Schema (error when column length is large or text type)

@manodwhb 

I think I know the problem now (but no idea about the solution yet...).

I'm using a dynamic schema (dynamic columns), so all the column values from the raw CSV are carried together in a single row called 'dynamic_column', shown below.

The 'dynamic_column' row works if the total number of characters per row is less than 250.

If the total exceeds 250 characters, the 'Row size too large' error occurs.

 

Raw CSV

CSV's Row Sample

Dynamic row (works if the total character length is less than 250)

importcsvrow.png

ADA00 - ADAAA - 株式会社○○○○(○○○○) - ○○ ○○ - 104-00○○ - 東京都○○○○ - ○○○○ - 03-○○○○-○○○○ - 03-○○○○-○○○○- info@○○○○ - 月~金曜 09:30~18:00<br>土曜 09:30~15:00、日曜・祝日休み - 1969/12/31 15:00 - 1969/12/31 15:00 - 34200 - 34200 - 弊社は○○○○のタイ旅行専門店です。○○○○○○○○○○○○○○○○○○○○"の日本販売総代理店でもあります。

 

However, the total row size limit is 65,535 bytes, so the per-row character limit should allow far more than 250 characters.

MySQLSyntaxErrorException: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. 
You have to change some columns to TEXT or BLOBs
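A likely explanation for this error: it is about declared lengths, not the actual data. With the utf8mb4 character set, MySQL reserves up to 4 bytes per declared character for every VARCHAR column, and the declared sizes of all columns together must fit in the 65,535-byte row limit. A quick sketch of the arithmetic (plain Java for illustration, not Talend code; the 4-bytes-per-character worst case assumes utf8mb4):

```java
// Sketch: why VARCHAR(1000) on many columns breaks the 65,535-byte row limit.
// MySQL counts each column's *declared* worst-case size against the limit,
// regardless of how long the stored strings actually are.
public class RowSizeCheck {
    static final int MAX_ROW_BYTES = 65535; // row limit, excluding BLOB/TEXT data

    // Worst-case bytes a VARCHAR(declaredLen) column occupies in utf8mb4.
    static int varcharBytes(int declaredLen) {
        int dataBytes = declaredLen * 4;        // up to 4 bytes per character
        int prefix = (dataBytes > 255) ? 2 : 1; // length prefix per VARCHAR
        return dataBytes + prefix;
    }

    static boolean rowFits(int columnCount, int declaredLen) {
        return (long) columnCount * varcharBytes(declaredLen) <= MAX_ROW_BYTES;
    }

    public static void main(String[] args) {
        System.out.println(varcharBytes(1000));   // 4002
        System.out.println(rowFits(17, 1000));    // 17 * 4002 = 68034 -> false
        System.out.println(rowFits(16, 1000));    // 16 * 4002 = 64032 -> true
    }
}
```

So with every column widened to 1000 characters, even 17 columns already exceed the limit; the observed ~250-character threshold would similarly depend on how many columns the dynamic schema declares and at what length.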

 

In this case it would likely be impossible to import CSV files with more than about 30 columns, which is not realistic for real-world use.

Is there a way to import CSV files with a large number of columns (i.e., a large row size)?
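One direction worth trying, as the error message itself suggests: map oversized columns to TEXT rather than widening every VARCHAR, since TEXT/BLOB data is stored mostly off-page and only a small pointer counts against the 65,535-byte row limit. The helper below is plain Java illustrating the selection logic only; the 500-character threshold is an assumption to tune per dataset, and whether Talend's DynamicMetadata exposes a setter for the DB type in your version should be verified before wiring this into the tJavaRow loop shown earlier.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: pick a MySQL column type per CSV column so the declared row size
// stays under the 65,535-byte limit. Long columns become TEXT.
public class ColumnTypePicker {
    // Threshold is an assumption; columns declared longer become TEXT.
    static final int TEXT_THRESHOLD = 500;

    static String pickType(int maxObservedLen) {
        return (maxObservedLen > TEXT_THRESHOLD)
                ? "TEXT"
                : "VARCHAR(" + maxObservedLen + ")";
    }

    static List<String> pickTypes(int[] maxLens) {
        List<String> types = new ArrayList<>();
        for (int len : maxLens) {
            types.add(pickType(len));
        }
        return types;
    }

    public static void main(String[] args) {
        System.out.println(pickTypes(new int[]{10, 200, 1000}));
        // -> [VARCHAR(10), VARCHAR(200), TEXT]
    }
}
```

Pre-scanning each CSV once to find the maximum length per column would let the job declare tight VARCHARs for short columns and TEXT only where needed.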

 

 
