Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Eight Stars

Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Hi,

I just recently started using Talend Cloud Data Integration (Talend Studio 7.2), and I'm figuring out how to dynamically migrate a whole MSSQL Server database to MySQL (blank DB).

 

So far, I've heard that it's possible to do this with a Dynamic Schema.

However, I've only found a tutorial for a single-table (data) migration, or posts that didn't explain much about the settings and components to use.

 

Here's my job design (UPDATED!!!)

 

I'm using a global variable to dynamically define the table; however, I'm not sure how I can define the schema dynamically.

**I suppose my query is right.

Can you let me know how I can define the schema dynamically here?

 

dbinput.png

 

Output Component

output.png

 

 

 

 


The other option I found was to use a job template; however, it doesn't help deal with syntax errors that occurred during migration, like the following.

 

sample1)

Exception in component tDBOutput_1 (Job_tbl_mem)
java.sql.SQLSyntaxErrorException: BLOB, TEXT, GEOMETRY or JSON column 'msrepl_tran_version' can't have a default value
	at com.mysql.cj.jd

msrepl.png

 

sample2)

[FATAL]: local_test.job1_tbl_mem_0_1.Job1_tbl_mem - tDBOutput_4 Invalid default value for 'mobile_inq'
java.sql.SQLSyntaxErrorException: Invalid default value for 'mobile_inq'
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120)
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
	at 

mobile_inq.png

 

 

In addition, I was hoping to find a solution where one job could perform a whole database migration, e.g. iterating over the tables in a source DB and then generating the output in a destination DB.

 

Please let me know whether I should proceed with Dynamic Schema or a Job Template, plus the job design/components/settings needed.

 

Thank you




All Replies
Community Manager

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

To use the Dynamic Schema, you need a paid-for version of Talend; it is only available there. If you are using such a version, maybe this blog post will help you:

https://www.talend.com/blog/2019/11/11/migrate-data-between-databases-one-job-using-dynamic-schema/

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Thanks a lot!!

I am on a 14-day free trial of the Cloud version.
I have read that blog a couple of times, but what I want to do here is a bit different:
I want to migrate a whole database (if possible with a simple job design).
So, if possible, I was hoping someone could help point out how I can make it work by improving my current job design.
Community Manager

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

The only difference between what this blog shows and your requirement is a section to iterate over the different tables in your DB. You would pass the table name into this job and call it from a parent job. I believe I explain the process towards the end of the blog.

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

I see,
so in this case I already have a job that iterates over the tables. Could I connect it to step 2, "Identify column names" (tJavaFlex), as in the blog?

 

jobdesign.png

 

I can kind of relate it to my case here.
However, the number of columns and the data types in each table differ, so I was wondering if it will still work the same way?

 

I've used a single database, but in reality you will likely be using different databases. It doesn't make much 
difference, but you will need to make sure that the database column types are the same if you are following this.
It would be possible to add some code to dynamically change the column types, but this would require extra data in
the column mapping table and some extra Java code. This is not covered here.

Is what I'm trying to do here the case you mention (two different databases), i.e. needing to add the extra data and Java code?

If yes, would you mind walking me through the procedure to add the extra data and Java code?

 

Thanks a lot

 

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

I tried to build up the job design according to the blog, but I encountered an error:

 

 "the component is not a sub process start can not have any link on component ok/ error in input"

for DBInput1 when I tried to trigger it with OnSubjobOk.

 

subjob.png

Community Manager

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

The job described in the blog needs to be left alone. You would create a parent job and place that job inside it (using a tRunJob component). That would be linked to your table list component via an iterate link. You would create a context variable to hold the table name and pass the value from the table list to context variable in the tRunJob. You are essentially repetitively calling the job in the example but supplying the table name and a query.

 

Investigate the tRunJob component in the documentation. Also look at how you can build a select statement dynamically for each table using your database; you will need this before you can proceed. Building a select statement dynamically is described on sites for the database you are using.
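
To make that concrete, here is a minimal sketch of what the two expressions could look like, assuming the table-list component is called tDBTableList_1 and the child job receives a String context variable named table_name (both names are examples, not from the blog):

// In the parent job, in the tRunJob "Context Param" table, the value passed
// to the child's table_name context variable could be:
((String)globalMap.get("tDBTableList_1_CURRENT_TABLE"))

// In the child job, the tDBInput Query field then builds the SELECT dynamically
// (the [] protect table names that are reserved words in SQL Server):
"SELECT * FROM [" + context.table_name + "]"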

Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Hello,

You will find below a job that migrates data from one database to another.
I do it from SQL Server to SQL Server, but you can migrate to wherever you want.

 

2020-03-03_13h51_16.png

 

 

The first step is to open the connections to the two databases.
Then, you list the names of the tables currently in the source.
You iterate and run a "select * from [global_variable]" with the dynamic schema.

I advise you to add the '[]' in case you have tables named with keywords like 'user' or 'group'.
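
For reference, a minimal sketch of that query as it might sit in the tDBInput Query field, assuming the list component is named tDBTableList_1 and the tDBInput schema is a single dynamic column:

// Query field of the dynamic-schema tDBInput, iterating on the table list:
"SELECT * FROM [" + (String)globalMap.get("tDBTableList_1_CURRENT_TABLE") + "]"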

 

2020-03-03_14h00_47.png

 

Finally, for the target, you choose "Create table if not exists" or "Drop and create"; it's up to you.

 

2020-03-03_14h01_48.png


The best practice is to recover the entire table structure with "tColumnList" and do the mapping on a case-by-case basis (look at the Talend Cloud example of MySQL migration from AWS to Snowflake; you can download the job if you still have the trial version).


Do not hesitate if you have questions.


Community Manager

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Nice solution @JohnRMK 

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

@cbma_1994

Updated!!
I have tried what you showed me here, and I wanted to ask a few things.
1) Is 'DROP' a tDBRow component? If yes, how do you set it up? Do you use the output DB connection?

What about the table name, schema, and query? Are they all ((String)globalMap.get("tDBTableList_1_CURRENT_TABLE"))?

2) Would you mind showing me how you set up the tJavaFlex?

3) For the tDBInput, how do you define the schema for the input? I tried adding a dynamic column, but I'm not sure if it's right.

And should the table name be left blank?

4) I tried executing the job, but there's an error.

 

It seems the cause was a mismatch of columns between the input and output tables.

Is there a way to make it work when the columns (both number and types) don't match?

Ideally, the output DB might have only one table, or none at all, when I use it for actual work.

(Source -> Blank Output DB)

 

Please let me know if anything is unclear to you.

 

Thanks a lot!!

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Thank you for your advice.

I have created a parent job as you mentioned and connected it to the child job (from the blog).

However, there are a few things I'm not sure about.

 

1) How should I set the schema for the tRunJob component? Is it a dynamic column like the source?

Do I need to set a dynamic job or context parameters?

 

2) How should I create and set the context variable? Is it something like in the second screenshot?

Can you show me an example?

 

3) The number of columns and data types in each table differ, so I was wondering how and what kind of data/code I need to add. My database contains about a hundred tables, and each of them has a different number of columns and data types. Would you mind walking me through the procedure to add the extra data and Java code?

 

I've used a single database, but in reality you will likely be using different databases. It doesn't make much 
difference, but you will need to make sure that the database column types are the same if you are following this.
It would be possible to add some code to dynamically change the column types, but this would require extra data in
the column mapping table and some extra Java code. This is not covered here.

 

 

Parent Job

 

context variable

 

I use a global variable ((String)globalMap.get("tDBTableList_1_CURRENT_TABLE")) to define the table name in the child job (from the blog), so I thought about defining it in a similar way here, but I don't know how.

Child Job (from the blog)

 

In addition, regarding multiple tables, you mentioned in the blog that there are 3 additional touch-ups needed.

I think number 1 is what you mentioned earlier,

but I'm hoping you can elaborate more and show examples for numbers 2 and 3.

If it's needed in my case, please let me know.

 

1. Use Context Variables in your DB components

2. Add columns to your Column_Mapping table to hold the Table Name
You will need to add a bit more supporting data to your Column_Mapping table. If you add an "Old_Table_Name" column and a "New_Table_Name" column, you can query the Column_Mapping table using the "Old_Table_Name" field and the context.source Context variable. That will return the mapping configurations for your source table and return the new table name. This will need to be set as your context.target Context variable value.

3. Create a wrapper Job to call this Job and supply the Table data as Context Variables
The final step for this will be to create a wrapper Job. This is a Job that will query a data set (maybe your Column_Mapping table) to return a list of source tables to be migrated. This data will then be sent to this Job, run using a tRunJob. For every source table identified in the wrapper Job, this Job will be run. Therefore you can start the wrapper Job, it will return each of the source tables, and this Job will dynamically run for each of them.

For your reference, this is the column_mapping input.

column_mapping

 

Please let me know if you need any clarification.

 

Thanks a lot

Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Hello,

"Drop" is the name I gave to a tDBRow component where I execute a query to delete all the tables (<< "EXEC sp_MSforeachtable 'DROP TABLE ?'" >>

it's not necessary and you can remove it because you delete the table in the action on the table in tDBOuput)

 

If you have a mapping error, it means that you do not delete the table before the extraction.
Use "delete table if it exists"

 

For the tJava component, it is empty. It is there because tDBInput is a stream input component and does not allow a main link before it.

 

For the schema, there is only one column (name=data => type=dynamic), and it maps all the possible tables. I have used it on projects with tables of 800 columns and of 3 columns without any issues.

The only possible errors are due to the configuration of the database (see the parameters for rounding or truncation of char).


The job I sent you is working properly, and I am using it to clone a DB.
You should add controls: for example, test whether there is a record in the source table and, with an IF trigger, execute the data extraction.
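
A rough sketch of such a control, assuming a small tDBInput first counts the rows (with a single Integer schema column named cnt) and a tJavaRow stores the count. The component and variable names here are illustrative, not from the job above:

// tDBInput (count) Query field, reusing the iterated table name:
"SELECT COUNT(*) AS cnt FROM [" + (String)globalMap.get("tDBTableList_1_CURRENT_TABLE") + "]"

// tJavaRow after the count, storing the value for later use:
globalMap.put("source_row_count", input_row.cnt);

// "Run if" condition on the trigger that starts the extraction subjob:
globalMap.get("source_row_count") != null && ((Integer)globalMap.get("source_row_count")) > 0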

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Thanks for getting back to me.

I made the changes as you mentioned; however, another error came up.

 

It says SQLServerException: 'The connection is closed'.

sql_connection error

Full Error Message


Starting job test04 at 19:44 04/03/2020.

[statistics] connecting to socket on port 3944
[statistics] connected
Exception in component tDBTableList_1 (test04)
com.microsoft.sqlserver.jdbc.SQLServerException: The connection is closed.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:191)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.checkClosed(SQLServerConnection.java:710)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.checkClosed(SQLServerStatement.java:1071)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.checkClosed(SQLServerResultSet.java:387)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1009)
at local_test.test04_0_1.test04.tDBTableList_1Process(test04.java:545)
at local_test.test04_0_1.test04.runJobInTOS(test04.java:3244)
at local_test.test04_0_1.test04.main(test04.java:3031)
[FATAL]: local_test.test04_0_1.test04 - tDBTableList_1 The connection is closed.
com.microsoft.sqlserver.jdbc.SQLServerException: The connection is closed.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:191)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.checkClosed(SQLServerConnection.java:710)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.checkClosed(SQLServerStatement.java:1071)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.checkClosed(SQLServerResultSet.java:387)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1009)
at local_test.test04_0_1.test04.tDBTableList_1Process(test04.java:545)
at local_test.test04_0_1.test04.runJobInTOS(test04.java:3244)
at local_test.test04_0_1.test04.main(test04.java:3031)
[statistics] disconnected

Job test04 ended at 19:44 04/03/2020. [exit code=1]

I have checked my SQL Server and also tried using it in another job, and everything is working fine there...

I'm not sure why the connection is failing here...

 

 

For your reference:

dbconnection

tablelist

Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Check the tDBCommit/tDBRollback and disable "Close Connection".

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

It works perfectly!!
Thank you so much!!!
Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

@JohnRMK 

 

Thanks for helping me out previously.

After comparing the input source with the output, I noticed that the primary key fields (columns) and composite primary keys are not generated in the output database.

Is there a way to migrate the primary key fields (columns) and composite primary keys together with the tables and their data (+ schemas)? My tables have around 1-4 primary key fields (columns).

 

Source Database: MSSQL

mssql-input

Output: MySQL

mysql-output

Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Hello,
Like I said in my first post, the job I gave you is just for migrating data.

 

 

If you want to add the integrity constraints, it will be necessary to develop a little more, but it remains easy.

 

Here is the approach to automatically migrate the keys.

 

First, you query the database schema and list all the integrity constraints.


Here is a query to do that: https://dataedo.com/kb/query/sql-server/list-all-table-constraints

 

 

 

You can use tDBInput to execute the query like the image below.

 

2020-03-09_10h29_36.png
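
In case the screenshot is not readable, here is a minimal sketch of such a constraint-listing query, written the way it would sit in a tDBInput Query field (the exact query from the dataedo link above may differ):

// tDBInput Query field: list every primary-key column per table
"SELECT tc.TABLE_SCHEMA, tc.TABLE_NAME, tc.CONSTRAINT_NAME, kcu.COLUMN_NAME " +
"FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc " +
"JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu " +
"  ON kcu.CONSTRAINT_NAME = tc.CONSTRAINT_NAME " +
" AND kcu.TABLE_SCHEMA = tc.TABLE_SCHEMA " +
" AND kcu.TABLE_NAME = tc.TABLE_NAME " +
"WHERE tc.CONSTRAINT_TYPE = 'PRIMARY KEY' " +
"ORDER BY tc.TABLE_SCHEMA, tc.TABLE_NAME, kcu.ORDINAL_POSITION"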

 

You store the data in a file or a SQL Server table, then for each table you generate a query in order to update the structure and add the keys.

 

2020-03-09_10h28_35.png

 

You will use a tDBRow, and in it you will have a statement of the form
"ALTER TABLE GLOBAL_VARIABLE ADD CONSTRAINT ........". You can use a tJavaRow or a tMap to perform the concatenation, you can iterate over the different constraints with a tFlowToIterate, and you can use the same pattern as the first job:

tJava (blank) ==> OnSubjobOk ==> tDBRow (query to add)

 

In the example below, I use the same approach to create the tables and add a primary key. In my case, all my tables have an Id column as the key, so it's easy, but you just have to modify the query a bit and it will be fine.

 

2020-03-09_10h38_44.png

 

One last tip: try to create the tables with the constraints first, then migrate the data with the dynamic schema.
You can use a query to recover the columns and their types (or a combination of tDBTableList and tDBColumnList), carry out a conversion to the MySQL types, and generate your statements with a tMap or tJava. You execute the creation of the tables with the constraints, then you migrate the data. You can do it in 3 jobs, and you already have 2. It's the same process used in the job attached (migration from MySQL to Snowflake).
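
For the "conversion to MySQL type" step, a hedged sketch of what a small Talend routine could look like; the mappings below are illustrative only, and a real migration needs the full list of types you actually use:

// Hypothetical helper (e.g. in a Talend routine) mapping a few common
// SQL Server column types to MySQL equivalents; extend as needed.
public static String toMySqlType(String mssqlType, Integer length) {
    switch (mssqlType.toLowerCase()) {
        case "nvarchar":
        case "varchar":
            // a non-positive length usually means (n)varchar(max) in SQL Server metadata
            return (length != null && length > 0) ? "VARCHAR(" + length + ")" : "LONGTEXT";
        case "datetime":
        case "datetime2":
            return "DATETIME";
        case "bit":
            return "TINYINT(1)";
        case "uniqueidentifier":
            return "CHAR(36)";
        default:
            return mssqlType.toUpperCase(); // INT, BIGINT, DECIMAL... often map as-is
    }
}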


Good luck

 

 

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

@JohnRMK 

Thanks for getting back to me.

 

Here I have tried creating the job design as you mentioned above; however, I'm not sure how it works here.

 

You will use tDBRow and in it you will have a request for style
"ALTER TABLE GLOBAL_VARIABLE ADD CONSTRAINT ........" you can use tJavaRow to perform the 
concatenation or a tMap and you can iterate on the different constraints with tFlowToIterate and
you can use like the first job -> tJava (blank) ==> On Subjub Ok ==> tDBRow (query to add)

I feel that this might not be right, but could you please help point out how I could improve it?

job-design & input
tMap (schema)

 

Next, I want to ask about the statement for altering the table. I replaced 'global_variable' with the global variable string, but I'm not sure what I should put for the constraint.

'PK_' + global variable + PRIMARY KEY(?)

In my case, there are usually about 1-4 primary key columns, and the columns also differ by table.

What should I put for primary keys? Is there something like a global variable for this?

 

**I'm still a novice in SQL

dbRow

Thank you!

Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

Did you check the job I gave you?

 

What you are trying to do requires a little programming.
You collect your keys from a base table.
Then you group the keys by table (you can use tAggregateRow; as output you must have <Table_1; Pk_1, Pk_2, ...>).
Then in a tMap you add characters to form a statement:

"ALTER TABLE " + GLOBAL_VAR (table name) + " ADD CONSTRAINT " + GLOBAL_VAR (list of key columns) + ";"

 

ALTER TABLE table_name (to replace with global variable)
  ADD CONSTRAINT table_PK_name (to replace with global variable)
    PRIMARY KEY (last_name, first_name); (to replace with global variable after aggregation)
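
As a hedged sketch, the tMap (or tJavaRow) expression that builds this statement could look like the following, assuming the tAggregateRow output row is named row5 and carries table_name, constraint_name, and key_columns (the PK columns aggregated into a comma-separated list); those names are illustrative:

// Expression for the generated-query output column:
"ALTER TABLE " + row5.table_name +
" ADD CONSTRAINT " + row5.constraint_name +
" PRIMARY KEY (" + row5.key_columns + ");"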

2020-03-09_18h15_37.png

 

In the image below, I use the same principle to add columns to a database.
I generate a dynamic query that I inject into a tDBRow.

 

I just developed the job for you. I only created the statement for the PK; it's up to you to develop the second part of the subjob for foreign keys or indexes.

 

2020-03-09_18h53_04.png
I can't do more than that; you should have everything you need now.

Sorry, I made an error in the script (inside the tMap: ALTER TABLE, not ALTER TABME).
(Add the connection to the SQL Server database and it will work without problem.)

 

 

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

@JohnRMK 

Thank you for your clarification.

I was able to group each table with its keys as follows.

 

dynamic PK's job design

tMap

dynamic_pk mapping

 

Here's the result... somehow a table with only one key has that key repeated here...

 

traveltour.tbl_mem|ALTER TABLE traveltour.tbl_mem ADD CONSTRAINT PK_tbl_mem,PK_tbl_mem PRIMARY (mem_id,mem_id);
Table Name = traveltour.tbl_mem 
 AND the Query is = ALTER TABLE traveltour.tbl_mem ADD CONSTRAINT PK_tbl_mem,PK_tbl_mem PRIMARY (mem_id,mem_id);

 

 

However, I still have a question about how I should set up the tDBRow.

 

Here are my tDBRow component and schema settings.

DBRow's setting

For the schema, I made it the same as the tMap's output (query_build).

I used a global variable ((String)globalMap.get("tDBTableList_1_CURRENT_TABLE")) for the table name.

As for the query, I'm not sure whether I should use a global variable here or the row name; I kind of used both here...

For the PRIMARY KEY in the query statement, I put 'Id', as I'm not sure how I could reference it here, since I assume I cannot use 'rowname.details' like I did in the tMap.

I would appreciate it if you could elaborate more on the tDBRow component and schema (points mentioned above).

 

After making this work, I will move on to try indexes and foreign keys

 

Thank you so muchhhh!!

Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

The tDBRow component is used to execute queries or functions. It does not need a schema and is not a read or write component. So, you just have to put the generated statement (the one displayed in the System.out.println of the tJava) in the query field and use a connection to an existing database.
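
For example (a sketch, assuming the tFlowToIterate link into the tDBRow is named row7 and the generated statement sits in the query_build column; both names are assumptions), the tDBRow Query field could simply be:

// tFlowToIterate publishes each column of the iterated row into globalMap
// under "<link name>.<column name>", so the whole statement can be reused as-is:
(String)globalMap.get("row7.query_build")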

 

2020-03-10_10h18_12.png

Try taking the statement generated in the tLogRow and executing it in SQL Workbench to see if it works. A priori yes, I tested it on SQL Server.

 

For the result, I have no idea; the code I gave you works on a database that I use. The error comes either from the way you created your keys in the source database (instead of grouping two columns as one key, you create two constraints with two different names) or from something else. I'm not really a database expert.

 

Here is an example of a PK with multiple values:

Table Name = dbo.FactInternetSales 
 AND the Query is = 
ALTER TABLE dbo.FactInternetSales 
ADD CONSTRAINT PK_FactInternetSales_SalesOrderNumber_SalesOrderLineNumber 
PRIMARY KEY (SalesOrderNumber, SalesOrderLineNumber);
Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

@JohnRMK 

Thanks for your prompt response.

I have changed the tDBRow component as you mentioned.

dbrow

 

However, the following error occurred.

 

dbo.DOM_TOUR_FOR_DA_ROUNDTRIP_SEARCH|ALTER TABLE dbo.DOM_TOUR_FOR_DA_ROUNDTRIP_SEARCH 
ADD CONSTRAINT PK_DOM_TOUR_FOR_DA_ROUNDTRIP_SEARCH,PK_DOM_TOUR_FOR_DA_ROUNDTRIP_SEARCH
PRIMARY (mem_id, tour_id, money_start_date, money_end_date,mem_id, tour_id, money_start_date, money_end_date); Table Name = dbo.DOM_TOUR_FOR_DA_ROUNDTRIP_SEARCH AND the Query is = ALTER TABLE dbo.DOM_TOUR_FOR_DA_ROUNDTRIP_SEARCH ADD CONSTRAINT
PK_DOM_TOUR_FOR_DA_ROUNDTRIP_SEARCH,PK_DOM_TOUR_FOR_DA_ROUNDTRIP_SEARCH PRIMARY
(mem_id, tour_id, money_start_date, money_end_date, mem_id, tour_id, money_start_date, money_end_date); [FATAL]: local_test.job_tbl_mem_0_1.Job_tbl_mem - tDBRow_1 You have an error in your SQL syntax;
check the manual that corresponds to your MySQL server version for the right syntax to use
near ',PK_DOM_TOUR_FOR_DA_ROUNDTRIP_SEARCH PRIMARY (mem_id, tour_id, money_start_date,' at line 1

 

When I traced the job, I found that from row 6 (tAggregateRow => tMap), the constraint name contained duplicated 'PK___' values...

Does this have something to do with the error above?

row6 trace

 

dynamic PK's job design

 

About the repeating PK values: it currently happens for tables with a single PK column.

I haven't checked in MySQL Workbench yet, as I haven't finished this yet.

 

 

Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

You changed the code that I gave you.
Normally, you should have <...Unique_Name PRIMARY KEY (val1, val2)>.
And, in the tAggregateRow, you have to aggregate the details and the column name as a list, and group by the table name.

I can't do anything more, because I tested it on my computer and it works.

Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

As you can see in the picture, there is a single key name and a list of fields.
Maybe that's the way your keys were created; sorry, I don't know, you may need to check with a DBA.

 

2020-03-10_16h11_17.png

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

@JohnRMK 

 

I just realized the problem was because of the tDBTableList component I used;

I thought it was necessary to iterate through the list of tables...

There's another error related to the tDBRow.

I will sum up the points in the next post.

I went back to check my code, but I didn't change anything from what you gave me.

 

Below, I tried using the job (project file) you sent over, but it didn't work either.

 

The only minor change I made here is that I added a tDBTableList component and connected it to the tDBConnection and tHashInput with OnSubjobOk, instead of the tDBInput, as I need to iterate over the list of tables.

job_pk-test.png

 

Does this have anything to do with the changes in constraint name?

 

From tracing, it seems like constraint_name gets repeated twice from the tAggregateRow (row5), but I didn't make any changes there.

My tAggregateRow component (no change from your code):

aggregaterow-component

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

@JohnRMK 

I went back to check my code, but I didn't change anything from what you gave me.

 

The problem seems to be related to the tDBTableList component I use to iterate over the list of tables.

 

Here I have a few points to confirm.

In my case, I'm migrating from MSSQL => MySQL.

So, after migrating the database without keys,

the next step is to inject the keys into the MySQL database(?), which is what the code you gave me previously does.

For the tDBInput I used the source (MSSQL), and for the tDBRow and tDBCommit I used the MySQL database.

 

I added the MySQL tDBConnection as follows.

However, an 'unknown database' error occurred.

DbRow

dbrow3.png

 

Is this because of the way I set up my job design/DB connections?

Or should I name the output DB (MySQL) exactly the same as the input DB (MSSQL)?

 

Please let me know if anything is unclear.

 

Thank you very much


Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

2020-03-11_16h21_17.png

Eight Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

@JohnRMK 

Thanks for your reply.

It worked when I changed the output database to one with the exact same name.

ex) input DB with schema: 'dbo' ==> output database: 'dbo'

Though I still need to make some additional adjustments to make sure one schema is processed at a time.

*As my database contains two schemas, I have to use a WHERE clause (table_view like 'dbo%') in the tDBInput when processing the 'dbo' schema.

 

Here I just noticed another problem: when I migrate a database with the following job design, the last table's data is not copied (only the table structure and columns are migrated). I tried removing the '+', but it doesn't work.

Do you have any idea how I can improve the query statement?

 

dbinput's component

 

The traveltour schema has two tables: tbl_inq and tbl_mem; however, only the first one (tbl_inq) has its data migrated, while the second table (tbl_mem) ends up as a blank table.

*I tested this with another database and the result is the same. I also checked the source DB table, and there is data in it.

 

first table (has data)
second table (no data)

Thank you so much

Six Stars

Re: Whole Database Migration (MSSQL->MySQL) using Dynamic Schema

dbo is the name of the schema, not the database. It is hierarchical: Database (HR or Finance) => schema (dbo, Fact, Dim) => tables (Person or Address).
The schema definition is done in the tDBConnection component, and by default it is dbo.

 

2020-03-12_11h09_54.png
