Connections link failure Error due to MySQL DB time out?


One of my (larger) jobs, which has previously worked fine, has now started giving me a "Connections link failure" error.
After reading some really old forum posts and looking at my job again, I'm starting to wonder if this could be the cause:
the bulk load in the sub-job just before the point where the error occurs takes longer than my MySQL timeout limit.
(See the attached screenshot.) My MySQL timeout is set to the default of 28,800 seconds.
How can I confirm or disprove this?
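One hedged way to check, from any SQL client connected to the same server (the variable most likely involved is wait_timeout, the idle-connection limit):

    -- See what the server timeouts actually are:
    SHOW VARIABLES LIKE '%timeout%';

    -- If the server is killing the idle Talend connection, Aborted_clients
    -- should increase while the job runs; compare the value before and after:
    SHOW GLOBAL STATUS LIKE 'Aborted_clients';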
-- update --
I ran a version of the sub-job before the error in which I substituted the bulk output with a regular tMysqlOutput, and the job ran slowly, but without errors.
It might be that the bulk file preparation in the initial sub-job takes long enough for the connection to time out.
But MySQL is set to 28,800 seconds, which should be more than enough.
Are there other timeouts, either in Talend or in MySQL?
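There are a few server-side limits besides the main idle timeout, and the MySQL JDBC driver (Connector/J) that Talend uses has its own client-side properties; a rough checklist (driver property names assumed, check your driver version):

    -- Other server-side limits that can drop a long-running load:
    SHOW VARIABLES LIKE 'net_read_timeout';    -- server gives up if the client sends nothing for this long
    SHOW VARIABLES LIKE 'net_write_timeout';   -- server gives up if the client does not read results in time
    SHOW VARIABLES LIKE 'max_allowed_packet';  -- not a timeout, but oversized packets also break the link

    -- On the Talend/JDBC side, Connector/J has connectTimeout and socketTimeout
    -- properties that can be passed as additional connection parameters,
    -- e.g. socketTimeout=0 for no client-side limit.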
-- update 2 --
I adjusted the job to drop the indexes before the bulk upload and no longer get the "Connections link failure" error.
A working hypothesis is that dropping the indexes made the bulk upload fast enough to keep the connection alive.
But what happens when the source table grows past the gain from dropping the indexes?
Any ideas?

Re: Connections link failure Error due to MySQL DB time out?

Hi
Try putting a tMysqlCommit after the bulk load component.
Sometimes this is caused by an unclosed MySQL connection.
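A rough sketch of how to look for lingering connections while or after the job runs (requires a MySQL version that has information_schema.processlist):

    -- Connections left open by Talend show up as 'Sleep' with a growing Time value:
    SHOW PROCESSLIST;

    -- Or filter just the sleeping sessions:
    SELECT id, user, host, time, state
    FROM information_schema.processlist
    WHERE command = 'Sleep'
    ORDER BY time DESC;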
Regards,
Pedro

Re: Connections link failure Error due to MySQL DB time out?

Thanks,
I tried it, but I think the tMysqlCommit closes the connection before the previous component is done.
Let me explain.
My job does several (2 + 3) inserts and updates on one table.
Two of them are done as bulk uploads, and in those sub-jobs I drop the indexes before the bulk load and then add them back on afterwards:

... -> tMysqlOutputBulk -> Drop Indexes (tMysqlRow) -> tMysqlBulkExec -> Add indexes (tMysqlRow) -> tMysqlCommit
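For reference, the two tMysqlRow components run roughly this kind of SQL (the table, column, and file names below are placeholders; only the 'UUID-lookup' index name is the real one):

    -- "Drop Indexes" tMysqlRow, before the bulk load:
    ALTER TABLE user_dim DROP INDEX `UUID-lookup`, DROP INDEX `other_idx`;

    -- tMysqlBulkExec then loads the prepared file, effectively:
    LOAD DATA LOCAL INFILE '/tmp/user_dim.csv' INTO TABLE user_dim
      FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n';

    -- "Add Indexes" tMysqlRow, after the bulk load:
    ALTER TABLE user_dim ADD INDEX `UUID-lookup` (uuid), ADD INDEX `other_idx` (other_col);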

The second sub-job with this structure breaks with the error: Can't DROP 'UUID-lookup'; check that column/key exists
And the error is correct: the two indexes did not get added back in the previous sub-job.
Could it be that the tMysqlCommit closes the connection before "Add User_Dim indexes" has completed?
How do I get around this?
Should I change the commit setting on the "Add indexes" tMysqlRow?

Re: Connections link failure Error due to MySQL DB time out?

Hi
Have you put a tMysqlConnection at the beginning of this job?
Do all these tMysql components have "Use an existing connection" checked?
Regards,
Pedro

Re: Connections link failure Error due to MySQL DB time out?

Hi,
Thanks.
No, that was not done in the original job, but it has been fixed.
Currently the sub-job looks like this (all components using the same connection):

tMysqlConnection --OnComponentOk--> tMysqlInput -> stuff & 2 tMaps -> tMysqlOutput
    |
 OnSubjobOk
    |
tMysqlInput -> stuff & tMap
    |
tMysqlOutputBulk --OnComponentOk--> tMysqlRow (Drop Indexes)
    --OnComponentOk--> tMysqlBulkExec --OnComponentOk--> tMysqlRow (Add Indexes) --> tMysqlCommit

The bulk file gets created and filled, and the BulkExec uploads the records,
but this gives me the error:
Can't DROP 'UUID-lookup'; check that column/key exists
- The actual drop happens before the load (see above), but the component is set NOT to die on error.
- After the job is done, both indexes exist on the table.
- If I run the ALTER query that is supposed to drop the indexes in an SQL client, it works fine.
- And since both indexes are present after the job, I assume the add query is fine. (A way to check whether the index actually exists at drop time is sketched below.)
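A sketch of such a check, run from a tMysqlRow or an SQL client (the table name user_dim is a placeholder for my actual table):

    -- Returns one row per indexed column if the index exists, nothing otherwise:
    SELECT index_name, column_name
    FROM information_schema.statistics
    WHERE table_schema = DATABASE()
      AND table_name   = 'user_dim'
      AND index_name   = 'UUID-lookup';

    -- SHOW INDEX gives the same information:
    SHOW INDEX FROM user_dim WHERE Key_name = 'UUID-lookup';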

Any ideas what can be causing this error?

---Update---
When running the same job with the number of records limited to 10% (2.2 M rows), the job completed OK.

Re: Connections link failure Error due to MySQL DB time out?

Hi
This issue has been raised many times, and even in JIRA there is no solution.
As you say, all the connection settings are correct and it's not behind a firewall.
So it seems to be related to the job design itself or to the MySQL server.
Could you tell us where you run this job: on your local machine or on a JobServer?
Please send me an email with the exported job attached. I hope I can reproduce this issue and find the reason.
Regards,
Pedro