Hi Forum, I have created a job that connects to a PostgreSQL database and inserts its output there. When my job processes a few files, I have no problem, but when I process many files (more than 10,000), problems appear: the process takes a long time and the PostgreSQL connection count hits its maximum. I think every time my job runs it opens a new PostgreSQL connection, so the connections stay maxed out and inserting my output takes a long time. I have heard about connection pools for PostgreSQL in Talend, but I don't know much about them. Can someone suggest a solution? Thank you.
Hi erbi, actually a connection pool is not strictly necessary as long as you close your connections correctly; you should not reach the limit. Make sure your jobs do not simply open a tPostgresqlConnection and never close it. That said, a connection pool is a good idea because it avoids a lot of redundant work (connect, disconnect). I have already created such a pool component for MySQL and will create a fork of it for PostgreSQL as well. One word about the shared connection option: it shares exactly THIS particular connection and works only within the same thread (it does not work if you use parallelization!), so it is quite different from a real pool.
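To make the difference concrete, here is a minimal sketch of what a connection pool does: connections are opened once up front, borrowed for work, and returned instead of being closed, so repeated runs reuse the same few connections rather than opening thousands. This is an illustrative Java sketch, not the actual Talend component; the generic type stands in for `java.sql.Connection`, and in a real job the factory would call `DriverManager.getConnection(...)` with the PostgreSQL JDBC URL.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Supplier;

// Hypothetical minimal pool: opens a fixed number of "connections" once,
// then hands them out and takes them back instead of closing them.
public class SimplePool<T> {
    private final BlockingQueue<T> idle;

    public SimplePool(int size, Supplier<T> factory) {
        idle = new LinkedBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            idle.add(factory.get()); // open all connections up front
        }
    }

    // Borrow a connection; blocks until one is free if all are in use.
    public T borrow() throws InterruptedException {
        return idle.take();
    }

    // Return the connection to the pool instead of closing it.
    public void release(T conn) {
        idle.offer(conn);
    }

    public static void main(String[] args) throws InterruptedException {
        // Plain objects stand in for real JDBC connections here,
        // so the sketch runs without a database.
        final int[] opened = {0};
        SimplePool<Object> pool = new SimplePool<>(3, () -> {
            opened[0]++;
            return new Object();
        });

        // Many "job runs" reuse the same 3 connections.
        for (int run = 0; run < 100; run++) {
            Object conn = pool.borrow();
            // ... do the inserts with conn here ...
            pool.release(conn);
        }
        System.out.println("connections opened: " + opened[0]);
    }
}
```

Running this prints that only 3 connections were ever opened despite 100 borrow/release cycles, which is exactly the saving a pool gives you over connect/disconnect on every run.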