Backslash escaping generated by Talend throwing errors from Postgres


I'm running TOS (4.0.2.r43696) on an Ubuntu Linux 10.04 workstation. I've created a job that reads from one table (the table input was dragged in from the metadata list, per the examples at), sorts on two columns, then outputs to a different table (the table output was placed into the workspace in the same way).
When I run the job, this is the output I get:
DBD::Pg::st execute failed: ERROR:  syntax error at or near "\"
LINE 1: SELECT \"public\".\"merchant_batch_listing\".\"id\",
^ at /home/bbranch/src/TOS-All-r43696-V4.0.2/workspace/.Perl/ line 532.
can't execute input query at /home/bbranch/src/TOS-All-r43696-V4.0.2/workspace/.Perl/ line 532.

I see what it's doing (backslash-escaping the double quotes around the identifiers in the query), but I don't see any settings in the metadata that affect the outcome. I tried setting the string-quoting character (to a single quote, and to an empty string), and it still uses double quotes and escapes them.
Is there a configuration setting I need to change in order to run this without modifying the Perl by hand?

Re: Backslash escaping generated by Talend throwing errors from Postgres

You should be able to edit the queries from within the GUI -- in the "Component" tab of the dbInput component. You can use all the niceness of Perl's quoting inside the component (qq{}, q##, etc.).
I've had issues with quoting and Postgres DBs before. This is probably a bug in the tPostgresqlInput component.
Very rarely do I let Talend generate the queries for me, so it's never been much more than a minor annoyance.
I would check the bugtracker and file a bug if you can't find anyone that's already reported the issue.
edit: rock the Perl! we're in a minority here -- personally I prefer Perl-based Talend. fewer bugs, faster development.
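As a minimal sketch of what a hand-edited query in the Component tab might look like (using the table from the error message above; not tied to any particular Talend version): qq{} interpolates like double quotes but lets you write literal double quotes around Postgres identifiers, with no backslash-escaping anywhere.

```perl
use strict;
use warnings;

# qq{} delimits the string with braces, so the double quotes that
# Postgres wants around identifiers can appear literally, unescaped.
my $query = qq{
    SELECT "public"."merchant_batch_listing"."id"
    FROM "public"."merchant_batch_listing"
    ORDER BY "id"
};
```

Compare that with the broken generated form, `SELECT \"public\".\"merchant_batch_listing\".\"id\"`, where the backslashes end up inside the SQL text and Postgres rejects them.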

Re: Backslash escaping generated by Talend throwing errors from Postgres

Thanks for the tip, that got it running!
Now I just need to get it to quit maxing out my RAM and getting TOS and various other apps OOM-killed a few minutes into the job!
Is it typically best practice to have TOS generate the Perl script and run it on a box other than my workstation, or is there a good way to limit how much CPU/RAM it uses? When I ran the job, my load was shortly up around 9.75, and I couldn't do much of anything until the OOM killer woke up...

Re: Backslash escaping generated by Talend throwing errors from Postgres

Unfortunately -- or fortunately, depending on how you look at it -- Talend lets you use as much or as little memory as you like. It will happily let you build a job that is a huge memory hog. You can usually build the same job so that it's small and efficient. It's all in how you do it.
A few things to keep in mind to keep memory usage down:
"use cursor" is your friend when processing billions of rows.
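A sketch of roughly what "use cursor" buys you at the DBI/DBD::Pg level (the table name, cursor name, and chunk size are placeholders, and `$dbh` is assumed to be an open handle -- Postgres cursors also need a transaction, i.e. `AutoCommit => 0`): a server-side cursor is fetched in chunks, so only one chunk of rows sits in the job's memory at a time.

```perl
use strict;
use warnings;

# Server-side cursor fetched in chunks: memory stays bounded by
# $chunk_size instead of growing with the full result set.
# Table and cursor names here are placeholders for illustration.
sub process_in_chunks {
    my ($dbh, $chunk_size) = @_;
    $dbh->do('DECLARE big_cur CURSOR FOR SELECT * FROM merchant_batch_listing');
    my $total = 0;
    while (1) {
        my $rows = $dbh->selectall_arrayref("FETCH $chunk_size FROM big_cur");
        last unless @$rows;            # empty fetch => cursor exhausted
        $total += @$rows;              # process each chunk here
    }
    $dbh->do('CLOSE big_cur');
    return $total;
}
```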
tMap is your join of last resort. When you use a tMap to join, it loads the entire lookup into memory. There's a button inside the tMap that lets you cache to disk, but a big join in a tMap will eat memory like crazy even with that option enabled. I always try to push as much of this work to the source DB as possible -- the data doesn't have to move as far, and DBs are designed to join data sets quickly.
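To see why the tMap lookup join is memory-hungry, here's roughly the pattern it amounts to (the field names and rows below are made up for illustration): the whole lookup flow is read into a hash first, and only then are main-flow rows probed against it.

```perl
use strict;
use warnings;

# The lookup table is loaded ENTIRELY into a hash before the main
# flow starts -- memory grows with the lookup's row count.
my @lookup_rows = ( [1, 'batch_a'], [2, 'batch_b'] );   # stand-in for a fetched lookup table
my %lookup = map { $_->[0] => $_->[1] } @lookup_rows;

# Main-flow rows are then matched one at a time against the hash.
my @main_rows = ( [10, 1], [11, 2], [12, 1] );          # (id, lookup_key)
my @joined = map { [ $_->[0], $lookup{ $_->[1] } ] } @main_rows;

# Pushing the same join into the source DB avoids holding the lookup
# in the job at all, e.g.:
#   SELECT m.id, l.name FROM main m JOIN lookup l ON l.id = m.key
```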

Get familiar with tPerlFlex and tPerlRow. These components can help make your jobs streamlined and elegant. Because you're writing the code, you can easily control your memory usage -- this can be especially useful when sorting/grouping.
Think of Talend as more of an IDE than an ETL tool. It helps you write code for ETL, but it's still code, and you have to treat it that way.

