Hello, my OS is Windows 2000 SP4, I use TOS version 3.0, and the generated language is Java. I use a MySQL database, and I need to execute up to thousands of INSERT statements, which are delivered in several files. The problem is that there doesn't seem to be a component that takes a number of SQL commands, rather than data from strings, and then executes them command by command. Is there a solution to this problem?
Put your commands, one per line, into a text file:

INSERT INTO TABLENAME VALUES (1, 'George Smith', '312 Your Street', 'Peanutville, NB')
INSERT INTO TABLENAME VALUES (18, 'Sam Van Goh', '2 Scenic View Dr', 'Yada, CA')
INSERT INTO TABLENAME VALUES (3, 'Howard Duck', '18 Cotton Picking Lane', 'Flower Hills, NY')

The schema for the input file will be a single column: SQLCommand. Read the text file and send the commands via a Main link to a row component (in your case a tMySQLRow). The query for the tMySQLRow will be: row1.SQLCommand
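Since TOS generates Java, what this job does can be sketched in plain JDBC: read the file of commands, skip blank lines, and execute each one over a single connection. The class name, the comment convention (lines starting with "--"), and the connection details are illustrative assumptions, not Talend-generated code.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SqlFileRunner {

    // Keep one SQL command per non-blank line; "--" comment lines are
    // an assumed convention, not something the original post requires.
    static List<String> readCommands(List<String> lines) {
        List<String> cmds = new ArrayList<>();
        for (String line : lines) {
            String t = line.trim();
            if (!t.isEmpty() && !t.startsWith("--")) {
                cmds.add(t);
            }
        }
        return cmds;
    }

    // Execute every command on one JDBC connection, much as tMySQLRow
    // does per row. jdbcUrl/user/pass are placeholders.
    static void execute(String jdbcUrl, String user, String pass,
                        List<String> cmds) throws SQLException {
        try (Connection con = DriverManager.getConnection(jdbcUrl, user, pass);
             Statement st = con.createStatement()) {
            for (String sql : cmds) {
                st.execute(sql);
            }
        }
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList(
            "INSERT INTO TABLENAME VALUES (1, 'George Smith', '312 Your Street', 'Peanutville, NB')",
            "",
            "-- skipped comment line",
            "INSERT INTO TABLENAME VALUES (3, 'Howard Duck', '18 Cotton Picking Lane', 'Flower Hills, NY')");
        System.out.println(readCommands(lines).size()); // prints 2
    }
}
```

The point, as in the job above, is that the flow only carries opaque command strings, so the same logic works for any table.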
Hi, it's better to replace tMySQLRow with the tMySQLOutput or tMySQLBulk* components. The tMySQLOutput component will automatically generate the proper SQL to insert data from your file. If you have several FILES, take a look in the DemoProject (embedded in the product), or in the documentation, at the tFileList component and the Iterate link. Best regards,
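For the several-files case, what tFileList plus an Iterate link does can be approximated in plain Java: collect the files matching a mask and feed each one to the same processing step. The ".sql" mask and the directory handling here are assumptions for illustration.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.List;

public class FileIterator {

    // Return the files in dir whose names end with the given suffix,
    // roughly what tFileList's filemask does.
    static List<File> listFiles(File dir, String suffix) {
        List<File> out = new ArrayList<>();
        File[] entries = dir.listFiles();
        if (entries != null) {
            for (File f : entries) {
                if (f.isFile() && f.getName().endsWith(suffix)) {
                    out.add(f);
                }
            }
        }
        return out;
    }

    public static void main(String[] args) throws IOException {
        File dir = Files.createTempDirectory("sqlfiles").toFile();
        new File(dir, "a.sql").createNewFile();
        new File(dir, "b.sql").createNewFile();
        new File(dir, "notes.txt").createNewFile();
        // "Iterate": each matching file would feed the same subjob.
        for (File f : listFiles(dir, ".sql")) {
            System.out.println("processing " + f.getName());
        }
    }
}
```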
I'm pretty sure that he has a delimited or Excel file. I agree, Volker: use a tFileInputDelimited or tFileInputExcel as input to read data from the file, and send the stream into a tMySQLOutput or bulk component. Best regards,
In my case, I regularly transfer data between two different databases (Informix and DB2) on two separate machines. The tables have the same names and the same structures on both systems. I simply generate the statements, as in my previous post, and execute them. It works fine. At first I thought I would have to write a custom job for each table, but with 800+ tables that would never have been maintainable. With my method, ONE set of jobs does the work for all tables. This way, TOS never has to be concerned with the structure of the data stream; it just passes the command on to the database and lets the database do all the work. Maybe this only works with DB2, though. I haven't tried it on any other databases yet.
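The generate-the-statement step behind this approach can be sketched as follows. The method name and the quoting rules are my own simplification: it handles numbers, strings, and NULLs only, while real cross-database code would also need to cover dates, binary data, and vendor-specific literals.

```java
import java.util.StringJoiner;

public class InsertBuilder {

    // Build a literal INSERT statement from one row of values.
    // Strings are quoted with doubled single quotes; nulls become NULL.
    // A deliberate simplification of what a production job would need.
    static String buildInsert(String table, Object[] row) {
        StringJoiner values = new StringJoiner(", ", "(", ")");
        for (Object v : row) {
            if (v == null) {
                values.add("NULL");
            } else if (v instanceof Number) {
                values.add(v.toString());
            } else {
                values.add("'" + v.toString().replace("'", "''") + "'");
            }
        }
        return "INSERT INTO " + table + " VALUES " + values;
    }

    public static void main(String[] args) {
        System.out.println(buildInsert("TABLENAME",
            new Object[]{1, "George Smith", "312 Your Street", "Peanutville, NB"}));
        // -> INSERT INTO TABLENAME VALUES (1, 'George Smith', '312 Your Street', 'Peanutville, NB')
    }
}
```

Because the output is just a string, one generic job can emit statements for any table, which is why the table count stops mattering.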