Hello all, I am new to the Talend TOS Big Data application. I am excited to think that this product could solve my problem! I could use some help.

I am currently using PostgreSQL and have one database with 39 tables. I would like to migrate this single database to a MongoDB database and, if possible, perform the task across my local network. I have been unable to find any tutorial or quick-start guide on how to do this. Any hints, tips, suggestions, white papers, or other assistance would be greatly appreciated.

I read a post in the forums that recommends using tPostgresqlInput and tMongoDBOutput to perform this operation. I am not sure whether this is correct, and the post did not supply much more information. My operating system is Linux, my PostgreSQL version is 8.1, and I am using version 5.4.1 of TOS Big Data. Thank you in advance for the assistance!
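For context on what the suggested tPostgresqlInput → tMongoDBOutput flow does conceptually, here is a minimal sketch outside of Talend: each relational row becomes a MongoDB document whose keys are the column names. The table name, column names, and sample data below are hypothetical, not taken from the actual database.

```python
def rows_to_documents(columns, rows):
    """Convert query result rows (tuples) into MongoDB-style documents (dicts)."""
    return [dict(zip(columns, row)) for row in rows]

# Hypothetical data standing in for a SELECT on one of the 39 tables.
columns = ["equipment", "location", "serial_no"]
rows = [
    (101, "lab-a", "SN-0001"),
    (102, "lab-b", "SN-0002"),
]

documents = rows_to_documents(columns, rows)
# Each resulting dict could then be inserted into a collection
# (e.g. via pymongo's insert_many), which is essentially what the
# tMongoDBOutput component does with each incoming row.
print(documents[0])  # {'equipment': 101, 'location': 'lab-a', 'serial_no': 'SN-0001'}
```

This is only an illustration of the row-to-document mapping, not a replacement for the Talend job itself.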
Thank you, pcoffre, for your rapid response. The video was most helpful; however, I am still having some difficulties.

I am using tPostgresqlConnection, tFixedFlowInput, tMongoDBOutput, tMongoDBInput, tMongoDBClose, and tLogRow in a manner similar to what is shown in the video. When I run the job I receive errors in the tPostgresqlConnection and tMongoDBInput components, but there is no indication of what the errors actually are, and no errors appear in a Java Debug run or a Traces Debug run.

The schema I am using was entered into the tFixedFlowInput component using information from the PostgreSQL table schema shown by the \d command. For example, the first column is named "equipment", its type is bigint, and it has no modifiers. I added this using the "Edit Schema" button in a manner similar to that shown in the video.

I am wondering whether there is some other component that would take a PostgreSQL table and convert it directly to a MongoDB collection. Is there something I missed in the video, or is there another approach I should take? Thank you again for the help.
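One thing that can silently break a job like this is the type entered in Edit Schema: Talend schemas use Java types, so each PostgreSQL type shown by \d has to be translated (bigint corresponds to Java Long, for instance). The partial mapping below is an illustration based on common JDBC conventions, not an exhaustive or official Talend reference.

```python
# Partial, illustrative mapping from PostgreSQL column types (as shown
# by psql's \d command) to the Java types used in a Talend schema.
PG_TO_TALEND_JAVA = {
    "bigint": "Long",
    "integer": "Integer",
    "smallint": "Short",
    "numeric": "BigDecimal",
    "text": "String",
    "character varying": "String",
    "boolean": "Boolean",
    "timestamp without time zone": "Date",
}

# The "equipment" bigint column from the post would be entered as Long.
print(PG_TO_TALEND_JAVA["bigint"])  # Long
```

If a bigint column were entered as, say, Integer, the mismatch could surface as an unexplained component error at run time.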