I have been working on a job for a little while that is intended to take an XML file filled with "items" and move those items into a MySQL database. The XML file and the MySQL database are both on the same server, and the job is run on that server using an exported run.sh script. The input XML file is set via a reference in the context file "Default.properties".

The problem I'm having is this: the job runs perfectly when I use a sample XML file (a mini version of the one I plan on using, with about 4 "items"), but when the full file (which includes a couple hundred thousand items) is used as the input, it doesn't seem to work. It isn't giving me any error messages, but when I check the database periodically to see how many rows have been copied, the COUNT(*) query returns zero every time.

If anyone has any ideas as to why this is happening, or how I could better monitor what actually IS happening, please post.

P.S. Here are some screenshots of the job (with private information removed, of course - and no, those fields are not blank in the real job design).
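One way to monitor progress is to compare the database row count against the number of "item" elements the input file actually contains. Below is a minimal sketch (not part of your job) that counts `item` start elements with a streaming StAX parser, so it works even on a file with hundreds of thousands of items; the element name `item` is an assumption taken from your description, so substitute whatever your file really uses.

```java
import java.io.FileReader;
import java.io.Reader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class ItemCounter {
    // Count <item> start elements without ever holding the whole
    // document in memory: StAX delivers one parse event at a time.
    static int countItems(Reader in) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance().createXMLStreamReader(in);
        int count = 0;
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "item".equals(r.getLocalName())) {
                count++;
            }
        }
        r.close();
        return count;
    }

    public static void main(String[] args) throws Exception {
        // Pass your XML file's path on the command line.
        try (Reader in = new FileReader(args[0])) {
            System.out.println(countItems(in));
        }
    }
}
```

Once you know the expected total, running `SELECT COUNT(*)` periodically tells you whether rows are trickling in slowly or not arriving at all - two very different failure modes.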
I did do that, but the large XML file still takes a long time and the count remains at zero. I actually split the file in half and tried the job on each half separately, and it worked fine, but when I try it on the file as a whole, it gets caught somewhere.
It seems the problem has been memory. The default heap limit being passed to Java was 1 GB, and I was running the job on a virtual server with only 1 GB of memory; we've since moved it to one with 4 GB. We also raised the limit to 2 GB, but that seems like a very high requirement (the job hasn't broken that maximum yet, but it does indeed require over 1 GB). Does anyone have any ideas on shaving off some of the memory usage?
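A heap requirement that grows with the input file usually means the XML is being parsed as a whole document tree (DOM-style) before any rows are written. The sketch below contrasts that with a streaming approach: it pulls one `item` element's text at a time and hands it off for insertion, so memory stays roughly constant regardless of file size. This is a generic illustration, not your job's generated code; the element names and the `handleItem` hook are assumptions for the example.

```java
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StreamingLoader {
    // Process each <item>'s text content as soon as it is parsed,
    // instead of materializing the whole file. In a real loader,
    // handleItem would add the value to a batched JDBC insert.
    static List<String> loadItems(Reader in) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance().createXMLStreamReader(in);
        List<String> handled = new ArrayList<>();
        StringBuilder text = null;
        while (r.hasNext()) {
            switch (r.next()) {
                case XMLStreamConstants.START_ELEMENT:
                    if ("item".equals(r.getLocalName())) {
                        text = new StringBuilder();
                    }
                    break;
                case XMLStreamConstants.CHARACTERS:
                    if (text != null) {
                        text.append(r.getText());
                    }
                    break;
                case XMLStreamConstants.END_ELEMENT:
                    if ("item".equals(r.getLocalName()) && text != null) {
                        handleItem(text.toString(), handled);
                        text = null; // release per-item state immediately
                    }
                    break;
            }
        }
        r.close();
        return handled;
    }

    // Hypothetical per-item hook: a real job would execute an INSERT here.
    static void handleItem(String value, List<String> sink) {
        sink.add(value);
    }

    public static void main(String[] args) throws Exception {
        String xml = "<items><item>a</item><item>b</item></items>";
        System.out.println(loadItems(new StringReader(xml)));
    }
}
```

If the tool generating the job offers a streaming or "loop on element" parsing mode, enabling it should have the same effect and let you drop the heap limit well below 2 GB. Committing in batches (say, every few thousand rows) rather than in one transaction at the end would also explain why COUNT(*) stayed at zero until the very end of the run.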