Wrong row count in CSV file when the file has nearly 10k rows.

One Star

Wrong row count in CSV file when the file has nearly 10k rows.

Hi All,
I have written a job where I get data from our client's server into a CSV file and later load that data into Salesforce using the bulk component. The problem is that when the CSV file has 10k+ (or nearly 10k) rows, the job executes but only 200-300 rows are inserted/loaded into the system, and it does not give any error. I then tried to insert the data through the Apex Data Loader, and there it also shows that the file contains only 200/300 rows. Finally, I copied all the rows and pasted them into another file, and then it showed 10k+ rows. I don't know what the problem is; could anybody give me a solution for this?
One Star

Re: Wrong row count in CSV file when the file has nearly 10k rows.

Not without seeing the data.
One Star

Re: Wrong row count in CSV file when the file has nearly 10k rows.

Yes. Regarding the data: I am getting that file in one subjob, and after that subjob is completed I pass the file to another subjob where I use the Salesforce bulk insert component.
One Star

Re: Wrong row count in CSV file when the file has nearly 10k rows.

Make sure the utility you are using to count the rows does not have word wrap turned on.
Assuming you are working on Windows, you can do the following:
Open your data file in Notepad, go to Format --> Word Wrap (uncheck it), press CTRL + End, then press CTRL + G.
That should show the number of rows (lines) in the flat file. Hopefully it matches the number of rows in the database.
If it does not match, please come back to us with screenshots of the job.
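If you prefer a programmatic cross-check, here is a minimal Java sketch (the file path is just a placeholder; point it at the generated CSV) that counts physical lines the same way the Notepad trick does:

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class LineCount {
    public static void main(String[] args) throws Exception {
        // Placeholder path - replace with the CSV your job generates.
        try (Stream<String> lines = Files.lines(Paths.get("C:/data/output.csv"))) {
            // Counts physical line breaks, exactly what the
            // CTRL + End / CTRL + G trick in Notepad reports.
            System.out.println("Physical lines: " + lines.count());
        }
    }
}

Note that this counts physical lines only; a quote-aware CSV parser may report a lower number of logical records if any field contains a line break.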
--
Regards,
Vinod
One Star

Re: Wrong row count in CSV file when the file has nearly 10k rows.

Not without seeing the data.

I would need to see the data to comment on why it doesn't work.
Perhaps you have duplicate keys. Are you checking for rejects in the output?
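If it would help to rule out duplicate keys before the bulk insert, here is a rough Java sketch; it assumes the key is in the first column, the file has a header row, and the data is simple comma-separated text with no embedded commas or quotes (all placeholder assumptions, adjust to your schema):

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.HashMap;
import java.util.Map;

public class DuplicateKeyCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder path and assumptions: key in column 0, header row present,
        // no commas or quotes inside fields.
        Map<String, Integer> counts = new HashMap<>();
        try (BufferedReader in = new BufferedReader(new FileReader("C:/data/unique.csv"))) {
            String line = in.readLine(); // skip the header row
            while ((line = in.readLine()) != null) {
                String key = line.split(",", -1)[0];
                counts.merge(key, 1, Integer::sum);
            }
        }
        counts.forEach((key, n) -> {
            if (n > 1) {
                System.out.println("Duplicate key: " + key + " (" + n + " rows)");
            }
        });
    }
}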
One Star

Re: Wrong row count in CSV file when the file has nearly 10k rows.

Here is the image...
I don't know why it is not uploading the images, but I'll still try to explain to the best of my ability.
1. I'm getting a CSV file with 10015 rows. If I open it, it also shows 10015 rows.
2. I'm passing this file to a second subjob where I remove the duplicate data/rows.
3. Now I finally have a CSV file with only unique values/rows, and I'm using this file to insert the data through the Salesforce bulk component.
4. But only 215 rows are inserted, and there is no rejected file and no error in the job.
5. This job works fine up to 8-9k rows, but for 10k it gives this problem.
6. If I try to insert the data through the Apex Data Loader using that final CSV file, it shows 8k+ rows, but when I open the file it shows 10k+ rows. So I copied all the rows from that file into another file, say "newfile", and tried to load the data using newfile through the Apex Data Loader; then it showed 10k+ rows. We are getting the correct data, but the generated CSV file is not a proper one. Please let me know what the problem is.
In the CSV component I only checked "Include header" and "Append", that's it.
One Star

Re: Wrong row count in CSV file when the file has nearly 10k rows.

Hi,
I think you forgot to embed the image. However, if you are removing duplicates using tUniqRow, you can route the Duplicates flow to another output file (tFileOutputDelimited) and then check whether you have all the data (uniques + duplicates).
e.g.
tRunJob --> tUniqRow --(Uniques)--> Salesforce bulk
                |
           (Duplicates)
                |
                v
      tFileOutputDelimited (rejected data)
Regarding the Apex Data Loader, I am not very familiar with it, but as I understand it, each tool can interpret the same file in different ways, depending on its capabilities. Some tools are smart enough to handle special characters and enter (line-break) characters that appear in the middle of a field, while others will simply read such a row as two separate lines. I feel this is the reason for the inconsistent number of rows when the same file is read by different tools.
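To illustrate the point with made-up data, here is a small Java sketch showing how a plain line count and a quote-aware record count disagree as soon as a quoted field contains a line break:

public class RecordVsLineCount {
    public static void main(String[] args) {
        // Dummy data: two logical CSV records, but the second one has a
        // line break inside a quoted field, so it spans two physical lines.
        String csv = "Id,Description\n"
                   + "001,\"first record\"\n"
                   + "002,\"second record\nwith an embedded line break\"\n";

        long physicalLines = csv.split("\n").length;        // counts every line break
        long logicalRecords = countQuoteAwareRecords(csv);  // respects quoted fields

        System.out.println("Physical lines:  " + physicalLines);   // 4 (header + 3)
        System.out.println("Logical records: " + logicalRecords);  // 3 (header + 2)
    }

    // Minimal quote-aware counter: a newline only ends a record
    // when we are not inside a quoted field.
    static long countQuoteAwareRecords(String csv) {
        long records = 0;
        boolean inQuotes = false;
        for (char c : csv.toCharArray()) {
            if (c == '"') {
                inQuotes = !inQuotes;
            } else if (c == '\n' && !inQuotes) {
                records++;
            }
        }
        return records;
    }
}

If the Salesforce bulk component and the Apex Data Loader split such rows differently, that alone could explain the inconsistent counts you are seeing.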
Please attach the job and, if possible, provide some dummy data so the issue can be addressed in a better way.
--
Regards,
Vinod
One Star

Re: Wrong row count in CSV file when the file has nearly 10k rows.

Hi Vinod,
In my job I am doing the same thing you suggested, catching both the rejected and the unique data. I am not getting any duplicate values, and the unique file generated has 10k+ rows. But it only lets me insert 200+ or 300+ rows through the component, and I am not getting any error. I also tried mapping all the rows to an XLS file, and it showed 10k+ rows; when I remapped it to a CSV file (because I can only insert the data through a CSV file, as that component takes only CSV as input), it shows 8750 rows. I don't know what is going wrong. If you want the image, please let me know; I am not able to add it here.