Facing slowness while loading data from MongoDB To Neo4j

Four Stars


Hi,

I am trying to load data from MongoDB to Neo4j, but the job is running extremely slowly. It inserts only 30K records in ~3 hours, even when loading just one column. I'm using the tMongoDBInput and tNeo4jOutput components for it.

 

We have a requirement to load 50 million records into Neo4j.

 

Can anyone please help?

 

Regards,

Pragya

Community Manager

Re: Facing slowness while loading data from MongoDB To Neo4j

You need to work out where your bottleneck is. Try this:

1) Run your extract query in another querying tool. How quickly does it return all data?

2) Remove the insert functionality and any data processing functionality (or deactivate them), then run just the data query part of the job. What rate do you get?

3) Switch the data processing functionality on, but NOT the insert part. Run it. What rate do you get?

4) Compare the two rates above with the rate you have given us here. That should indicate where your bottleneck is.

 

Once you have found your bottleneck, give us as much info about that section as possible (screenshots, configs, etc.).
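For steps 1 and 2, the raw MongoDB read rate can be sanity-checked outside Talend from the command line. This is just a sketch, assuming mongoexport is installed and that the database, collection, and field names below are placeholders for your own:

```shell
# Time how fast MongoDB alone can stream the records out.
# mydb, mycollection, and myfield are placeholder names.
time mongoexport --db=mydb --collection=mycollection \
    --type=csv --fields=myfield --out=/tmp/extract.csv

# Count the exported rows to work out a rows-per-second rate.
wc -l /tmp/extract.csv
```

If this finishes in seconds rather than hours for the same 30K records, the bottleneck is almost certainly in the processing or Neo4j insert side of the job, not the MongoDB read.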

Six Stars

Re: Facing slowness while loading data from MongoDB To Neo4j

Hi,

You can use tMongoDBBulkLoad: first export all the data to a CSV file, then load the CSV into Neo4j.
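One way to sketch that two-step CSV approach from the command line, assuming mongoexport and cypher-shell are available; the database, collection, field, label, and credential values are all placeholders:

```shell
# Step 1: export the needed field from MongoDB to CSV
# (mydb, mycollection, myfield are placeholder names;
#  mongoexport writes a header row in CSV mode).
mongoexport --db=mydb --collection=mycollection \
    --type=csv --fields=myfield --out=records.csv

# Step 2: batch-load the CSV into Neo4j. The file must sit in
# Neo4j's import directory; neo4j/password are placeholders.
cypher-shell -u neo4j -p password \
  "USING PERIODIC COMMIT 10000
   LOAD CSV WITH HEADERS FROM 'file:///records.csv' AS row
   CREATE (:Record {myfield: row.myfield});"
```

For the full 50 million records, Neo4j's offline neo4j-admin import tool is typically much faster still than transactional LOAD CSV, at the cost of requiring an empty database.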
