One Star

java.io.IOException: DataStreamer Exception:

Hi -
I have TOS DB 6.1.0 installed on my local machine, connecting to a Cloudera Hadoop cluster sitting on another network machine. When trying the tHDFSOutput component, the target file is created with 0 bytes (no actual data) and the job throws the DataStreamer error below. Could you please help me rectify this?
Starting job FirstHDFS_Kiran at 17:10 30/12/2015.
connecting to socket on port 3777
connected
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in component tHDFSOutput_1
java.io.IOException: DataStreamer Exception:
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:708)
Caused by: java.nio.channels.UnresolvedAddressException
 at sun.nio.ch.Net.checkAddress(Unknown Source)
 at sun.nio.ch.SocketChannelImpl.connect(Unknown Source)
 at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
 at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:600)
: org.apache.hadoop.hdfs.DFSClient - DataStreamer Exception
java.nio.channels.UnresolvedAddressException
 at sun.nio.ch.Net.checkAddress(Unknown Source)
 at sun.nio.ch.SocketChannelImpl.connect(Unknown Source)
 at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
 at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:600)
disconnected
: org.apache.hadoop.hdfs.DFSClient - Failed to close inode 43158
java.io.IOException: DataStreamer Exception:
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:708)
Caused by: java.nio.channels.UnresolvedAddressException
 at sun.nio.ch.Net.checkAddress(Unknown Source)
 at sun.nio.ch.SocketChannelImpl.connect(Unknown Source)
 at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
 at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:600)
Job FirstHDFS_Kiran ended at 17:10 30/12/2015.
Kiran G.

29 REPLIES
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi,
Did you use the hostname of your namenode in your tHDFSOutput component? Could you please show us a screenshot of your tHDFSOutput component settings?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: java.io.IOException: DataStreamer Exception:

Is there any update on this? I am also getting the same error with the tHDFSPut component. The file is created, the block size is allocated (128 MB) and replication shows 3; however, the file size is 0 bytes.
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi,
 
On which build version did you get this issue? Could you please show us a screenshot of your tHDFSOutput component settings?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: java.io.IOException: DataStreamer Exception:

Hi,
Talend Version: TOS_BD-20151214_1327-V6.1.1
Screenshot of the tHDFSPut component settings below.
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi padmajauk,
Could you please also show us your HDFS connection? Is it localhost or a server?
Can you confirm that the machine you are running the job on can access your HDFS connection successfully?
Is there any firewall issue on your end?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: java.io.IOException: DataStreamer Exception:

Hi,
I have a pseudo-distributed Apache Hadoop setup on an Amazon EC2 instance, which I am accessing via Talend from my machine.
tHDFSConnection works fine. The tHDFSPut component is even able to create the file in HDFS; it is just that the file content is not copied over (the size is 0 bytes). Also, within the tHDFSPut component settings, I can browse the HDFS directory and see all the files and folders in HDFS.
Does putting file content require any additional ports to be open? 
tHDFSConnection setting screenshot below:
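On the additional-ports question: HDFS writes do not go only through the NameNode. The client asks the NameNode where to put each block and then streams the data directly to a DataNode on its data transfer port (dfs.datanode.address, which defaults to 50010 in Hadoop 2.x), using the address the NameNode advertises for that DataNode. If only the NameNode port (9000 here) is reachable from the client, you get exactly this pattern: browsing works and the file is created, but it stays at 0 bytes. The 50010 default is an assumption to verify against your hdfs-site.xml; a quick check from the client machine would be, for example: telnet your-ec2-public-ip 50010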
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi,
Could you please make sure all ports have been opened in the firewall, and can you telnet to that port successfully? Does this issue only reproduce with the hello2.txt file, or with all your files?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: java.io.IOException: DataStreamer Exception:

Hi,
I can successfully telnet to the EC2 IP on port 9000 from my machine. Is there any other port that I need to open up?
This issue is reproducible with all files.
Thanks,
Padmaja
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi,
Can you successfully retrieve data from your HDFS server? Could you please verify that your cluster is up and running correctly? Are you able to browse the webpage
http://your-namenode-server-ip:50070 and confirm that you have at least one live node?
Have you tried port 8020?
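A few quick checks from the client machine, assuming default ports (adjust to your configuration):
telnet your-namenode-server-ip 8020   (or 9000, whichever port your fs.defaultFS uses)
open http://your-namenode-server-ip:50070 in a browser and check that "Live Nodes" is at least 1
hdfs dfsadmin -report   (only if a Hadoop client is installed and configured on that machine)
Note that these only verify the NameNode side; a 0-byte file with an UnresolvedAddressException usually means the DataNode address advertised by the NameNode cannot be resolved from the client.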

Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi,
Can you successfully retrieve data from your HDFS server?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: java.io.IOException: DataStreamer Exception:

Hi,
Yes, I can create and retrieve files successfully using Hadoop commands. However, through the tHDFSPut component I am not able to create files (the file gets created with 0 bytes). The exception is:
Exception in component tHDFSPut_1
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in component tHDFSPut_1
java.io.IOException: DataStreamer Exception: 
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:578)
Caused by: java.nio.channels.UnresolvedAddressException
at sun.nio.ch.Net.checkAddress(Unknown Source)
at sun.nio.ch.SocketChannelImpl.connect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1577)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1318)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1271)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:464)
: org.apache.hadoop.hdfs.DFSClient - DataStreamer Exception
java.nio.channels.UnresolvedAddressException
at sun.nio.ch.Net.checkAddress(Unknown Source)
at sun.nio.ch.SocketChannelImpl.connect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1577)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1318)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1271)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:464)
Regards,
Padmaja
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi,
In your tHDFSConnection, could you please use the hostname instead of the IP address for your namenode?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: java.io.IOException: DataStreamer Exception:

Hi,
I tried that, but it didn't help.
By the way, I exported the Talend job to the EC2 instance and executed it there with the generated shell script, and it worked fine! Could it be that on EC2 the job uses the native Hadoop library, which works fine, whereas from my Windows machine it falls back to the built-in Java classes, which could have some bugs?
Regards,
Padmaja
Four Stars

Re: java.io.IOException: DataStreamer Exception:

Hi,
I have the same issue using Talend BOS 6.1.1 and tHDFSOutput with a Hortonworks 2.3.0 sandbox running in a VM. An empty file is created instead of a file with 100 rows.
When I upload a file manually, it works fine.
Is there a way to resolve this?

Re: java.io.IOException: DataStreamer Exception:

Hi,
Did anyone get a resolution for the empty file creation problem? I have Talend running on Windows and Hadoop running in a VM. While running the Hadoop job to import a file into Hadoop, it created an empty file with no content.
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi,
Have you tried to delete these empty files by using the tFileDelete component to see if it works?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: java.io.IOException: DataStreamer Exception:

Is the issue resolved? I am facing a similar issue and have seen many posts about it, but none of them contains the steps to resolve it. I am using Talend Open Studio for Big Data (6.1).
We are getting an error while copying a file from a local directory to an HDFS location; I have included the error below. We tried to upload a local file using the tHDFSPut component, but we encountered the error every time. Basically, we are not able to connect properly to the HDFS location.
The error log:
Starting job talend_job_2 at 17:58 10/05/2016.
 
connecting to socket on port 3846
connected
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.hadoop.hdfs.DFSClient - DataStreamer Exception
java.nio.channels.UnresolvedAddressException
      at sun.nio.ch.Net.checkAddress(Unknown Source)
      at sun.nio.ch.SocketChannelImpl.connect(Unknown Source)
      at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
      at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622)
      at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420)
      at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373)
      at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:594)
Exception in component tHDFSPut_1
java.io.IOException: DataStreamer Exception:
      at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:708)
Caused by: java.nio.channels.UnresolvedAddressException
      at sun.nio.ch.Net.checkAddress(Unknown Source)
      at sun.nio.ch.SocketChannelImpl.connect(Unknown Source)
      at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
      at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1622)
      at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1420)
      at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1373)
      at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:594)
disconnected
Job talend_job_2 ended at 17:58 10/05/2016.  
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi mSutrishna,
Could you please show us your HDFS connection also? Localhost or server?
Can you confirm that the machine you are running the job on can access your HDFS connection successfully?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Twelve Stars

Re: java.io.IOException: DataStreamer Exception:

I suspect that this might be the cause of your issue....
https://www.talendforge.org/forum/viewtopic.php?id=50662

Rilhia Solutions

Re: java.io.IOException: DataStreamer Exception:

A few more details on this same issue.
1) I have all the hostnames and IPs mapped correctly on my Windows machine and on the CentOS EL6 box (on which CDH 5.5.1 is running).
2) I also added hosts entries on Windows, so I can ping the VMs running Hadoop by hostname and IP.
3) The IP of my Hadoop node is 192.168.142.156 (provided by VMware Workstation) and my client Windows box is 192.168.1.37. All machines can ping each other by name and IP.
4) I configured the hdfs/yarn/mapred/core XML files with the IP address 192.168.142.156 (this is a one-node pseudo cluster). All the XMLs are configured properly.
5) From Talend, checking the service status of the ResourceManager and NameNode works successfully.
6) When I run the job in Talend with the datanode hostname option checked, I get an error in the logs stating "/data/testing cannot be replicated to 0 nodes instead of minimum 1 node; there are 1 datanodes running and 1 excluded from the operation". An empty /data/testing file is created.
Now I uncheck the datanode hostname option in Talend and delete the empty /data/testing file in Hadoop. When I run the job again, I get org.apache.hadoop.fs.FileAlreadyExistsException: /data/testing for client 192.168.142.1 already exists, as this empty file is already created in HDFS, and again an empty /data/testing file is created. I don't know why it states that /data/testing already exists, as I deleted that file before running the job. Is it something like multiple mappers or reducers trying to do this and causing the error?
I also ran the job after unchecking the same option.
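A note on the two symptoms above, under the assumption that the DataNode is advertising an address the Windows client cannot reach: "could only be replicated to 0 nodes ... 1 node(s) excluded" means the NameNode accepted the file creation (hence the empty /data/testing entry) but the client could not open a write pipeline to the single DataNode, so that DataNode was excluded. The FileAlreadyExistsException on the next run is then usually just the leftover empty file (or a lease still held on it) from the previous failed attempt, not multiple mappers or reducers racing. Two things worth checking from the Windows box: that 192.168.142.156 is reachable on the DataNode data transfer port (default 50010, see dfs.datanode.address), e.g. telnet 192.168.142.156 50010, and that the DataNode hostname shown on the NameNode web UI (port 50070) resolves via the Windows hosts file.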
One Star

Re: java.io.IOException: DataStreamer Exception:

Add an entry in the hosts file for the IP address of the Hadoop namenode you are trying to connect to, and use the hostname instead of the IP address in the HDFS connection parameter.
On Windows, the hosts file is at %systemroot%\system32\drivers\etc\hosts (press Win+R and open that path); add an entry of the IP followed by the hostname.
ex: 192.168.1.1 hostname
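If the DataNodes register under their own hostnames, they typically need hosts entries as well, not just the NameNode. For completeness, here is a minimal sketch of what the Talend HDFS components do under the hood, using the plain Hadoop FileSystem API; the hostname, port, path and the dfs.client.use.datanode.hostname property (the equivalent of Talend's "use datanode hostname" option) are placeholders/assumptions to adapt to your cluster:

import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // NameNode URI: use the hostname you added to the hosts file, not the raw IP.
        conf.set("fs.defaultFS", "hdfs://namenode-hostname:8020");
        // Connect to DataNodes by hostname rather than by the (possibly private/NATed)
        // IP they registered with; those hostnames must also resolve on this machine.
        conf.set("dfs.client.use.datanode.hostname", "true");

        try (FileSystem fs = FileSystem.get(conf);
             FSDataOutputStream out = fs.create(new Path("/tmp/hello_from_client.txt"))) {
            // This write is what opens the block pipeline to a DataNode;
            // listing directories or just creating the file only talks to the NameNode.
            out.write("hello hdfs\n".getBytes(StandardCharsets.UTF_8));
        }
    }
}

If this standalone class fails with the same UnresolvedAddressException, the problem is purely name resolution/network visibility from the client and not the Talend components themselves.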
One Star

Re: java.io.IOException: DataStreamer Exception:

Hi,
I am using version 6.2 and facing the following error.
Exception in component tHDFSOutput_1
java.io.IOException: No FileSystem for scheme: https
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2644)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:160)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:157)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:157)
at mydemo.incremental_load_0_1.incremental_load.tOracleInput_1Process(incremental_load.java:409)
at mydemo.incremental_load_0_1.incremental_load.runJobInTOS(incremental_load.java:827)
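This is a different problem from the DataStreamer exception above: "No FileSystem for scheme: https" means the URI given to the HDFS connection starts with https:// and Hadoop has no FileSystem implementation registered for that scheme. The NameNode URI in tHDFSConnection/tHDFSOutput normally uses the hdfs scheme, for example (hostname and port are placeholders; match the fs.defaultFS from your core-site.xml):
"hdfs://namenode-hostname:8020"
If you actually need to go over HTTP(S), that is WebHDFS, which uses the webhdfs:// (or swebhdfs:// for SSL) scheme rather than a plain https URL.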
Six Stars

Re: java.io.IOException: DataStreamer Exception:

I was facing the same issue; after a lot of research, the solution above (adding the IP to the hosts file, as suggested) resolved it.
Thanks, Murali_24.

Re: java.io.IOException: DataStreamer Exception:

Hi, I'm using TOS 6.2 on Windows and the Hortonworks HDP 2.4 Sandbox on Azure.
I'm facing the same issue; I tried the IP-to-hostname mapping in the hosts file but had no luck. Any other suggestions?
Exception in component tHDFSOutput_1
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /out.csv could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
: org.apache.hadoop.hdfs.DFSClient - DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /out.csv could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.

: org.apache.hadoop.hdfs.DFSClient - Failed to close inode 24607
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /out.csv could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
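Since the sandbox runs in a VM on Azure, a likely cause (an assumption to verify against your setup) is that the single DataNode registers with a hostname or private IP that your Windows machine cannot reach, so the NameNode accepts the file but excludes that DataNode from the write pipeline. Things worth trying: enable the "Use datanode hostname" option on the Talend HDFS components, map the hostname the sandbox reports (visible on the NameNode web UI, port 50070) to the Azure public IP in your hosts file, and make sure the DataNode data transfer port (default 50010, per dfs.datanode.address) is opened/forwarded in the Azure network security group alongside 8020 and 50070.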
Moderator

Re: java.io.IOException: DataStreamer Exception:

Hi sandiphs@gmail.com,
Is your network OK? Have you already edited your cluster hosts file and your client hosts file so they can reach your server?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: java.io.IOException: DataStreamer Exception:

Please, can you help me? I get this error when I execute my job:
Exception in component tMap_1_TMAP_OUT
java.lang.RuntimeException: java.io.IOException
at pfe_project.dimension1_0_1.DIMENSION1$beneassStruct.readData(DIMENSION1.java:16977)
at org.talend.designer.components.lookup.persistent.PersistentLookupManager.next(PersistentLookupManager.java:157)
at pfe_project.dimension1_0_1.DIMENSION1.tOracleInput_5Process(DIMENSION1.java:14324)
at pfe_project.dimension1_0_1.DIMENSION1.runJobInTOS(DIMENSION1.java:20320)
at pfe_project.dimension1_0_1.DIMENSION1.main(DIMENSION1.java:20177)
Caused by: java.io.IOException
at org.jboss.serial.persister.RegularObjectPersister.readSlotWithMethod(RegularObjectPersister.java:107)
at org.jboss.serial.persister.RegularObjectPersister.defaultRead(RegularObjectPersister.java:269)
at org.jboss.serial.persister.RegularObjectPersister.readData(RegularObjectPersister.java:241)
at org.jboss.serial.objectmetamodel.ObjectDescriptorFactory.readObjectDescriptionFromStreaming(ObjectDescriptorFactory.java:412)
at org.jboss.serial.objectmetamodel.ObjectDescriptorFactory.objectFromDescription(ObjectDescriptorFactory.java:82)
disconnected
at org.jboss.serial.objectmetamodel.DataContainer$DataContainerDirectInput.readObject(DataContainer.java:643)
at org.jboss.serial.io.JBossObjectInputStream.readObjectOverride(JBossObjectInputStream.java:163)
at java.io.ObjectInputStream.readObject(Unknown Source)
at pfe_project.dimension1_0_1.DIMENSION1$beneassStruct.readData(DIMENSION1.java:16966)
... 4 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.jboss.serial.persister.RegularObjectPersister.readSlotWithMethod(RegularObjectPersister.java:103)
... 12 more
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Unknown Source)
at java.lang.Class.getDeclaredField(Unknown Source)
at org.jboss.serial.references.FieldPersistentReference.rebuildReference(FieldPersistentReference.java:46)
at org.jboss.serial.references.PersistentReference.get(PersistentReference.java:91)
at org.jboss.serial.classmetamodel.ClassMetadataField.getField(ClassMetadataField.java:62)
at org.jboss.serial.persister.RegularObjectPersister.readSlotWithFields(RegularObjectPersister.java:308)
at org.jboss.serial.persister.ObjectInputStreamProxy.defaultReadObject(ObjectInputStreamProxy.java:78)
at java.math.BigDecimal.readObject(Unknown Source)
... 16 more
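This last trace is a different issue from the HDFS one: java.lang.OutOfMemoryError: Java heap space during a tMap persistent lookup means the job ran out of heap. As a first, hedged suggestion (the right values depend on your data volume and machine), raise the JVM heap for the job under the Run tab > Advanced settings > JVM arguments, for example:
-Xms256m -Xmx2048m
and, if the lookup data is large, keep the tMap lookup's "Store temp data" option pointed at a temp directory with enough disk space.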
Four Stars

Re: java.io.IOException: DataStreamer Exception:

Thanks @rhall_2_0. It works for me.