Talend Routine Not working for POIFileSystem


I have a requirement to process a password-protected .xlsx file. I have used the code below:

import java.io.File;
import java.security.GeneralSecurityException;
import org.apache.poi.openxml4j.opc.OPCPackage;
import org.apache.poi.poifs.crypt.Decryptor;
import org.apache.poi.poifs.crypt.EncryptionInfo;
import org.apache.poi.poifs.filesystem.POIFSFileSystem;

// Open the OLE2 container that wraps an encrypted .xlsx
POIFSFileSystem fileSystem = new POIFSFileSystem(new File(excelFilePath));

EncryptionInfo info = new EncryptionInfo(fileSystem);
Decryptor d = Decryptor.getInstance(info);

try {
    if (!d.verifyPassword("xyz")) {
        throw new RuntimeException("Unable to process: document is encrypted");
    }
} catch (GeneralSecurityException ex) {
    throw new RuntimeException("Unable to process encrypted document", ex);
}

// The decrypted stream is plain OOXML, so hand it to OPCPackage
OPCPackage dataStream = OPCPackage.open(d.getDataStream(fileSystem));


I am getting this error:

The supplied data appears to be in the Office 2007+ XML. You are calling the part of POI that deals with OLE2 Office Documents. You need to call a different part of POI to process this data (eg XSSF instead of HSSF)


The same code works fine in Eclipse, but it is not working in Talend.
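That particular error usually means POIFSFileSystem was handed a plain zip-based .xlsx (which begins with the zip signature "PK") instead of the OLE2 container (signature D0 CF 11 E0) that password-encrypted workbooks are wrapped in. One quick way to see which kind of file the Talend job is actually reading is to sniff the magic bytes. A minimal sketch — `FormatSniffer` is a hypothetical helper, not part of POI:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class FormatSniffer {

    // Classify a header by its magic bytes.
    static String sniff(byte[] header) {
        if (header.length >= 4
                && (header[0] & 0xFF) == 0xD0 && (header[1] & 0xFF) == 0xCF
                && (header[2] & 0xFF) == 0x11 && (header[3] & 0xFF) == 0xE0) {
            return "OLE2";   // encrypted .xlsx or legacy .xls -> POIFSFileSystem
        }
        if (header.length >= 2 && header[0] == 'P' && header[1] == 'K') {
            return "OOXML";  // plain zip-based .xlsx -> OPCPackage/XSSF directly
        }
        return "unknown";
    }

    // Convenience overload: read only the first few bytes of the file.
    static String sniff(Path file) throws IOException {
        try (InputStream in = Files.newInputStream(file)) {
            byte[] header = new byte[8];
            int n = in.read(header);
            return sniff(Arrays.copyOf(header, Math.max(n, 0)));
        }
    }
}
```

If this prints "OOXML" for the file Talend is reading, the workbook is not actually encrypted (or the wrong file is being opened) and it should go straight to OPCPackage/XSSF, not POIFSFileSystem.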

I have used the following JARs in Eclipse and Talend:
Please suggest whether any JAR should be updated or added for the code to work in Talend.
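Since the same code behaves differently in Eclipse and Talend, a stale or duplicate POI jar on Talend's classpath is a likely cause. Printing where a class was actually loaded from shows which jar wins. A minimal sketch — `JarLocator` is a hypothetical helper, and the POI class name in the comment is only an example of what you would pass:

```java
import java.security.CodeSource;

public class JarLocator {

    // Report the jar (or classpath entry) a class was loaded from.
    static String locationOf(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            return src == null ? "(bootstrap / JDK)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "(not on classpath)";
        }
    }

    public static void main(String[] args) {
        // In a Talend routine you would pass POI classes, e.g.
        // "org.apache.poi.poifs.filesystem.POIFSFileSystem".
        System.out.println(locationOf("java.lang.String"));
        System.out.println(locationOf("no.such.Clazz"));
    }
}
```

Running this inside the Talend job and inside Eclipse, then comparing the printed jar paths for the POI classes, makes a version mismatch immediately visible.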




Re: Talend Routine Not working for POIFileSystem


So far, Talend doesn't handle passwords in its Excel components, so it is impossible to read a file that can only be accessed with a password.

We understand that the Apache POI library supports password decryption in various ways.

There is a custom component, tFileExcelWorkbookOpen, written by a Talend community user and shared on the Talend Exchange portal. It supports password-protected files.

Hope it sheds some light for you.

Best regards


