Hi all, I can replicate this error message, but to me it's not an issue. You get this behaviour when you put the bucket name plus the sub-folder containing your files into the bucket name field. To avoid the error, use the key prefix field to specify the file to process, or just put the sub-folder name where your files are located there. To me this is more a design question than a bug, and there are several ways around the error message. See the screenshots of the configuration I use in my tS3List component: list all the files in a folder, or filter on one file.
Has anyone resolved this? Amazon S3 limits the "GET" (list objects) operation on a bucket to 1,000 keys per request, and the "max-keys" parameter can only lower that limit, not raise it above 1,000. The remaining option is to set a "marker", which starts the next listing at the 1,001st object in lexicographical order, and to page through the results request by request.
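The marker-based paging described above can be sketched as follows. This is a minimal illustration of the looping logic only: `fetch_page` is a hypothetical stand-in for a real S3 list-objects request (which an SDK would perform), and the in-memory key list simulates the bucket contents.

```python
# Sketch of S3 "marker" pagination: each request returns at most
# max-keys (1,000) objects; pass the last key of a page as the
# marker so the next request starts just after it.
MAX_KEYS = 1000
ALL_KEYS = [f"obj-{i:05d}" for i in range(2500)]  # simulated bucket contents

def fetch_page(marker=""):
    """Stand-in for a list-objects call: return up to MAX_KEYS keys
    lexicographically after `marker`, plus a truncation flag."""
    remaining = [k for k in ALL_KEYS if k > marker]
    return remaining[:MAX_KEYS], len(remaining) > MAX_KEYS

def list_all_keys():
    keys, marker = [], ""
    while True:
        page, truncated = fetch_page(marker)
        keys.extend(page)
        if not truncated:
            return keys
        marker = page[-1]  # next request resumes after the last key seen

print(len(list_all_keys()))  # all 2,500 keys, fetched in three pages
```

With a real SDK the shape is the same: repeat the list call, feeding the last returned key (or the continuation token, in the V2 API) back in until the response is no longer truncated.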