Java - AWS - too many open files

Question

I'm using the Java AWS SDK to download a large number of files from one S3 bucket, edit them, and copy them back to a different S3 bucket.

I think it's supposed to work fine, but there is one line that keeps throwing exceptions:

when I use

    myClient.getObject(myGetObjectRequest, myFile)

I get an AmazonClientException saying there are too many open files.

Now, each time I download a file, edit it, and copy it back to the bucket, I delete the temporary file I created. I'm assuming it takes a few milliseconds for the delete to complete, and maybe that's why I'm getting these errors. Or is it because of open files on Amazon's side?

Anyway, I made my application sleep for 3 seconds each time it encounters this exception, so it has time to close the files, but that just takes too much time, even if I take it down to 1 second.

Has anybody encountered this problem? What should I do?

Thanks



Answers
Do you actually call close() at some point on the streams you open? (A java.io.File itself has no close() method, but every InputStream, OutputStream, and S3Object you open holds a file descriptor until it is closed, and leaked descriptors are the usual cause of "too many open files".)
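To illustrate the general fix: try-with-resources closes each handle as soon as the block exits, even on exception, so descriptors never pile up and no sleep is needed. The sketch below uses plain java.io/java.nio (no AWS classes) to keep it self-contained; the same pattern applies to an S3Object's input stream, since S3Object implements Closeable. The file name and contents here are made up for the demo.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempFileLoop {
    public static void main(String[] args) throws IOException {
        // Stand-in for the downloaded temp file (hypothetical contents)
        Path tmp = Files.createTempFile("s3-download-", ".txt");
        Files.write(tmp, "hello".getBytes());

        // try-with-resources guarantees the reader (and its file
        // descriptor) is released as soon as the block exits,
        // even if an exception is thrown inside it.
        StringBuilder contents = new StringBuilder();
        try (BufferedReader r = Files.newBufferedReader(tmp)) {
            String line;
            while ((line = r.readLine()) != null) {
                contents.append(line);
            }
        }

        // Delete the temp file only after the handle is closed;
        // deleting while a stream is still open can fail on some platforms.
        Files.delete(tmp);
        System.out.println("read: " + contents + ", deleted: " + !Files.exists(tmp));
    }
}
```

With the AWS SDK, the same idea would look like `try (S3Object obj = myClient.getObject(request)) { ... }`, which closes the underlying HTTP stream promptly instead of leaving it to the garbage collector.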
