I have a project written for Android devices. It generates a large number of files each day, all text files and images. The app uses a database to reference these files.
The app is supposed to clear up these files after a little use (perhaps after a few days), but this process may or may not be working. This is not the subject of this question.
Due to a historic accident, the organization of the files is somewhat naive: everything lives in a single .hidden directory, which contains a zero-byte .nomedia file to prevent the MediaScanner from indexing it.
Today, I am seeing an error reported:
java.io.IOException: Cannot create: /sdcard/.hidden/file-4200.html
at java.io.File.createNewFile(File.java:1263)
Regarding the SD card, I see it has plenty of storage left, but counting the files in the directory gives:
$ cd /Volumes/NO_NAME/.hidden
$ ls | wc -w
9058
Deleting a number of files seems to have allowed the file creation for today to proceed.
Regrettably, I did not try touching a new file to reproduce the error on the command line; I also deleted several hundred files rather than a handful.
However, my question is: is there a limit on the number of files in a single directory on a FAT-formatted SD card, and could that have caused this error?
Nota Bene: The SD card is as-is - i.e. I haven't reformatted it, so I assume it is some FAT-* variant.
The FAT32 format has a hard file-size limit of 4 GB (well above the file sizes I am dealing with) and a limit on the number of files in the root directory. I am definitely not writing files in the root directory.
There's a limit of 512 entries in the root directory of a FAT filesystem. This limit exists because the root directory lives in a fixed location on FAT filesystems.
For other directories this limit does not apply. Additionally, FAT32 removed the 512-entry limit for the root directory by treating it the same as any other directory.
Using long filenames - i.e. not in 8.3 format - means that a single file uses multiple directory entries.
Some Googling finds some people claiming that a FAT32 directory can have a maximum of 65,536 entries (which would be fewer files if they had long file names). However, none of the sources that mentioned this limit seemed that reliable so I thought I'd test this.
I wrote a script which creates files with 30-character filenames, meaning each file would need 4 directory entries. When the script got to file 16,384 it failed with an IO error and I couldn't create more files in my test directory. So this does seem to validate the 65,536 entry limit.
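For the record, a reconstruction of that test script might look like the sketch below. The original script isn't shown, so the function name and structure here are my own; the 4-entries-per-file figure assumes VFAT long filenames, which store 13 characters per LFN entry plus one 8.3 entry per file.

```python
import os
import tempfile

def fill_directory(path, limit, name_len=30):
    """Create files with fixed-length names until creation fails.

    With VFAT long filenames, a 30-character name needs ceil(30 / 13) = 3
    LFN entries plus one short (8.3) entry, i.e. 4 directory entries.
    Returns the number of files successfully created.
    """
    created = 0
    for i in range(limit):
        # Zero-pad so every name is exactly name_len characters long.
        name = str(i).zfill(name_len)
        try:
            open(os.path.join(path, name), "w").close()
        except OSError:
            break  # on FAT32 this fires when the directory is full
        created += 1
    return created

# Demo on a temporary (non-FAT) directory; on a FAT32 mount the loop
# would instead stop near 65536 / 4 = 16384 files.
print(fill_directory(tempfile.mkdtemp(), 5))
```

Run against a directory on an actual FAT32 mount with a large limit, the count at which it stops reveals the per-directory entry budget.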
If you're hitting this limit at 9,000 files then your files must be using at least 7 entries each which corresponds to filenames that are at least 66 characters long. Does this match what you're doing? (Or you could have some short filenames and some very, very long ones, but you get the idea.)
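The arithmetic behind that estimate can be checked directly. This sketch assumes VFAT long-filename storage (13 characters per LFN entry, plus one 8.3 entry per file); the helper name is mine, not from any API:

```python
import math

def dir_entries(name_len):
    """Directory entries consumed by one file whose long filename is
    name_len characters: ceil(name_len / 13) LFN entries + the 8.3 entry."""
    return math.ceil(name_len / 13) + 1

# 30-character names cost 4 entries each, so a 65,536-entry directory
# fills up after 65536 / 4 = 16384 files.
print(dir_entries(30), 65536 // dir_entries(30))   # prints: 4 16384

# Hitting the limit at ~9,058 files implies at least 7 entries per file,
# i.e. names of at least 66 characters (ceil(66 / 13) = 6 LFN entries).
print(dir_entries(66), 65536 // dir_entries(66))   # prints: 7 9362
```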
I think the limit on the number of files in a directory in FAT32 also depends on the length of the filenames:
http://www.microsoft.com/whdc/system/platform/firmware/fatgen.mspx