
How can I open files containing accents in Java?

(editing for clarification and adding some code)

Hello, we have a requirement to parse data sent from users all over the world. Our Linux systems have a default locale of en_US.UTF-8. However, we often receive files with diacritical marks in their names, such as "special_á_ã_è_characters.doc". The OS handles these files fine, and an strace shows the OS passing the correct file name to the Java program, but Java munges the names and throws a "file not found" IOException when trying to open them.

This simple program can illustrate the issue:

import java.io.*;
import java.text.*;

public class load_i18n
{
  public static void main( String [] args ) {
    File actual = new File(".");
    for( File f : actual.listFiles()){
      System.out.println( f.getName() );
    }
  }
}

Running this program under the default US English locale, in a directory containing the file special_á_ã_è_characters.doc, gives:

special_�_�_�_characters.doc

Setting the language via export LANG=es_ES@UTF-8 prints the filename correctly (but is an unacceptable solution, since the entire system is now running in Spanish). Explicitly setting the Locale inside the program, as in the code below, has no effect either. Here I've modified the program to a) attempt to open each file and b) print the name both as text and as a byte array when it fails to open the file:

import java.io.*;
import java.util.Locale;
import java.text.*;

public class load_i18n
{
  public static void main( String [] args ) {
    // Stream to read file
    FileInputStream fin;

    Locale locale = new Locale("es", "ES");
    Locale.setDefault(locale);
    File actual = new File(".");
    System.out.println(Locale.getDefault());
    for( File f : actual.listFiles()){
      try {
        fin = new FileInputStream (f.getName());
      }
      catch (IOException e){
        System.err.println ("Can't open the file " + f.getName() + ".  Printing as byte array.");
        byte[] textArray = f.getName().getBytes();
        for(byte b: textArray){
          System.err.print(b + " ");
        }
        System.err.println();
        System.exit(-1);
      }

      System.out.println( f.getName() );
    }
  }
}

This produces the output

es_ES
load_i18n.class
Can't open the file special_�_�_�_characters.doc.  Printing as byte array.
115 112 101 99 105 97 108 95 -17 -65 -67 95 -17 -65 -67 95 -17 -65 -67 95 99 104 97 114 97 99 116 101 114 115 46 100 111 99

This shows that the issue is NOT just a console-display problem, since the same garbled characters show up in the raw byte dump. In fact, console display does work under LANG=en_US.UTF-8 for some utilities, such as bash's echo:

[mjuric@arrhchadm30 tmp]$ echo $LANG
en_US.UTF-8
[mjuric@arrhchadm30 tmp]$ echo *
load_i18n.class special_á_ã_è_characters.doc
[mjuric@arrhchadm30 tmp]$ ls
load_i18n.class  special_?_?_?_characters.doc
[mjuric@arrhchadm30 tmp]$

Is it possible to modify this code in such a way that when run under Linux with LANG=en_US.UTF-8, it reads the file name in such a way that it can be successfully opened?

asked Jun 18 '10 by Mark Juric

2 Answers

First, the character encoding used is not directly related to the locale. So changing the locale won't help much.
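
As a quick illustration that the two are separate settings (the class name ShowDefaults is just for this sketch), you can print both defaults and see that they are reported independently:

import java.nio.charset.Charset;
import java.util.Locale;

public class ShowDefaults
{
  public static void main( String [] args ) {
    // The default locale and the default charset are separate settings;
    // changing one does not automatically change the other.
    System.out.println( "Default locale:  " + Locale.getDefault() );
    System.out.println( "Default charset: " + Charset.defaultCharset() );
    System.out.println( "file.encoding:   " + System.getProperty("file.encoding") );
  }
}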

Second, the � is typical of the Unicode replacement character U+FFFD being printed in ISO-8859-1 instead of UTF-8. Here's a demonstration:

System.out.println(new String("�".getBytes("UTF-8"), "ISO-8859-1")); // ï¿½

So there are two problems:

  1. Your JVM is reading those special characters as the Unicode replacement character �.
  2. Your console is using ISO-8859-1 to display characters.

For a Sun JVM, the VM argument -Dfile.encoding=UTF-8 should fix the first problem. The second problem is to be fixed in the console settings. If you're using Eclipse, for example, you can change it in Window > Preferences > General > Workspace > Text File Encoding. Set it to UTF-8 as well.
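
For example, assuming the load_i18n class from the question, launching it with that flag would look something like:

java -Dfile.encoding=UTF-8 load_i18n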


Update: As per your update:

byte[] textArray = f.getName().getBytes();

That should have been the following, to exclude the influence of the platform default encoding:

byte[] textArray = f.getName().getBytes("UTF-8");

If that still displays the same, then the problem lies deeper. Which JVM exactly are you using? Run java -version. As said before, the -Dfile.encoding argument is Sun JVM specific. Some Linux machines ship with the GNU JVM or OpenJDK's JVM, and that argument may not work there.
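
If running java -version on that machine is inconvenient, the same information is available from standard system properties, so the program can report it itself. A minimal sketch (the class name is just for illustration):

public class ShowJvmInfo
{
  public static void main( String [] args ) {
    // Standard system properties identifying the JVM vendor, implementation and version.
    System.out.println( System.getProperty("java.vendor") );
    System.out.println( System.getProperty("java.vm.name") );
    System.out.println( System.getProperty("java.version") );
  }
}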

answered by BalusC


It is a bug in the JRE/JDK that has existed for years.

How to fix Java when it refuses to open a file with special characters in the filename?

File.exists() fails with unicode characters in name

I am now submitting a new bug report to them, since LC_ALL=en_US fixes some cases but still fails in others.
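
For reference, the locale can also be overridden for just the single java invocation instead of being exported system-wide, which avoids the question's objection about changing the whole environment. A sketch, assuming the en_US.UTF-8 locale from the question is installed:

LC_ALL=en_US.UTF-8 java load_i18n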

answered by Dennis C