I'm having trouble writing out a data.frame with write.csv
using UTF-16 character encoding.
Background: I am trying to write out a CSV file from a data.frame for use in Excel. Excel Mac 2011 seems to dislike UTF-8 (if I specify UTF-8 during text import, non-ASCII characters show up as underscores). I've been led to believe that Excel will be happy with UTF-16LE encoding.
Here's the example data.frame:
> foo
  a  b
1 á 羽
> Encoding(levels(foo$a))
[1] "UTF-8"
> Encoding(levels(foo$b))
[1] "UTF-8"
So I tried to output the data.frame by doing:
f <- file("foo.csv", encoding="UTF-16LE")
write.csv(foo, f)
This gives me an ASCII file that looks like:
"","
If I use encoding="UTF-16", I get a file that only contains the byte-order mark 0xFE 0xFF.
If I use encoding="UTF-16BE", I get an empty file.
This is on a 64-bit version of R 2.12.2 on Mac OS X 10.6.6. What am I doing wrong?
You could simply save the csv in UTF-8 and later convert it to UTF-16LE with iconv in terminal.
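For example, something along these lines should work in a terminal (file names are placeholders; the first printf line just stands in for the csv written from R, and the byte-order mark is prepended so Excel can detect the encoding):

```shell
# stand-in for the csv that was saved as UTF-8 from R
printf '"","a","b"\n' > foo.csv

# prepend a UTF-16LE byte-order mark (0xFF 0xFE, octal \377\376),
# then append the iconv-converted bytes
printf '\377\376' > foo-utf16.csv
iconv -f UTF-8 -t UTF-16LE foo.csv >> foo-utf16.csv
```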
If you insist on doing it in R, the following might work, although it seems that iconv
in R has some issues, see: http://tolstoy.newcastle.edu.au/R/e10/devel/10/06/0648.html
> x <- c("foo", "bar")
> iconv(x,"UTF-8","UTF-16LE")
Error in iconv(x, "UTF-8", "UTF-16LE") :
embedded nul in string: 'f\0o\0o\0'
As you can see, the above linked patch is really needed - I have not tested it, but if you want to keep it simple (and nasty): just call the third-party iconv program inside R with a system
call after saving the table to csv.
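A minimal sketch of that approach (the file names are placeholders, and a command-line iconv on the PATH is an assumption):

```r
# hypothetical example data with non-ASCII characters
foo <- data.frame(a = "\u00e1", b = "\u7fbd", stringsAsFactors = FALSE)

# save as UTF-8 first, then shell out to the third-party iconv program
# to do the UTF-16LE conversion outside of R
write.csv(foo, "foo.csv", row.names = FALSE, fileEncoding = "UTF-8")
system("iconv -f UTF-8 -t UTF-16LE foo.csv > foo-utf16le.csv")
```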
Something like the following might do (write.csv()
simply ignores the encoding, so you have to opt for writeLines()
or writeBin()
) ...
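To see why that works, here is a minimal base-R sketch (file name is a placeholder): if you convert the strings to UTF-8 yourself and write with useBytes = TRUE to a binary connection, the bytes reach the file untouched by any re-encoding:

```r
x <- enc2utf8("a\u00e1")              # force the string into UTF-8
con <- file("utf8-demo.txt", open = "wb")
writeLines(x, con, useBytes = TRUE)   # write the raw UTF-8 bytes as-is
close(con)
```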
#' function to convert character vectors to UTF-8 encoding
#'
#' @param x the vector to be converted
#' @export
toUTF8 <- function(x){
  worker <- function(x){
    # map "unknown" to "" (native encoding), since iconv does not accept it
    from <- Encoding(x)
    if (from == "unknown") from <- ""
    iconv(x, from = from, to = "UTF-8")
  }
  unlist(lapply(x, worker))
}
#' function to write csv files with UTF-8 characters (even under Windows)
#' @param df data frame to be written to file
#' @param file file name / path where to put the data
#' @export
write_utf8_csv <- function(df, file){
  firstline <- paste('"', names(df), '"', sep = "", collapse = " , ")
  char_columns <- seq_along(df[1, ])[sapply(df, class) == "character"]
  for (i in char_columns){
    df[, i] <- toUTF8(df[, i])
  }
  data <- apply(df, 1, function(x){
    paste('"', x, '"', sep = "", collapse = " , ")
  })
  writeLines(c(firstline, data), file, useBytes = TRUE)
}
#' function to read csv files with UTF-8 characters (even under Windows) that
#' were created by write_utf8_csv
#' @param file file name / path of the data to be read
#' @export
read_utf8_csv <- function(file){
  # reading data from file
  content <- readLines(file, encoding = "UTF-8")
  # extracting data
  content <- stringr::str_split(content, " , ")
  content <- lapply(content, stringr::str_replace_all, '"', "")
  content_names <- content[[1]][content[[1]] != ""]
  content <- content[seq_along(content)[-1]]
  # putting it into data.frame
  df <- data.frame(dummy = seq_along(content), stringsAsFactors = FALSE)
  for (name in content_names){
    tmp <- sapply(content, `[[`, dim(df)[2])
    Encoding(tmp) <- "UTF-8"
    df[, name] <- tmp
  }
  # drop the dummy column, keeping a data.frame even if one column remains
  df <- df[, -1, drop = FALSE]
  # return
  return(df)
}