I have a Microsoft Excel (.xlsx) file which I would like to load into R. I've done this before with read.csv(), which always worked fine, but with this file something strange happens. Something goes wrong with one column, which contains a header and large 13-digit numbers: no matter how I load the file into R, the values are converted to scientific notation.
The problem can be reproduced as follows: in Excel, type a column name into the first row of the first column, and a large number, e.g. 6345157890027, into the second row of the same column. Then save the file as .csv. Suppose I now want to open this file in R:
TestData <- read.csv(file = "Test.csv", header = TRUE)
View(TestData)
The number 6345157890027 is now displayed as 6.345158e+12, so it looks as if information is lost during import. I've tried to solve this in several ways, but none of them worked: the value always appears in scientific notation. Does anyone know a way to make sure that either Excel or R does not turn large numbers into scientific notation?
I solved this problem by using the format function, as proposed in another post by rnso (see How to prevent scientific notation in R?):
> xx = 100000000000
> xx
[1] 1e+11
> format(xx, scientific=F)
[1] "100000000000"
This also worked perfectly for the wrongly displayed column in my data frame.
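For completeness, here is a minimal sketch of how the same idea could be applied to the imported data frame. The file name Test.csv matches the example above, but the column name BigNumbers is only a placeholder for illustration; note also that format() returns character strings, so the column is no longer numeric afterwards:

# Read the CSV from the example above; the large number is parsed correctly,
# it is only *printed* in scientific notation.
TestData <- read.csv(file = "Test.csv", header = TRUE)

# Convert the affected column to fixed notation.
# "BigNumbers" is a placeholder column name, not from the original post.
TestData$BigNumbers <- format(TestData$BigNumbers, scientific = FALSE)

View(TestData)  # the values now show all 13 digits

Setting options(scipen = 999) is another common way to discourage scientific notation when printing, but format() keeps the change local to the affected column.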