I'm trying to read a column-oriented CSV file into R as a data frame.
The first line of the file looks like this:
sDATE,sTIME,iGPS_ALT, ...
and each additional line is a measurement:
4/10/2011,2:15,78, ...
When I try to read this into R via
d = read.csv('filename')
I get a duplicate 'row.names' error, because R thinks the first column of the data holds the row names, and since all of the measurements were taken on the same day, the values in the first column never change.
If I put row.names = NULL into the read.csv call, I get an extraneous column d$row.names which corresponds to the sDATE column, and everything is shifted one column over, so d$sDATE contains 2:15 rather than 4/10/2011 as needed.
If I open one of these CSVs in Excel, do nothing, and save it, everything works. But I have to process hundreds of them, so manually saving each one in Excel is not something I want. If there's something I can do programmatically to preprocess these CSVs, in Python or otherwise, that would be great.
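One programmatic route, a minimal sketch in Python using the standard csv module: if the real problem is that the header row has fewer fields than the data rows (which is exactly what makes read.csv treat column 1 as row names), the header can be padded with dummy names. The padded-name pattern "X<n>" here is an assumption, chosen to mimic R's own default column naming.

```python
import csv
import io

def fix_short_header(text):
    """Pad the header row with dummy names so it has as many
    fields as the widest data row; return the repaired CSV text."""
    rows = list(csv.reader(io.StringIO(text)))
    header, data = rows[0], rows[1:]
    width = max(len(r) for r in data)
    # read.csv treats a short header as a signal that the first
    # column holds row names, so pad the header to the data width
    header += ["X%d" % i for i in range(len(header) + 1, width + 1)]
    out = io.StringIO()
    csv.writer(out, lineterminator="\n").writerows([header] + data)
    return out.getvalue()

# example: header names 3 columns, but each data row has 4 fields
raw = "sDATE,sTIME,iGPS_ALT\n4/10/2011,2:15,78,0\n4/10/2011,2:16,80,1\n"
print(fix_short_header(raw))
# first line becomes: sDATE,sTIME,iGPS_ALT,X4
```

Looping this over the hundreds of files (read each one, pass its text through fix_short_header, write it back) would replace the manual Excel round-trip.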
Passing row.names = 1 as an argument tells read.csv that the data file has row names and which column number they are stored in.
When reading a CSV file with pandas, you can rename the column headers with the names parameter, which takes a list of new column names. To keep the old header line from being read in as a data row, also pass the header parameter, which makes pandas replace the old header names with the new ones.
The read.table() method is used to read data from files. These can be .csv files or .
The names parameter of pandas' read_csv function defines the column names. If you pass an extra name in this list, pandas will add another new column with that name filled with NaN values. Pass header=0 when column names already exist in the CSV file, so the old header row is skipped rather than read as data.
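Put together, a minimal pandas sketch, assuming the column layout from the question (the new names "date", "time", and "gps_alt" are placeholders):

```python
import io
import pandas as pd

csv_text = "sDATE,sTIME,iGPS_ALT\n4/10/2011,2:15,78\n4/10/2011,2:16,80\n"

# names supplies the new column headers; header=0 says the file's
# first line is the old header, so it is skipped instead of being
# read in as a data row
df = pd.read_csv(io.StringIO(csv_text),
                 names=["date", "time", "gps_alt"],
                 header=0)
print(df)
```

With header=None instead, pandas would assume the file has no header at all and "sDATE" would show up as the first data value.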
read.csv only assumes there are row names if the header contains fewer values than the other rows. So somehow you are either missing a column name or have an extra column you don't want.
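A quick way to confirm that diagnosis before touching R, sketched with Python's standard csv module (the sample strings are illustrative, not from the actual files):

```python
import csv
import io

def header_is_short(text):
    """Return True if the header row has fewer fields than the
    first data row -- the condition that makes read.csv treat
    the first column as row names."""
    rows = csv.reader(io.StringIO(text))
    header = next(rows)
    first_data = next(rows)
    return len(header) < len(first_data)

# a trailing comma on the data rows is a common culprit: the
# bare comma adds an empty fourth field that the header lacks
print(header_is_short("sDATE,sTIME,iGPS_ALT\n4/10/2011,2:15,78,\n"))  # True
print(header_is_short("sDATE,sTIME,iGPS_ALT\n4/10/2011,2:15,78\n"))   # False
```

Running this across the problem files shows whether they all share the same defect or need case-by-case handling.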