Hi, I'm trying to read into pandas the CSV file you can download from here (Euribor rates; I think you can imagine why I'd like to have this file!). The file is a CSV, but it is strangely oriented: the dates run across the columns rather than down the rows. If you import it into Excel, the file has the format
02/01/2012,03/01/2012,04/01/2012,,,,
1w,0.652,0.626,0.606,,,,
2w,0.738,0.716,0.700,,,,
etc., with the first column going up to 12m (but I have given you the link where you can download a sample). I would like to read it in pandas, but I'm not able to read it in the correct way. Pandas has a built-in function for reading CSV files, but it expects them to be row-oriented rather than column-oriented. What I would like to do is obtain the information on the row labeled 3m, with the values and the dates, in order to plot the time variation of this index. But I can't handle this problem. I know I can read the data with
import pandas
data = pandas.read_csv("file.csv", parse_dates=True)
but it would only work if the CSV file were somehow transposed.
You can also read the rows manually with the standard csv module. Step 1: load the CSV file into a file object using open(). Step 2: create a reader object by passing that file object to csv.reader(). Step 3: loop over the reader object to get each row.
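As a sketch of those steps (the file name and the 3m values here are made up for illustration; with the real file you would pass `open("hist_EURIBOR_2012.csv", newline="")` to `csv.reader`):

```python
import csv
import io

# Sample data shaped like the downloaded file: dates across the top,
# one row per tenor. The 3m numbers are invented for this example.
sample = (
    "02/01/2012,03/01/2012,04/01/2012\n"
    "1w,0.652,0.626,0.606\n"
    "2w,0.738,0.716,0.700\n"
    "3m,1.343,1.335,1.322\n"
)

# Step 1 + 2: wrap the text in a file-like object and build a reader.
rows = list(csv.reader(io.StringIO(sample)))

# Step 3: iterate over the rows to pick out what you need.
dates = rows[0]                                 # first row holds the dates
row_3m = next(r for r in rows if r[0] == "3m")  # the row labeled 3m
values = [float(v) for v in row_3m[1:]]

print(list(zip(dates, values)))
```

This gives you (date, value) pairs without pandas, but you still have to transpose and parse the dates yourself, which is why the DataFrame approach below is more convenient.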
A pandas DataFrame has a .transpose()
method, but it doesn't cope well with all the empty rows in this file. Here's how to get it cleaned up:
import pandas
df = pandas.read_csv("hist_EURIBOR_2012.csv")  # Read the file
df = df[:15]        # Chop off the empty rows beyond 12m
df2 = df.transpose()
df2 = df2[:88]      # Chop off what were empty columns (you will need to increase 88 as more data is added)
Of course, you can chain these together:
df2 = pandas.read_csv("hist_EURIBOR_2012.csv")[:15].transpose()[:88]
Then df2['3m'] is the data you want, but the dates are still stored as strings; I'm not quite sure how to convert the index to a DatetimeIndex.
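For that last step, pandas.to_datetime can parse the string index; since the dates are written day-first (dd/mm/yyyy), you need dayfirst=True. A minimal sketch on a made-up stand-in for df2:

```python
import pandas as pd

# Stand-in for df2 after the transpose: dates as a string index,
# one column per tenor (the values are invented for illustration).
df2 = pd.DataFrame(
    {"3m": [1.343, 1.335, 1.322]},
    index=["02/01/2012", "03/01/2012", "04/01/2012"],
)

# Parse the dd/mm/yyyy strings into a proper DatetimeIndex.
df2.index = pd.to_datetime(df2.index, dayfirst=True)

print(df2["3m"])
# df2["3m"].plot() would now put real dates on the x-axis
```

With a DatetimeIndex in place, plotting df2['3m'] gives a proper time series of the index.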