I have a log file that I tried to read in pandas with read_csv or read_table. Here is an example of the result I get:
0 date=2015-09-17 time=21:05:35 duration=0 etc...
i.e. everything ends up in one column.
I would like to split each row, take the names (like date, time, ...) and convert them to columns so I would get:
date time duration ...
0 2015-09-17 21:05:35 0
Thank you !
I know this is an old post, but I came across this same problem and found a solution. The error Expected n fields in line n, saw n
is probably due to each row having a different number of columns. This method is also not suitable if the ordering of columns differs between rows. I wrote some sample code here which converts your log into JSON and then into a pandas DataFrame.
import pandas as pd
import json

path = 'log_sample.log'
log_data = open(path, 'r')
result = {}
i = 0
for line in log_data:
    columns = line.split()  # or whatever your delimiter/separator is
    data = {}
    for c in columns:
        if '=' not in c:  # skip tokens that aren't key=value pairs
            continue
        key, value = c.split('=', 1)  # split on the first '=' only
        data[key] = value
    result[i] = data
    i += 1
j = json.dumps(result)
df = pd.read_json(j, orient='index')
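As a side note, the JSON round-trip isn't strictly required; assuming the same shape of data (a dict of row-dicts keyed by row number, as built in the loop above), pandas can construct the frame directly:

```python
import pandas as pd

# same shape as the `result` dict built in the loop above
result = {
    0: {"date": "2015-09-17", "time": "21:05:35", "duration": "0"},
    1: {"date": "2015-09-17", "time": "21:05:36", "duration": "0"},
}

# orient='index' treats each top-level key as a row label
df = pd.DataFrame.from_dict(result, orient='index')
```

This avoids serializing and re-parsing the data just to build a DataFrame.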
----- Editing answer to account for inconsistent spacing:
Not sure what the pythonic approach should be, but here's a method that could work.
Using OP's data sample as an example:
0 date=2015-09-17 time=21:05:35 duration=0
1 date=2015-09-17 time=21:05:36 duration=0
2 date=2015-09-17 time=21:05:37 duration=0
3 date=2015-09-17 time=21:05:38 duration=0
4 date=2015-09-17 time=21:05:39 duration=0
5 date=2015-09-17 time=21:05:40 duration=0
I loop through each line and split at the equals sign, then grab the desired text:
import pandas as pd

log_data = open('log_sample.txt', 'r')
split_list = []
for line in log_data:
    thing1 = line.split('=')
    #print(thing1)
    date = thing1[1][:10]
    time = thing1[2][:8]
    dur = thing1[3].strip()  # strip the trailing newline
    split_list.append([date, time, dur])

df = pd.DataFrame(split_list, columns=['date', 'time', 'duration'])
df
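If column ordering can vary between rows (the limitation noted in the first paragraph), a more defensive sketch, assuming whitespace-separated key=value pairs with no leading index, is to build one dict per line and let pandas align the keys into columns:

```python
import pandas as pd

lines = [
    "date=2015-09-17 time=21:05:35 duration=0",
    "time=21:05:36 date=2015-09-17 duration=0",  # different ordering
]

# one dict per row; split each token on the first '=' only
rows = [dict(tok.split("=", 1) for tok in ln.split()) for ln in lines]
df = pd.DataFrame(rows)
```

Because pandas matches on dict keys rather than position, each value still lands in the right column regardless of the order it appeared in the line.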
----- First Answer:
As @jezrael mentions in the comments, you can leverage the "sep" argument within read_csv.
pd.read_csv('test.txt', sep=r'\t', engine='python')