I have a file with many lines. I am reading each line, splitting it into words/numbers, and storing them in a list. After that, I am trying to convert this list into a one-column pandas DataFrame.
However, after running my code I get only one row containing a list of lists. What I need is one column with a variable number of rows, one value per row.
Here is the code snippet I wrote:
from pandas import DataFrame

for line1 in file:
    test_set = []
    test_set.append(next(file).split())
df1 = DataFrame({'test_set': [test_set]})
My output is something like:
test_set
0 [[1, 0, 0, 0, 0, 0, 1, 1, 1, 0]]
But what I want is:
test_set
0 1
1 0
2 0
3 0
4 0
5 0
6 1
7 1
8 1
9 0
Any suggestions on what I'm doing wrong or how I can implement this? Thanks.
Input data sample snippet:
id1 id2 id3 id4
0 1 0 1
1 1 0 0
id10 id5 id6 id7
1 1 0 1
1 0 0 1
.
.
.
You want this instead:
df1 = DataFrame({'test_set': test_set})
There is no need to wrap the list in another list; by doing that you're effectively stating that your DataFrame data is a list with a single element, which is itself another list.
EDIT
Looking at your input data, you can just load it and then construct your df as a single column like so:
In [134]:
# load the data
import io
import pandas as pd
t="""id1 id2 id3 id4
0 1 0 1
1 1 0 0"""
df = pd.read_csv(io.StringIO(t), sep=r'\s+')
df
Out[134]:
id1 id2 id3 id4
0 0 1 0 1
1 1 1 0 0
Now transpose the df and use a list comprehension; this constructs a Series from each row and concatenates them using pd.concat:
In [142]:
pd.concat([df.T[x] for x in df.T], ignore_index=True)
Out[142]:
0 0
1 1
2 0
3 1
4 1
5 1
6 0
7 0
dtype: int64