import pandas as pd
import dateutil
# Load data from csv file
data = pd.DataFrame.from_csv('phone_data.csv')
# Convert the date column from strings to datetimes
data['date'] = data['date'].apply(dateutil.parser.parse, dayfirst=True)
The above code raises the error: AttributeError: module 'pandas' has no attribute 'DataFrame'
I'm new to Python and am working through this tutorial: Summarising, Aggregating, and Grouping data in Python Pandas.
Any suggestions on what could be causing the error? I've noticed others have had the same question, but the proposed solutions don't seem to apply in my case.
The AttributeError "module 'pandas' has no attribute 'DataFrame'" occurs when there is a local file named pandas.py shadowing the installed library, or when DataFrame is misspelled. To solve the error, rename any local file named pandas.py.
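You can confirm whether a local file is shadowing an installed library by printing each module's __file__ attribute. A minimal check, run from the same directory as your script:

import pandas
import dateutil

# If either path points into your project folder instead of site-packages,
# a local file is shadowing the installed library.
print(pandas.__file__)
print(dateutil.__file__)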
Fix the error while creating the DataFrame: writing pd.dataframe will also throw an error, because pandas has no dataframe attribute; the constructor is spelled DataFrame(). One common way to build one is to pass a dictionary as the argument, since each key/value pair maps to a column name and that column's data.
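For example, a minimal sketch (the column names and values here are invented for illustration):

import pandas as pd

# Each key becomes a column name; each list holds that column's values
df = pd.DataFrame({'name': ['Alice', 'Bob'], 'calls': [10, 3]})
print(df)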
Pandas is an open-source Python library. It provides ready-to-use, high-performance data structures and data-analysis tools. It runs on top of NumPy and is widely used for data science and data analytics.
Convert a PySpark DataFrame to a pandas DataFrame: PySpark's DataFrame provides a toPandas() method for this conversion. toPandas() collects every record of the PySpark DataFrame onto the driver program, so it should only be called on a small subset of the data.
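A minimal sketch, assuming a local Spark session and a small toy dataset:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('example').getOrCreate()
sdf = spark.createDataFrame([(1, 'a'), (2, 'b')], ['id', 'label'])

# Collects all rows to the driver; safe here only because the data is tiny
pdf = sdf.toPandas()
print(type(pdf))  # <class 'pandas.core.frame.DataFrame'>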
Alright OP, figured this one out. It's because of what you named your file: naming your script dateutil.py and then importing dateutil/pandas causes a conflict. Python finds your dateutil.py before the installed dateutil package, and because pandas imports dateutil internally during its own setup, the import chain circles back into the half-initialised pandas module before DataFrame has been defined, hence the AttributeError. I got the same error locally until I renamed the file. Try renaming your file to something like myfile.py and your problem should be solved.
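Once the file is renamed, the tutorial code should run. A sketch of the working version, assuming a recent pandas release (DataFrame.from_csv was deprecated and later removed, so read_csv is the safer call; importing dateutil.parser explicitly also avoids relying on dateutil exposing its submodule implicitly):

import pandas as pd
import dateutil.parser

# Load data from the csv file (read_csv replaces the removed DataFrame.from_csv)
data = pd.read_csv('phone_data.csv')
# Parse the date column, treating the day as the first field (e.g. 31/12/2015)
data['date'] = data['date'].apply(dateutil.parser.parse, dayfirst=True)
print(data.head())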