I am loading a CSV file into a pandas DataFrame. For each column, how do I specify what type of data it contains using the dtype argument?
I've tried np.bool_ and pd.tslib.Timestamp without luck.

Code:
import pandas as pd
import numpy as np
df = pd.read_csv(<file-name>, dtype={'A': np.int64, 'B': np.float64})
There are a lot of options for read_csv which will handle all the cases you mentioned. You might want to try dtype={'A': datetime.datetime}, but often you won't need dtypes as pandas can infer the types.
For dates, you need to specify the parse_dates option:
parse_dates : boolean, list of ints or names, list of lists, or dict
keep_date_col : boolean, default False
date_parser : function
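For example, a minimal sketch of parse_dates, using io.StringIO with made-up data in place of a real file (the `when` column name is hypothetical):

```python
import io

import pandas as pd

# Stand-in for a CSV file on disk
data = io.StringIO("id,when\n1,2021-03-05\n2,2021-03-06\n")

# parse_dates takes a list of column names (or indices) to parse as dates
df = pd.read_csv(data, parse_dates=['when'])

print(df.dtypes['when'])  # datetime64[ns]
```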
In general for converting boolean values you will need to specify:
true_values : list Values to consider as True
false_values : list Values to consider as False
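A minimal sketch of true_values/false_values, assuming a made-up file where booleans are written as yes/no. Note that pandas only converts the column to bool when every value matches one of the two lists:

```python
import io

import pandas as pd

# Hypothetical file where booleans are written as yes/no
data = io.StringIO("name,active\nalice,yes\nbob,no\n")

df = pd.read_csv(data, true_values=['yes'], false_values=['no'])

print(df['active'].dtype)  # bool
```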
These will transform any value in the list to the boolean True/False. For more general conversions you will most likely need:

converters : dict, optional Dict of functions for converting values in certain columns. Keys can either be integers or column labels
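A sketch of converters, assuming a hypothetical column that stores percentages as strings like "12%":

```python
import io

import pandas as pd

# Hypothetical file with a percentage column stored as text
data = io.StringIO("item,share\nA,12%\nB,7%\n")

# The converter runs on each raw string value in the 'share' column
df = pd.read_csv(
    data,
    converters={'share': lambda s: float(s.rstrip('%')) / 100},
)

print(df['share'].tolist())  # [0.12, 0.07]
```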
Though dense, check here for the full list: http://pandas.pydata.org/pandas-docs/stable/generated/pandas.io.parsers.read_csv.html