AttributeError: 'datetime.datetime' object has no attribute 'timestamp'


Please Help - I keep receiving the following Traceback Error:

Currently Running Python 2.0

I'm attempting to use Python's Plotly library to display a chart of bitcoin prices. I've tried importing datetime at the top of my code, but that doesn't appear to solve the problem.

Traceback (most recent call last):
  File "project_one.py", line 165, in <module>
    crypto_price_df = get_crypto_data(coinpair)
  File "project_one.py", line 155, in get_crypto_data
    json_url = base_polo_url.format(poloniex_pair, start_date.timestamp(), end_date.timestamp(), pediod)
AttributeError: 'datetime.datetime' object has no attribute 'timestamp'

My Code Starts Here

import numpy as np
import pandas as pd
from pandas import Series, DataFrame, Panel
import matplotlib.pyplot as plt
plt.style.use('fivethirtyeight')
import seaborn as sns
import sklearn as sk
import scipy as sp
import os
import pickle
import quandl
import datetime
import plotly.plotly as py
import plotly.graph_objs as go
import plotly.figure_factory as ff
from plotly import tools
from plotly.offline import iplot, init_notebook_mode
from IPython.display import display, HTML
init_notebook_mode(connected=True)


def get_quandl_data(quandl_id):
    cache_path = '{}.pkl'.format(quandl_id).replace('/', '-')
    try:
        f = open(cache_path, 'rb')
        df = pickle.load(f)
        print('Loaded {} from cache'.format(quandl_id))
    except (OSError, IOError) as e:
        print('Downloading {} from Quandl'.format(quandl_id))
        df = quandl.get(quandl_id, returns="pandas")
        df.to_pickle(cache_path)
        print('Cached {} at {}'.format(quandl_id, cache_path))
    return df


btc_usd_price_kraken = get_quandl_data('BCHARTS/KRAKENUSD')

exchanges = ['COINBASE', 'BITSTAMP', 'ITBIT']
exchange_data = {}
exchange_data['KRAKEN'] = btc_usd_price_kraken
for exchange in exchanges:
    exchange_code = 'BCHARTS/{}USD'.format(exchange)
    btc_exchange_df = get_quandl_data(exchange_code)
    exchange_data[exchange] = btc_exchange_df


def merge_dfs_on_column(dataframes, labels, col):
    series_dict = {}
    for index in range(len(dataframes)):
        series_dict[labels[index]] = dataframes[index][col]
    return pd.DataFrame(series_dict)


btc_usd_datasets = merge_dfs_on_column(list(exchange_data.values()),
                                       list(exchange_data.keys()), 'Weighted Price')


def df_scatter(df, title, seperate_y_axis=False, y_axis_label='',
               scale='linear', initial_hide=False):
    label_arr = list(df)
    series_arr = list(map(lambda col: df[col], label_arr))

    layout = go.Layout(
        title=title,
        legend=dict(orientation="h"),
        xaxis=dict(type='date'),
        yaxis=dict(
            title=y_axis_label,
            showticklabels=not seperate_y_axis,
            type=scale
        )
    )

    y_axis_config = dict(
        overlaying='y',
        showticklabels=False,
        type=scale
    )

    visibility = 'visible'
    if initial_hide:
        visibility = 'legendonly'

    trace_arr = []
    for index, series in enumerate(series_arr):
        trace = go.Scatter(
            x=series.index,
            y=series,
            name=label_arr[index],
            visible=visibility
        )

        if seperate_y_axis:
            trace['yaxis'] = 'y{}'.format(index + 1)
            layout['yaxis{}'.format(index + 1)] = y_axis_config
        trace_arr.append(trace)

    fig = go.Figure(data=trace_arr, layout=layout)
    py.plot(fig)


df_scatter(btc_usd_datasets, 'Bitcoin Price (USD) By Exchange')

btc_usd_datasets.replace(0, np.nan, inplace=True)

df_scatter(btc_usd_datasets, 'Bitcoin Price (USD) By Exchange')

btc_usd_datasets['avg_btc_price_usd'] = btc_usd_datasets.mean(axis=1)

btc_trace = go.Scatter(x=btc_usd_datasets.index,
                       y=btc_usd_datasets['avg_btc_price_usd'])
py.plot([btc_trace])


def get_json_data(json_url, cache_path):
    try:
        f = open(cache_path, 'rb')
        df = pickle.load(f)
        print('Loaded {} from cache'.format(json_url))
    except (OSError, IOError) as e:
        print('Downloading {}'.format(json_url))
        df = pd.read_json(json_url)
        df.to_pickle(cache_path)
        print('Cached {} at {}'.format(json_url, cache_path))
    return df


# Helper function that generates Poloniex API HTTP requests
base_polo_url = 'https://poloniex.com/public?command=returnChartData&currencyPair={}&start={}&end={}&period={}'
start_date = datetime.datetime.strptime('2015-01-01', '%Y-%m-%d')  # get data from the start of 2015
end_date = datetime.datetime.now()  # up until today
pediod = 86400  # pull daily data (86,400 seconds per day)


def get_crypto_data(poloniex_pair):
    json_url = base_polo_url.format(poloniex_pair, start_date.timestamp(), end_date.timestamp(), pediod)
    data_df = get_json_data(json_url, poloniex_pair)
    data_df = data_df.set_index('date')
    return data_df


altcoins = ['ETH', 'LTC', 'XRP', 'ETC', 'STR', 'DASH', 'SC', 'XMR', 'XEM']
altcoin_data = {}
for altcoin in altcoins:
    coinpair = 'BTC_{}'.format(altcoin)
    crypto_price_df = get_crypto_data(coinpair)
    altcoin_data[altcoin] = crypto_price_df
bullybear17 asked Jun 01 '18 20:06


1 Answer

The timestamp method was added in Python 3.3. So if you're using Python 2.0, or even 2.7, you don't have it.

There are backports of current datetime to older Python versions on PyPI, but none of them seems to be official, or up-to-date; you might want to try searching for yourself.

There are also a number of third-party replacement libraries that add functionality that isn't in (2.x) datetime, including the ability to convert to Unix timestamps.


You can just copy the function out of the source code from 3.3 or later:

def timestamp(self):
    "Return POSIX timestamp as float"
    if self._tzinfo is None:
        s = self._mktime()
        return s + self.microsecond / 1e6
    else:
        return (self - _EPOCH).total_seconds()

… but you will have to modify things a bit to get them to work, because:

  • _EPOCH is deleted at the end of the module.
  • The 3.x _EPOCH is a tz-aware object built with a proper UTC timezone, which you don't have in 2.x unless you're using a third-party library like pytz.
  • The _mktime method and _tzinfo attribute don't exist on 2.x datetime, so you need to simulate what they do as well.

If you don't need the same function to work equally well for naive, GMT, and tz-aware datetimes, it won't be that hard, but it's still not quite trivial; if you do need the full functionality, it's going to be more painful.


Or it may be easier to port the equivalent code given in the docs.

For aware datetime instances:

(dt - datetime(1970, 1, 1, tzinfo=timezone.utc)).total_seconds() 

Of course you still don't have that timezone.utc, but for this purpose, you don't need a full timezone object; you can use an instance of the example UTC class in the 2.x tzinfo docs.
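For example, a minimal UTC tzinfo along the lines of the one in the 2.x docs might look like this (the class and variable names here are mine, not from the stdlib; the same code also runs on 3.x):

```python
from datetime import datetime, timedelta, tzinfo

ZERO = timedelta(0)

class UTC(tzinfo):
    """Minimal stand-in for 3.x's timezone.utc, per the 2.x tzinfo docs."""
    def utcoffset(self, dt):
        return ZERO
    def tzname(self, dt):
        return "UTC"
    def dst(self, dt):
        return ZERO

utc = UTC()
epoch = datetime(1970, 1, 1, tzinfo=utc)
aware = datetime(2015, 1, 1, tzinfo=utc)
ts = (aware - epoch).total_seconds()  # 1420070400.0
```

Because both datetimes carry the same tzinfo, the subtraction is exact and no offset arithmetic is needed on your side.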

… for naive:

timestamp = dt.replace(tzinfo=timezone.utc).timestamp() 

… or:

timestamp = (dt - datetime(1970, 1, 1)) / timedelta(seconds=1) 

Since you don't have aware datetimes, that last one is all you need.
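Wrapped up as a helper (the name naive_to_timestamp is mine), using total_seconds so it works on 2.7 as well as 3.x:

```python
from datetime import datetime

def naive_to_timestamp(dt):
    """POSIX timestamp for a naive datetime, treated as UTC.

    total_seconds() exists from 2.7 on; on 3.x this matches
    dt.replace(tzinfo=timezone.utc).timestamp().
    """
    return (dt - datetime(1970, 1, 1)).total_seconds()

print(naive_to_timestamp(datetime(1970, 1, 2)))  # 86400.0
```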


If your Python is old enough, timedelta may not have a __div__ method. In that case (if you haven't found a backport), you have to do division manually as well, by calling total_seconds on each one, making sure at least one of them is a float, and dividing the numbers:

timestamp = ((dt - datetime(1970, 1, 1)).total_seconds() /
             float(timedelta(seconds=1).total_seconds()))

But in this particular case, it should be pretty obvious that the divisor is just going to be 1.0, and dividing by 1.0 is the same as doing nothing, so:

timestamp = (dt - datetime(1970, 1, 1)).total_seconds() 
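Applied to the script in the question, that means adding a small helper and calling it in get_crypto_data instead of the .timestamp() method (the helper name to_unix is mine; this sketch assumes start_date and end_date remain naive datetimes, as in the question):

```python
from datetime import datetime

def to_unix(dt):
    # Stand-in for 3.3's datetime.timestamp(), for naive datetimes treated as UTC
    return (dt - datetime(1970, 1, 1)).total_seconds()

start_date = datetime.strptime('2015-01-01', '%Y-%m-%d')

# In get_crypto_data, build the URL with the helper instead of .timestamp():
# json_url = base_polo_url.format(poloniex_pair, to_unix(start_date),
#                                 to_unix(end_date), pediod)
print(to_unix(start_date))  # 1420070400.0
```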
abarnert answered Oct 01 '22 18:10