I am trying to add a SQLAlchemyJobStore job store (and make it the default job store) and store some jobs in it. I am running mysql, which has a database named jobstore.
I have the following program, which tries to open a SQLAlchemyJobStore job store against the running mysql db:
# sqlalchemy.py
from sqlalchemy import *
from apscheduler.jobstores.sqlalchemy_store import SQLAlchemyJobStore
from apscheduler.scheduler import Scheduler
from datetime import datetime, timedelta
import time


def alarm(time):
    print('Alarm! This alarm was scheduled at %s.' % time)

_aps_config = {'standalone': 'True'}
_dbURL = 'mysql://root:<root-password>@localhost/jobstore'

if __name__ == '__main__':
    scheduler = Scheduler(_aps_config)
    scheduler.add_jobstore(SQLAlchemyJobStore(url=_dbURL), 'default')

    alarm_time = datetime.now() + timedelta(seconds=10)
    scheduler.add_date_job(alarm, alarm_time, name='alarm1', args=[datetime.now()])
    print 'alarms added: ', alarm_time

    alarm_time = datetime.now() + timedelta(seconds=15)
    scheduler.add_date_job(alarm, alarm_time, name='alarm2', args=[datetime.now()])
    print 'alarms added: ', alarm_time

    alarm_time = datetime.now() + timedelta(seconds=20)
    scheduler.add_date_job(alarm, alarm_time, name='alarm3', args=[datetime.now()])
    print 'alarms added: ', alarm_time

    try:
        scheduler.start()
    except (KeyboardInterrupt, SystemExit):
        scheduler.shutdown()
When I try to run the above code, it fails with NameError: global name 'create_engine' is not defined:
$ python sqlalchemy.py
Traceback (most recent call last):
File "sqlalchemy.py", line 19, in <module>
scheduler.add_jobstore(SQLAlchemyJobStore(url=_dbURL), 'default')
File "/usr/lib/python2.7/site-packages/APScheduler-2.1.0-py2.7.egg/apscheduler/jobstores/sqlalchemy_store.py", line 29, in __init__
self.engine = create_engine(url)
NameError: global name 'create_engine' is not defined
$
I see "/usr/lib/python2.7/site-packages/APScheduler-2.1.0-py2.7.egg/apscheduler/jobstores/sqlalchemy_store.py"
, the __init__
is trying to create_engine
and its failing.
20 class SQLAlchemyJobStore(JobStore):
21     def __init__(self, url=None, engine=None, tablename='apscheduler_jobs',
22                  metadata=None, pickle_protocol=pickle.HIGHEST_PROTOCOL):
23         self.jobs = []
24         self.pickle_protocol = pickle_protocol
25
26         if engine:
27             self.engine = engine
28         elif url:
29             self.engine = create_engine(url)
What is going wrong here?! In other words, how do I create a SQLAlchemyJobStore using APScheduler and successfully store jobs in it? Any example/code-snippet would be a great help!
There seems to be no problem in your code. I ran it and it completes successfully with the following output:
python jobstore.py
alarms added: 2013-02-07 10:31:10.234000
alarms added: 2013-02-07 10:31:15.240000
alarms added: 2013-02-07 10:31:20.240000
The only change I made was setting _dbURL = 'sqlite:///:memory:' to use the sqlite engine.
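Note that an in-memory sqlite database disappears when the process exits, so it is only useful for testing; for a persistent store you would point the URL at a file instead (for example 'sqlite:///jobs.sqlite', where the file name is up to you), or keep your mysql URL once the import problem described below is fixed.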
Please check that you have sqlalchemy installed and that your script can find it on the PYTHONPATH. Run the following code in a python console, or better yet, add it at the beginning of your script and check the output:
import sqlalchemy
print sqlalchemy.__version__
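If the version prints fine, it is also worth printing sqlalchemy.__file__ to see which file was actually imported; it should point into site-packages, not at a file of your own:
import sqlalchemy
print sqlalchemy.__file__  # should point into site-packages, not at your own script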
UPDATE: I reread your post and realized that my test code had one more difference: I had created the file under another name, jobstore.py. I renamed my file to sqlalchemy.py and got the same exception:
Traceback (most recent call last):
File "C:/stackoverflow/so/sqlalchemy.py", line 22, in <module>
scheduler.add_jobstore(SQLAlchemyJobStore(url=_dbURL), 'default')
File "C:\Progs\Python27\lib\site-packages\apscheduler\jobstores\sqlalchemy_store.py", line 29, in __init__
self.engine = create_engine(url)
NameError: global name 'create_engine' is not defined
Process finished with exit code 1
Basically, the problem is that your python script has the same name as the sqlalchemy module, so python loads your script first and cannot access the real sqlalchemy code.
Rename the script to something other than sqlalchemy.py - as long as you have the sqlalchemy module installed, that will fix it.
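After renaming, also delete any leftover sqlalchemy.pyc in the same directory, since the compiled file would keep shadowing the real package. A minimal check along these lines (any file name other than sqlalchemy.py works; check_jobstore.py here is just an example) should then create the job store without the NameError:
# check_jobstore.py - the name just has to differ from sqlalchemy.py
from apscheduler.jobstores.sqlalchemy_store import SQLAlchemyJobStore

# an in-memory sqlite URL keeps the check self-contained;
# your mysql URL works the same way once the import is resolved
store = SQLAlchemyJobStore(url='sqlite:///:memory:')
print 'job store created:', store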