I started getting this error on the production environment this morning, after getting django-storages, boto, and django-compressor working yesterday to serve static files from S3, though I don't know whether that is related...
OperationalError: could not fork new process for connection: Cannot allocate memory
could not fork new process for connection: Cannot allocate memory
could not fork new process for connection: Cannot allocate memory
Traceback (most recent call last):
File "django/core/handlers/base.py", line 89, in get_response
response = middleware_method(request)
File "reversion/middleware.py", line 17, in process_request
if hasattr(request, "user") and request.user.is_authenticated():
File "django/utils/functional.py", line 184, in inner
self._setup()
File "django/utils/functional.py", line 248, in _setup
self._wrapped = self._setupfunc()
File "django/contrib/auth/middleware.py", line 16, in <lambda>
request.user = SimpleLazyObject(lambda: get_user(request))
File "django/contrib/auth/middleware.py", line 8, in get_user
request._cached_user = auth.get_user(request)
File "django/contrib/auth/__init__.py", line 98, in get_user
user_id = request.session[SESSION_KEY]
File "django/contrib/sessions/backends/base.py", line 39, in __getitem__
return self._session[key]
File "django/contrib/sessions/backends/base.py", line 165, in _get_session
self._session_cache = self.load()
File "django/contrib/sessions/backends/db.py", line 19, in load
expire_date__gt=timezone.now()
File "django/db/models/manager.py", line 131, in get
return self.get_query_set().get(*args, **kwargs)
File "django/db/models/query.py", line 361, in get
num = len(clone)
File "django/db/models/query.py", line 85, in __len__
self._result_cache = list(self.iterator())
File "django/db/models/query.py", line 291, in iterator
for row in compiler.results_iter():
File "django/db/models/sql/compiler.py", line 763, in results_iter
for rows in self.execute_sql(MULTI):
File "django/db/models/sql/compiler.py", line 817, in execute_sql
cursor = self.connection.cursor()
File "django/db/backends/__init__.py", line 308, in cursor
cursor = util.CursorWrapper(self._cursor(), self)
File "django/db/backends/postgresql_psycopg2/base.py", line 177, in _cursor
self.connection = Database.connect(**conn_params)
File "psycopg2/__init__.py", line 178, in connect
return _connect(dsn, connection_factory=connection_factory, async=async)
I am deploying the site on Heroku. It works for a bit after I restart the application, but stops working again after a few minutes.
Any ideas as to what might be causing this?
I encountered the same problem setting up a simple Django web application with a PostgreSQL database on Heroku, and managed to solve it.
I don't fully understand the error, but the fix is fairly simple: when you pass the results of a database query into your template context, limit how many rows you fetch. An unbounded queryset loads every matching row into memory when it is evaluated, and on a small Heroku dyno that can exhaust the available memory.
So for example, if you pass the following queryset as context:
set_list = userSetTable.objects.all()
return render(request, 'fc/user.html', {'set_list': set_list,})
That can trigger the error, because set_list might be really big. You need to specify a maximum size (the slice becomes a SQL LIMIT):
set_list = userSetTable.objects.all()[0:20]
So in a real-world application, you would want to paginate the results rather than hard-code a slice... you get the point.
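To make the pagination idea concrete, here is a minimal, framework-free sketch. The `fetch_page` helper is hypothetical (not a Django API); it just slices a sequence the same way slicing a queryset translates to SQL LIMIT/OFFSET, so only one page of rows is ever materialized:

```python
def fetch_page(rows, page, per_page=20):
    """Return one page of `rows` plus a flag for whether more pages exist.

    `rows` stands in for a queryset here; with a real Django queryset,
    the slice below would become a LIMIT/OFFSET query instead of
    loading the whole table into memory.
    """
    offset = (page - 1) * per_page
    page_rows = list(rows[offset:offset + per_page])
    has_next = offset + per_page < len(rows)
    return page_rows, has_next

# Example with 50 fake rows, 20 per page:
rows = list(range(50))
first_page, more = fetch_page(rows, page=1)   # rows 0-19, more pages remain
last_page, more2 = fetch_page(rows, page=3)   # rows 40-49, nothing after
```

In Django itself you would reach for django.core.paginator.Paginator, which wraps exactly this slicing logic and only evaluates the page it needs.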