I'm trying to install cl-jupyter (a Common Lisp kernel for Jupyter), but I can't make it work: when I open a new Lisp notebook (or change the kernel of an existing notebook), it crashes after printing the following messages:
[I 18:26:50.855 NotebookApp] Kernel started: ccba815a-9065-4fad-9d95-06f6291136d2
To load "cl-jupyter":
Load 1 ASDF system:
cl-jupyter
; Loading "cl-jupyter"
...............
cl-jupyter: an enhanced interactive Common Lisp REPL
(Version 0.7 - Jupyter protocol v.5.0)
--> (C) 2014-2015 Frederic Peschanski (cf. LICENSE)
kernel configuration = ((hb_port . 58864) (shell_port . 37462)
(transport . tcp) (iopub_port . 43232)
(signature_scheme . hmac-sha256) (control_port . 52184)
(stdin_port . 45879)
(key . 2ae7d65f-65f9-40d8-bfd4-21760eaec0ca)
(ip . 127.0.0.1))
[Hearbeat] starting...
[Heartbeat] thread started
[Heartbeat] thread started
[Kernel] Entering mainloop ...
[Shell] loop started
Argh! corrupted error depth, halting
fatal error encountered in SBCL pid 24605(tid 140737353922304):
%PRIMITIVE HALT called; the party is over.
Jupyter works fine with Python (both 2.x and 3.x). sbcl also seems to work fine (I'm just starting with Common Lisp, so I might be missing something, but I can run code and use ASDF and Quicklisp). I installed cl-jupyter by running python3 ./install-cl-jupyter.py. My .sbclrc file now contains just (load "~/quicklisp/setup.lisp"). The file ~/.ipython/kernels/lisp/kernel.json contains:
{"argv": ["sbcl", "--non-interactive", "--load", "/home/myusername/lisp/systems/cl-jupyter/cl-jupyter.lisp", "/homemyusername/lisp/systems/cl-jupyter/src", "/home/myusername/lisp/systems/cl-jupyter", "{connection_file}"], "display_name": "SBCL Lisp", "language": "lisp"}
All the paths are correct. When I run the same code manually:
sbcl --non-interactive --load "/home/myusername/lisp/systems/cl-jupyter/cl-jupyter.lisp" "/homemyusername/lisp/systems/cl-jupyter/src" "/home/myusername/lisp/systems/cl-jupyter" kernel-07e04903-c562-4c67-bcc1-b68f4047d8d2.json
(where "kernel-07e04903-c562-4c67-bcc1-b68f4047d8d2.json" is an auto-generated connection file for a Lisp kernel that I saved), it looks OK: it runs and waits, with no errors.
I tried some quick debug-printing, and my best guess is that it crashes in src/shell.lisp, at the line
(vbinds (identities sig msg buffers) (message-recv (shell-socket shell))
But again, I could be wrong. Since it looks like a communication problem, I made sure that ZeroMQ is up to date.
I'm not sure what to try next, besides learning some more Lisp and then coming back to debug the code properly, but I was hoping to learn it using Jupyter :)
I'd appreciate any suggestions. Thanks.
This issue is caused by a change in the size of the 0MQ message structure, zmq_msg_t: it grew from 32 bytes in version 4.0.x to 64 bytes in version 4.1.x.
The 0MQ interface library used by cl-jupyter is pzmq, and that library is only stated to support 0MQ up to version 4.0. At the time of this writing, pzmq uses a 0MQ message size of 32 bytes, which leads to internal memory corruption if 0MQ version 4.1.x is installed.
There are two solutions here:
- Downgrade 0MQ to version 4.0.x.
- Patch the pzmq library by updating the definition of msg% in c-api.lisp from 32 to 64 bytes.
Update - 9 May 2016
The pzmq library was updated to support 0MQ version 4.1.x (preserving compatibility with older versions). The new version of pzmq is planned for release as part of the May 2016 Quicklisp release.
Update - 1 June 2016
The May 2016 Quicklisp release includes updates to the pzmq library which resolve this issue. Downgrading 0MQ or modifying the pzmq library is no longer necessary.