 

Best practices for turning jupyter notebooks into python scripts

The Jupyter (IPython) notebook is deservedly known as a good tool for prototyping code and doing all kinds of machine learning stuff interactively. But when I use it, I inevitably run into the following:

  • the notebook quickly becomes too complex and messy to be maintained and improved further as a notebook, and I have to turn it into Python scripts;
  • when it comes to production code (e.g. code that needs to be re-run every day), a notebook again is not the best format.

Suppose I've developed a whole machine learning pipeline in Jupyter that includes fetching raw data from various sources, cleaning the data, feature engineering, and finally training models. What's the best way to turn it into scripts with efficient and readable code? So far I've tackled it in several ways:

  1. Simply convert .ipynb to .py and, with only slight changes, hard-code the whole pipeline from the notebook into one Python script.

    • '+': quick
    • '-': dirty, inflexible, inconvenient to maintain
  2. Make a single script with many functions (roughly one function per one or two cells), trying to capture the stages of the pipeline as separate functions, and name them accordingly. Then specify all parameters and global constants via argparse.

    • '+': more flexible usage; more readable code (if you properly transformed the pipeline logic to functions)
    • '-': oftentimes the pipeline does NOT split into logically complete pieces that can become functions without quirks in the code. These functions typically only need to be called once in the script, rather than many times inside loops, maps, etc. Furthermore, each function typically takes the output of all the functions called before it, so one has to pass many arguments to each function.
  3. The same as point (2), but now wrap all the functions inside a class. Now all the global constants, as well as the outputs of each method, can be stored as class attributes (see the sketch after this list).

    • '+': you don't need to pass many arguments to each method -- all the previous outputs are already stored as attributes
    • '-': the overall logic of the task is still not captured -- it is a data and machine learning pipeline, not just a class. The only purpose of the class is to be created, have all its methods called sequentially one by one, and then be discarded. On top of this, classes take quite a lot of code to implement.
  4. Convert a notebook into a Python module with several scripts. I haven't tried this out, but I suspect it is the longest way to deal with the problem.
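
For concreteness, here is a rough sketch of what I mean by option (3); the stage names and the placeholder logic are only illustrative, not my actual pipeline:

    # Rough sketch of option (3): a pipeline class whose methods store their
    # outputs as attributes (stage names and logic are placeholders)
    import pandas as pd

    class Pipeline:
        def __init__(self, data_path):
            self.data_path = data_path

        def fetch_data(self):
            self.raw = pd.read_csv(self.data_path)

        def clean(self):
            self.cleaned = self.raw.drop_duplicates().dropna()

        def engineer_features(self):
            self.features = self.cleaned.select_dtypes('number')

        def train(self):
            self.model = self.features.mean()   # stand-in for real model fitting

    pipe = Pipeline('data/raw.csv')   # hypothetical input file
    pipe.fetch_data()
    pipe.clean()
    pipe.engineer_features()
    pipe.train()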

I suppose this overall situation is very common among data scientists, but surprisingly I cannot find any useful advice on it.

Folks, please, share your ideas and experience. Have you ever encountered this issue? How have you tackled it?

asked Aug 24 '15 by kurtosis



2 Answers

Life saver: as you're writing your notebooks, incrementally refactor your code into functions, writing some minimal assert tests and docstrings.

After that, refactoring from notebook to script is natural. Not only that, but it makes your life easier when writing long notebooks, even if you have no plans to turn them into anything else.

Basic example of a cell's content with "minimal" tests and docstrings:

    import os
    import zipfile
    from contextlib import closing

    def zip_count(f):
        """Given zip filename, returns number of files inside.

        str -> int"""
        with closing(zipfile.ZipFile(f)) as archive:
            num_files = len(archive.infolist())
        return num_files

    zip_filename = 'data/myfile.zip'

    # Make sure `myfile` always has three files
    assert zip_count(zip_filename) == 3
    # And total zip size is under 2 MB
    assert os.path.getsize(zip_filename) / 1024**2 < 2

    print(zip_count(zip_filename))

Once you've exported it to bare .py files, your code will probably not be structured into classes yet. But it is worth the effort to have refactored your notebook to the point where it has a set of documented functions, each with a set of simple assert statements that can easily be moved into tests.py for testing with pytest, unittest, or what have you. If it makes sense, bundling these functions into methods for your classes is dead-easy after that.
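
For example, a minimal tests.py might look like the sketch below, assuming the zip_count function above has been moved into a module; the module name pipeline and the test data path are placeholders for wherever your refactored code actually lives:

    # tests.py -- minimal sketch; `pipeline` and the data path are placeholders
    import os

    from pipeline import zip_count

    def test_zip_count():
        # same check as the notebook assert, now as a pytest-style test
        assert zip_count('data/myfile.zip') == 3

    def test_zip_size():
        assert os.path.getsize('data/myfile.zip') / 1024**2 < 2

You can then run the whole suite with pytest tests.py.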

If all goes well, all you need to do after that is to write your if __name__ == '__main__': block and its "hooks": if you're writing a script to be called from the terminal you'll want to handle command-line arguments; if you're writing a module you'll want to think about its API and the __init__.py file, etc.
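
Continuing the zip_count example, a command-line entry point might look roughly like this (the argument names are just examples):

    # Sketch of a command-line entry point for the script containing zip_count
    import argparse

    if __name__ == '__main__':
        parser = argparse.ArgumentParser(description='Count files inside a zip archive.')
        parser.add_argument('zip_filename', help='path to the zip file')
        args = parser.parse_args()
        print(zip_count(args.zip_filename))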

It all depends on what the intended use case is, of course: there's quite a difference between converting a notebook to a small script vs. turning it into a full-fledged module or package.

Here are a few ideas for a notebook-to-script workflow:

  1. Export the Jupyter Notebook to Python file (.py) through the GUI.
  2. Remove the "helper" lines that don't do the actual work: print statements, plots, etc.
  3. If need be, bundle your logic into classes. The only extra refactoring work required should be to write your class docstrings and attributes.
  4. Write your script's entry points with if __name__ == '__main__'.
  5. Separate your assert statements for each of your functions/methods, and flesh out a minimal test suite in tests.py.

answered Oct 13 '22 by François Leblanc

We have a similar issue. However, we use several notebooks for prototyping, which in the end should also become several Python scripts.

Our approach is to set aside the code that seems to repeat across those notebooks. We put it into a Python module, which is imported by each notebook and also used in production. We continuously improve this module and add tests for what we find during prototyping.
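
For example (the module and function names below are hypothetical), the shared code might live in a file like common.py:

    # common.py -- shared, tested code imported by both notebooks and production scripts
    # (module and function names are hypothetical)
    import pandas as pd

    def clean_data(df: pd.DataFrame) -> pd.DataFrame:
        """Drop duplicate rows and rows with missing values."""
        return df.drop_duplicates().dropna()

A notebook or a production script then just does from common import clean_data and calls it, so the same tested logic is used in both places.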

Notebooks then become more like configuration scripts (which we simply copy into the resulting Python files), plus various prototyping checks and validations that we do not need in production.

Most of all, we are not afraid of refactoring :)

answered Oct 13 '22 by Radek