
How To: Resolve browser crash from large notebook output

Overview

If you execute a notebook cell that outputs a large amount of data, such as dataframe rows or print statements, the browser may report that the page has run out of memory or is unresponsive. If the output has been saved to the notebook, simply reopening the notebook can crash the browser again. Furthermore, if the Data Lab project’s layout has been saved with the troublesome notebook open, the browser may crash whenever the project is opened, because it attempts to load the notebook on startup.

Solution

If opening the Data Lab project in a new browser tab causes the crash again, you will need to open the project with the JupyterLab workspace reset to its default, which shows only the Launcher tab. To do so, append ?reset to the Data Lab project URL, for example:

CODE
https://<seeq_site>/data-lab/<project_id>/lab?reset

Once you can access the Data Lab project, the saved cell output must be cleared and the notebook saved. To do so, open a terminal by clicking Terminal in the Launcher tab. In the terminal, execute the following command, replacing my_notebook.ipynb with the path to your notebook:

CODE
jupyter nbconvert --ClearOutputPreprocessor.enabled=True --inplace my_notebook.ipynb
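
If you prefer to clear the outputs without nbconvert, note that an .ipynb file is plain JSON, so the saved outputs can be stripped with a short script. The sketch below makes the same change the nbconvert command does; the function name clear_outputs is illustrative, not part of any Jupyter or Seeq API:

```python
import json

def clear_outputs(nb: dict) -> dict:
    """Remove saved outputs from every code cell of a notebook JSON dict."""
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []           # drop the saved output payload
            cell["execution_count"] = None
    return nb

# Usage: load, clean, and rewrite the notebook file in place, e.g.:
# with open("my_notebook.ipynb") as f:
#     nb = clear_outputs(json.load(f))
# with open("my_notebook.ipynb", "w") as f:
#     json.dump(nb, f, indent=1)
```
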

You can now open the notebook. Before executing the cells again, modify your code so that it does not output large amounts of data.
