utils.ipython
IPython Utilities
Utilities to help work with ipython/jupyter environment.
To import from fastai.utils.ipython do:

from fastai.utils.ipython import *
Workarounds to the leaky ipython traceback on exception
ipython has a feature where it stores the tb with all the locals() tied in, which prevents gc.collect() from freeing those variables and leads to a memory leak.
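Here is a minimal standalone sketch of that mechanism (no fastai or GPU involved; the allocate_and_fail function and the oversized list are illustrative stand-ins for a training call and a CUDA tensor held in a local):

import sys, traceback

def allocate_and_fail():
    big = [0] * 10_000_000  # stands in for a large CUDA tensor held in a local
    raise RuntimeError("boom")

try:
    allocate_and_fail()
except RuntimeError:
    tb = sys.exc_info()[2]
    # while tb is kept alive, `big` is still reachable via tb.tb_next.tb_frame.f_locals,
    # which is exactly how ipython's stored traceback keeps memory pinned
    traceback.clear_frames(tb)  # drop the frame locals so gc can free them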
Therefore we cleanse the tb before handing it over to ipython. There are two ways of doing it: either using the gpu_mem_restore decorator or the gpu_mem_restore_ctx context manager, which are described next:
gpu_mem_restore
[source][test]
gpu_mem_restore(func)
Reclaim GPU RAM if CUDA out of memory happened, or execution was interrupted
gpu_mem_restore is a decorator to be used with any function that interacts with CUDA (top-level is fine); see the example after this list.
- Under a non-ipython environment it doesn't do anything.
- Under ipython it currently strips the tb by default only for the "CUDA out of memory" exception.
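For example, a top-level training function can be wrapped like this (the fit_model function is an illustrative sketch reusing the learn object from the example further down; it is not part of fastai):

from fastai.utils.ipython import gpu_mem_restore

@gpu_mem_restore
def fit_model(learn):
    # under ipython, a "CUDA out of memory" raised in here gets its tb stripped,
    # so the GPU RAM referenced by the dead frames can be reclaimed
    learn.fit_one_cycle(1, 1e-2)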
The env var FASTAI_TB_CLEAR_FRAMES changes this behavior when run under ipython, depending on its value:
- "0": never strip the tb (makes it possible to always use the %debug magic, but with leaks)
- "1": always strip the tb (never need to worry about leaks, but %debug won't work)
e.g. os.environ['FASTAI_TB_CLEAR_FRAMES'] = "0" will set it to "0".
class gpu_mem_restore_ctx
[source][test]
gpu_mem_restore_ctx()
context manager to reclaim RAM if an exception happened under ipython
If the function decorator is not a good option, you can use the context manager instead. For example:
with gpu_mem_restore_ctx():
    learn.fit_one_cycle(1, 1e-2)
This particular one will clear tb on any exception.
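The cleansed exception itself still propagates, so you can catch it yourself and continue in the same kernel. A minimal sketch assuming a standard PyTorch setup (the recovery step and the torch.cuda.empty_cache() call are just one possible follow-up, not something fastai prescribes):

import torch

try:
    with gpu_mem_restore_ctx():
        learn.fit_one_cycle(1, 1e-2)
except RuntimeError as e:
    if "out of memory" in str(e):
        # the stripped tb no longer pins the dead frames' tensors,
        # so the cached blocks can actually be released here
        torch.cuda.empty_cache()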