Delayed
Sometimes problems don’t fit into one of the collections like dask.array or dask.dataframe. In these cases, users can parallelize custom algorithms using the simpler dask.delayed interface. This allows one to create graphs directly with a light annotation of normal Python code:
```python
>>> x = dask.delayed(inc)(1)
>>> y = dask.delayed(inc)(2)
>>> z = dask.delayed(add)(x, y)
>>> z.compute()
5
>>> z.visualize()
```
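The snippet above assumes `inc` and `add` are already defined. A minimal self-contained version (with two simple helper functions, as in the rest of this section) looks like this:

```python
import dask

def inc(x):
    return x + 1

def add(x, y):
    return x + y

# Wrapping the calls builds a task graph instead of executing immediately
x = dask.delayed(inc)(1)
y = dask.delayed(inc)(2)
z = dask.delayed(add)(x, y)

# Executes the graph; inc(1) and inc(2) are independent tasks
result = z.compute()  # 5
```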
Example
Visit https://examples.dask.org/delayed.html to see and run examples using Dask Delayed.
Sometimes we face problems that are parallelizable, but don’t fit into high-level abstractions like Dask Array or Dask DataFrame. Consider the following example:
```python
def inc(x):
    return x + 1

def double(x):
    return x + 2

def add(x, y):
    return x + y

data = [1, 2, 3, 4, 5]

output = []
for x in data:
    a = inc(x)
    b = double(x)
    c = add(a, b)
    output.append(c)

total = sum(output)
```
There is clearly parallelism in this problem (many of the inc, double, and add functions can evaluate independently), but it’s not clear how to convert this to a big array or big DataFrame computation.
As written, this code runs sequentially in a single thread. However, we see that a lot of it could be executed in parallel.
The Dask delayed function decorates your functions so that they operate lazily. Rather than executing your function immediately, it will defer execution, placing the function and its arguments into a task graph.
delayed([obj, name, pure, nout, traverse]) | Wraps a function or object to produce a Delayed.
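Of the parameters in the signature above, nout is useful when a function returns a tuple: it lets the wrapped call be unpacked into separate Delayed objects. A brief sketch, using a hypothetical min_and_max helper:

```python
import dask

# nout=2 tells delayed that the function returns a 2-tuple,
# so the call can be unpacked into two separate Delayed objects
@dask.delayed(nout=2)
def min_and_max(seq):
    return min(seq), max(seq)

lo, hi = min_and_max([3, 1, 4, 1, 5])

smallest = lo.compute()  # 1
largest = hi.compute()   # 5
```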
We slightly modify our code by wrapping functions in delayed. This delays the execution of the function and generates a Dask graph instead:
```python
import dask

output = []
for x in data:
    a = dask.delayed(inc)(x)
    b = dask.delayed(double)(x)
    c = dask.delayed(add)(a, b)
    output.append(c)

total = dask.delayed(sum)(output)
```
We used the dask.delayed function to wrap the function calls that we want to turn into tasks. None of the inc, double, add, or sum calls have happened yet. Instead, the object total is a Delayed result that contains a task graph of the entire computation. Looking at the graph, we see clear opportunities for parallel execution. The Dask schedulers will exploit this parallelism, generally improving performance (although not in this example, because these functions are already very small and fast).
```python
total.visualize()  # render the task graph as an image
```
We can now compute this lazy result to execute the graph in parallel:
```python
>>> total.compute()
45
```
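The call to compute() also accepts a scheduler argument. A minimal end-to-end sketch of the example above, choosing the scheduler explicitly (the threaded scheduler is the default for delayed; the synchronous scheduler runs everything in a single thread, which is handy for debugging):

```python
import dask

def inc(x):
    return x + 1

def double(x):
    return x + 2

def add(x, y):
    return x + y

data = [1, 2, 3, 4, 5]

output = []
for x in data:
    a = dask.delayed(inc)(x)
    b = dask.delayed(double)(x)
    c = dask.delayed(add)(a, b)
    output.append(c)

total = dask.delayed(sum)(output)

# Execute the graph with a thread pool (the default for delayed)
result = total.compute(scheduler="threads")       # 45

# Execute the same graph single-threaded, useful for debugging
same = total.compute(scheduler="synchronous")     # 45
```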
Decorator
It is also common to see the delayed function used as a decorator. Here is a reproduction of our original problem as parallel code:
```python
import dask

@dask.delayed
def inc(x):
    return x + 1

@dask.delayed
def double(x):
    return x + 2

@dask.delayed
def add(x, y):
    return x + y

data = [1, 2, 3, 4, 5]

output = []
for x in data:
    a = inc(x)
    b = double(x)
    c = add(a, b)
    output.append(c)

total = dask.delayed(sum)(output)
```
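Instead of wrapping sum in delayed, you can also evaluate several Delayed objects at once with dask.compute, which merges them into one shared graph so common tasks run only once. A sketch using the same decorated functions:

```python
import dask

@dask.delayed
def inc(x):
    return x + 1

@dask.delayed
def double(x):
    return x + 2

@dask.delayed
def add(x, y):
    return x + y

data = [1, 2, 3, 4, 5]
output = [add(inc(x), double(x)) for x in data]

# dask.compute evaluates many Delayed objects in a single shared graph
results = dask.compute(*output)  # (5, 7, 9, 11, 13)
total = sum(results)             # 45
```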
Real time
Sometimes you want to create and destroy work during execution, launch tasks from other tasks, etc. For this, see the Futures interface.
Best Practices
For a list of common problems and recommendations see Delayed Best Practices.