Configuration Settings and Compiling Modes
Configuration
The `config` module contains several attributes that modify Theano's behavior. Many of these attributes are examined during the import of the `theano` module and several are assumed to be read-only.
As a rule, the attributes in the `config` module should not be modified inside user code.
Theano's code comes with default values for these attributes, but you can override them from your `.theanorc` file, and override those values in turn with the `THEANO_FLAGS` environment variable.
The order of precedence is:
1. an assignment to `theano.config.<property>`
2. an assignment in `THEANO_FLAGS`
3. an assignment in the `.theanorc` file (or the file indicated in `THEANORC`)

You can display the current/effective configuration at any time by printing `theano.config`. For example, to see a list of all active configuration variables, type this from the command line:

```
python -c 'import theano; print(theano.config)' | less
```
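As a quick illustration of these rules (a minimal sketch; the flag and values here are arbitrary examples, not recommendations):

```python
# Run as: THEANO_FLAGS='floatX=float32' python file.py
# The environment variable overrides any value set in .theanorc.
import theano

print(theano.config.floatX)   # effective value after .theanorc / THEANO_FLAGS

# A direct assignment takes precedence over both, although, as noted
# above, config attributes should normally not be modified in user code.
theano.config.floatX = 'float32'
```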
For more detail, see Configuration in the library.
Exercise
Consider the logistic regression:
```python
import numpy
import theano
import theano.tensor as T

rng = numpy.random

N = 400
feats = 784
D = (rng.randn(N, feats).astype(theano.config.floatX),
     rng.randint(size=N, low=0, high=2).astype(theano.config.floatX))
training_steps = 10000

# Declare Theano symbolic variables
x = T.matrix("x")
y = T.vector("y")
w = theano.shared(rng.randn(feats).astype(theano.config.floatX), name="w")
b = theano.shared(numpy.asarray(0., dtype=theano.config.floatX), name="b")
x.tag.test_value = D[0]
y.tag.test_value = D[1]

# Construct Theano expression graph
p_1 = 1 / (1 + T.exp(-T.dot(x, w) - b))            # Probability of having a one
prediction = p_1 > 0.5                             # The prediction that is done: 0 or 1
xent = -y * T.log(p_1) - (1 - y) * T.log(1 - p_1)  # Cross-entropy
cost = xent.mean() + 0.01 * (w ** 2).sum()         # The cost to optimize
gw, gb = T.grad(cost, [w, b])

# Compile expressions to functions
train = theano.function(
    inputs=[x, y],
    outputs=[prediction, xent],
    updates=[(w, w - 0.01 * gw), (b, b - 0.01 * gb)],
    name="train")
predict = theano.function(inputs=[x], outputs=prediction,
                          name="predict")

if any([node.op.__class__.__name__ in ['Gemv', 'CGemv', 'Gemm', 'CGemm'] for node in
        train.maker.fgraph.toposort()]):
    print('Used the cpu')
elif any([node.op.__class__.__name__ in ['GpuGemm', 'GpuGemv'] for node in
          train.maker.fgraph.toposort()]):
    print('Used the gpu')
else:
    print('ERROR, not able to tell if theano used the cpu or the gpu')
    print(train.maker.fgraph.toposort())

for i in range(training_steps):
    pred, err = train(D[0], D[1])

print("target values for D")
print(D[1])
print("prediction on D")
print(predict(D[0]))
```
Modify and execute this example to run on CPU (the default) with `floatX=float32`, and time the execution using the command line `time python file.py`. Save your code as it will be useful later on.
Note
- Apply the Theano flag `floatX=float32` (through `theano.config.floatX`) in your code.
- Cast inputs before storing them into a shared variable.
- Circumvent the automatic cast of int32 with float32 to float64:
  - Insert a manual cast in your code or use [u]int{8,16}.
  - Insert a manual cast around the mean operator (this involves division by the length, which is an int64); see the sketch after this note.
  - Note that a new casting mechanism is being developed.
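For instance, a minimal sketch of the manual cast around the mean operator (`x` here is an illustrative float32 matrix, not part of the exercise code):

```python
import theano
import theano.tensor as T

x = T.matrix('x', dtype='float32')
# mean() divides by the number of elements (an int64), which would
# otherwise upcast the result to float64; cast it back explicitly.
m = T.cast(x.mean(), 'float32')
```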
Mode
Every time `theano.function` is called, the symbolic relationships between the input and output Theano _variables_ are optimized and compiled. The way this compilation occurs is controlled by the value of the `mode` parameter.
Theano defines the following modes by name:
- `'FAST_COMPILE'`: Apply just a few graph optimizations and only use Python implementations, so the GPU is disabled.
- `'FAST_RUN'`: Apply all optimizations and use C implementations where possible.
- `'DebugMode'`: Verify the correctness of all optimizations, and compare C and Python implementations. This mode can take much longer than the other modes, but can identify several kinds of problems.
- `'NanGuardMode'`: Same optimizations as FAST_RUN, but check whether a node generates NaNs.
The default mode is typically `FAST_RUN`, but it can be controlled via the configuration variable `config.mode`, which can in turn be overridden by passing the `mode` keyword argument to `theano.function`.
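For example (a minimal sketch; any of the mode names above can be substituted):

```python
import theano
import theano.tensor as T

x = T.dscalar('x')
# The mode keyword argument overrides config.mode for this function only.
f = theano.function([x], x ** 2, mode='FAST_COMPILE')
print(f(3.0))   # 9.0
```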
| short name | Full constructor | What does it do? |
|---|---|---|
| FAST_COMPILE | `compile.mode.Mode(linker='py', optimizer='fast_compile')` | Python implementations only, quick and cheap graph transformations |
| FAST_RUN | `compile.mode.Mode(linker='cvm', optimizer='fast_run')` | C implementations where available, all available graph transformations |
| DebugMode | `compile.debugmode.DebugMode()` | Both implementations where available, all available graph transformations |
Note
For debugging purposes, there also exists a `MonitorMode` (which has no short name). It can be used to step through the execution of a function: see the debugging FAQ for details, and the sketch below.
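A minimal sketch of how `MonitorMode` can be used (adapted from the debugging FAQ; `inspect_inputs` is an illustrative name):

```python
import theano
import theano.tensor as T
from theano.compile.monitormode import MonitorMode

def inspect_inputs(i, node, fn):
    # Called before each node executes; prints the node and its input values.
    print(i, node, [inp[0] for inp in fn.inputs])

x = T.dscalar('x')
f = theano.function([x], 5 * x, mode=MonitorMode(pre_func=inspect_inputs))
f(3)
```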
Linkers
A mode is composed of two things: an optimizer and a linker. Some modes, like `NanGuardMode` and `DebugMode`, add logic around the optimizer and linker. `NanGuardMode` and `DebugMode` use their own linker.
You can select which linker to use with the Theano flag `config.linker`. Here is a table comparing the different linkers; a short sketch of selecting one follows the table.
| linker | gc [1] | Raise error by op | Overhead | Definition |
|---|---|---|---|---|
| cvm | yes | yes | “++” | As c\|py, but the runtime algorithm that executes the code is in C |
| cvm_nogc | no | yes | “+” | As cvm, but without gc |
| c\|py [2] | yes | yes | “+++” | Try C code; if none exists for an op, use Python |
| c\|py_nogc | no | yes | “++” | As c\|py, but without gc |
| c | no | yes | “+” | Use only C code (if none is available for an op, raise an error) |
| py | yes | yes | “+++” | Use only Python code |
| NanGuardMode | no | no | “++++” | Check whether nodes generate NaNs |
| DebugMode | no | yes | VERY HIGH | Make many checks on what Theano computes |

[1] Garbage collection of intermediate results during computation. Without it, the memory used by the ops is kept between Theano function calls, in order not to reallocate memory and to lower the overhead (make it faster...).

[2] Default linker.
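For instance, a sketch of selecting a linker by constructing a `Mode` directly (equivalently, set the Theano flag `linker=py`):

```python
import theano
import theano.tensor as T
from theano.compile.mode import Mode

x = T.dvector('x')
# Pure-Python execution with the full set of optimizations; swap 'py'
# for 'cvm', 'c|py', etc. as listed in the table above.
f = theano.function([x], x + 1, mode=Mode(linker='py', optimizer='fast_run'))
print(f([1, 2, 3]))
```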
For more detail, see Mode in the library.
Using DebugMode
While normally you should use the `FAST_RUN` or `FAST_COMPILE` modes, it is useful at first (especially when you are defining new kinds of expressions or new optimizations) to run your code using the DebugMode (available via `mode='DebugMode'`). The DebugMode is designed to run several self-checks and assertions that can help diagnose possible programming errors leading to incorrect output. Note that `DebugMode` is much slower than `FAST_RUN` or `FAST_COMPILE`, so use it only during development (not when you launch 1000 processes on a cluster!).
DebugMode is used as follows:
```python
import theano
import theano.tensor as T

x = T.dvector('x')
f = theano.function([x], 10 * x, mode='DebugMode')

f([5])
f([0])
f([7])
```
If any problem is detected, DebugMode will raise an exception according to what went wrong, either at call time (`f([5])`) or compile time (`f = theano.function([x], 10 * x, mode='DebugMode')`). These exceptions should not be ignored; talk to your local Theano guru or email the users list if you cannot make the exception go away.
Some kinds of errors can only be detected for certain input value combinations. In the example above, there is no way to guarantee that a future call to, say, `f([-1])`, won't cause a problem. DebugMode is not a silver bullet.
If you instantiate DebugMode using the constructor (see `DebugMode`) rather than the keyword `'DebugMode'`, you can configure its behaviour via constructor arguments. The keyword version of DebugMode (which you get by using `mode='DebugMode'`) is quite strict; a sketch of the constructor version follows.
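For example (a minimal sketch; `check_isfinite` is one of the constructor arguments, but consult the library documentation for the full list):

```python
import theano
import theano.tensor as T
from theano.compile.debugmode import DebugMode

x = T.dvector('x')
# Loosen one of the self-checks relative to the strict keyword version.
f = theano.function([x], 10 * x, mode=DebugMode(check_isfinite=False))
```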
For more detail, see DebugMode in the library.
ProfileMode
Note
ProfileMode is deprecated. Use `config.profile` instead.
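A minimal sketch of the replacement (profiling can also be enabled globally with `THEANO_FLAGS=profile=True`):

```python
import theano
import theano.tensor as T

x = T.dvector('x')
# Per-function profiling; a summary is printed when the process exits,
# or can be requested explicitly as below.
f = theano.function([x], x * 2, profile=True)
f([1, 2, 3])
f.profile.summary()
```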