
TimeDistributed

    keras.layers.TimeDistributed(layer)

This wrapper applies a layer to every temporal slice of an input.

The input should be at least 3D, and the dimension of index one will be considered to be the temporal dimension.

Consider a batch of 32 samples, where each sample is a sequence of 10 vectors of 16 dimensions. The batch input shape of the layer is then (32, 10, 16), and the input_shape, not including the samples dimension, is (10, 16).

You can then use TimeDistributed to apply a Dense layer to each of the 10 timesteps, independently:

    from keras.models import Sequential
    from keras.layers import Dense, TimeDistributed

    # as the first layer in a model
    model = Sequential()
    model.add(TimeDistributed(Dense(8), input_shape=(10, 16)))
    # now model.output_shape == (None, 10, 8)

The output will then have shape (32, 10, 8).

In subsequent layers, there is no need for the input_shape:

    model.add(TimeDistributed(Dense(32)))
    # now model.output_shape == (None, 10, 32)

The output will then have shape (32, 10, 32).

TimeDistributed can be used with arbitrary layers, not just Dense, for instance with a Conv2D layer:

    from keras.layers import Conv2D

    model = Sequential()
    model.add(TimeDistributed(Conv2D(64, (3, 3)),
                              input_shape=(10, 299, 299, 3)))
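    # with the default 'valid' padding, each 299x299 frame shrinks to 297x297:
    # model.output_shape == (None, 10, 297, 297, 64)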

Arguments

  • layer: a layer instance.


Bidirectional

    keras.layers.Bidirectional(layer, merge_mode='concat', weights=None)

Bidirectional wrapper for RNNs.

Arguments

  • layer: a Recurrent instance.
  • merge_mode: Mode by which the outputs of the forward and backward RNNs will be combined. One of {'sum', 'mul', 'concat', 'ave', None}. If None, the outputs will not be combined; they will be returned as a list (see the sketch after the Examples).
  • weights: Initial weights to load in the Bidirectional model.

Raises

  • ValueError: In case of invalid merge_mode argument.

Examples

    from keras.models import Sequential
    from keras.layers import Bidirectional, LSTM, Dense, Activation

    model = Sequential()
    model.add(Bidirectional(LSTM(10, return_sequences=True),
                            input_shape=(5, 10)))
    model.add(Bidirectional(LSTM(10)))
    model.add(Dense(5))
    model.add(Activation('softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
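
As a minimal sketch (not part of the original example), the snippet below shows how merge_mode affects the combined output shape; it reuses the imports above, and the layer sizes are illustrative:

    model = Sequential()
    model.add(Bidirectional(LSTM(10, return_sequences=True),
                            merge_mode='concat', input_shape=(5, 10)))
    # 'concat' joins the forward and backward outputs on the last axis:
    # model.output_shape == (None, 5, 20)
    # 'sum', 'mul' and 'ave' keep the last axis at 10 instead;
    # with merge_mode=None the layer returns a list of two (None, 5, 10) tensors.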