Define Models for SQLFlow

SQLFlow enables SQL programs to call deep learning models defined in Python. This document explains how to define such models for SQLFlow.

Keras vs. Estimator

Many deep learning practitioners define models using the Keras API or as a class derived from Estimator. We prefer Keras over Estimator for the following reasons:

  1. TensorFlow Dev Summit 2019 announced that TensorFlow 2.x will closely integrate with Keras.

  2. We found more documentation about Keras than about Estimator.

  3. We found more models defined using Keras than Estimator.

Keras APIs

Keras provides three approaches to define models.

1. Subclassing tf.keras.Model

```python
class DNNClassifier(tf.keras.Model):
    def __init__(self, feature_columns, hidden_units, n_classes):
        super(DNNClassifier, self).__init__()
        self.feature_layer = tf.keras.layers.DenseFeatures(feature_columns)
        self.hidden_layers = []
        for hidden_unit in hidden_units:
            self.hidden_layers.append(tf.keras.layers.Dense(hidden_unit))
        self.prediction_layer = tf.keras.layers.Dense(n_classes, activation='softmax')

    def call(self, inputs):
        x = self.feature_layer(inputs)
        for hidden_layer in self.hidden_layers:
            x = hidden_layer(x)
        return self.prediction_layer(x)

model = DNNClassifier(feature_columns, hidden_units, n_classes)
```

Please be aware that tf.keras.Model provides the methods save_weights and load_weights, which save and load model parameters but not the topology, as explained in this guidance and this example list.
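
As a consequence, restoring a subclassed model requires re-instantiating the Python class before loading the weights back. The following is a minimal sketch of that round trip; it assumes `feature_columns`, `hidden_units`, and `n_classes` are defined as above and that the model has already been trained.

```python
import tensorflow as tf

# Assuming `feature_columns`, `hidden_units`, and `n_classes` are defined.
model = DNNClassifier(feature_columns, hidden_units, n_classes)
# ... compile and fit the model ...
model.save_weights('./dnn_weights')    # saves parameters only, not the topology

# Restoring needs the Python class definition to rebuild the topology first.
restored = DNNClassifier(feature_columns, hidden_units, n_classes)
restored.load_weights('./dnn_weights')
```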

2. Functional API

```python
# Illustrative pseudocode; it does not run as-is. See the note below and the
# Functional API example at the end of this document for working code.
x = tf.feature_column.input_layer(shape=(5,))
for n in hidden_units:
    x = tf.keras.layers.Dense(n, activation='relu')(x)
pred = tf.keras.layers.Dense(n_classes, activation='softmax')(x)
model = tf.keras.models.Model(inputs=feature_columns, outputs=pred)
```

The functional API can work with the feature column API only by creating a tf.keras.Input for each original feature column. See this link for an example, and the sketch below.
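
A minimal sketch of this pattern follows. The feature names (`age`, `income`), layer sizes, and number of classes are hypothetical, chosen only to make the snippet self-contained.

```python
import tensorflow as tf

# One tf.keras.Input per source column; names and dtypes are hypothetical.
inputs = {
    'age': tf.keras.Input(shape=(1,), name='age', dtype=tf.float32),
    'income': tf.keras.Input(shape=(1,), name='income', dtype=tf.float32),
}
feature_columns = [tf.feature_column.numeric_column(name) for name in inputs]

# DenseFeatures consumes the dict of Keras inputs and emits a dense tensor.
x = tf.keras.layers.DenseFeatures(feature_columns)(inputs)
for n in [64, 32]:                                  # hidden units, chosen arbitrarily
    x = tf.keras.layers.Dense(n, activation='relu')(x)
pred = tf.keras.layers.Dense(3, activation='softmax')(x)   # 3 classes, arbitrary
model = tf.keras.Model(inputs=list(inputs.values()), outputs=pred)
```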

3. keras.Sequential

```python
model = tf.keras.Sequential()
model.add(tf.keras.layers.DenseFeatures(feature_columns))
for n in hidden_units:
    model.add(tf.keras.layers.Dense(n, activation='relu'))
model.add(tf.keras.layers.Dense(n_classes, activation='softmax'))
```

Please be aware that tf.keras.Sequential only covers models whose layers form a single linear stack. It cannot express many well-known architectures that branch or merge, including ResNet, Transformer, and Wide & Deep, as illustrated below.
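
For example, a Wide & Deep style model has two input branches that merge, which a linear stack cannot express but the functional API handles naturally. The sketch below uses hypothetical input names and sizes.

```python
import tensorflow as tf

# Two inputs whose branches merge; a Sequential stack cannot express this.
wide = tf.keras.Input(shape=(10,), name='wide_features')
deep = tf.keras.Input(shape=(10,), name='deep_features')
h = tf.keras.layers.Dense(32, activation='relu')(deep)
h = tf.keras.layers.Dense(16, activation='relu')(h)
merged = tf.keras.layers.concatenate([wide, h])
pred = tf.keras.layers.Dense(1, activation='sigmoid')(merged)
wide_and_deep = tf.keras.Model(inputs=[wide, deep], outputs=pred)
```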

The Choice

We chose the approach of subclassing tf.keras.Model according to the following table.

| Keras APIs | Work with feature column API | Save/load models | Model coverage |
| --- | --- | --- | --- |
| tf.keras.Model | ☑️ | weights only, no topology | High |
| Functional API | ☑️ | ☑️ | High |
| Sequential Model | ☑️ | ☑️ | Low |

A Subclass model Example

Here is an example DNNClassifier with multiple hidden layers, written as a Python class derived from tf.keras.Model. To run it, please use TensorFlow 2.0 alpha or a newer version.

```python
import tensorflow as tf


class DNNClassifier(tf.keras.Model):
    def __init__(self, feature_columns, hidden_units, n_classes):
        """DNNClassifier

        :param feature_columns: feature columns.
        :type feature_columns: list[tf.feature_column].
        :param hidden_units: number of hidden units per layer.
        :type hidden_units: list[int].
        :param n_classes: number of prediction classes.
        :type n_classes: int.
        """
        super(DNNClassifier, self).__init__()

        # Combines all the feature columns into a single dense tensor.
        self.feature_layer = tf.keras.layers.DenseFeatures(feature_columns)
        self.hidden_layers = []
        for hidden_unit in hidden_units:
            self.hidden_layers.append(tf.keras.layers.Dense(hidden_unit))
        self.prediction_layer = tf.keras.layers.Dense(n_classes, activation='softmax')

    def call(self, inputs):
        x = self.feature_layer(inputs)
        for hidden_layer in self.hidden_layers:
            x = hidden_layer(x)
        return self.prediction_layer(x)

    def default_optimizer(self):
        """Default optimizer name. Used in model.compile."""
        return 'adam'

    def default_loss(self):
        """Default loss function. Used in model.compile."""
        return 'categorical_crossentropy'

    def default_training_epochs(self):
        """Default training epochs. Used in model.fit."""
        return 5

    def prepare_prediction_column(self, prediction):
        """Return the class label of highest probability."""
        return prediction.argmax(axis=-1)
```
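
A hypothetical usage sketch of these conventions follows; it is not SQLFlow's actual training code. `feature_columns`, `train_dataset`, and `test_dataset` are assumed to be prepared by the caller, and the hidden units and class count are arbitrary.

```python
# Assumed to be prepared elsewhere: `feature_columns`, plus `train_dataset`
# and `test_dataset` (tf.data.Dataset objects yielding feature dicts).
model = DNNClassifier(feature_columns, hidden_units=[64, 32], n_classes=3)
model.compile(optimizer=model.default_optimizer(),
              loss=model.default_loss(),
              metrics=['accuracy'])
model.fit(train_dataset, epochs=model.default_training_epochs())

predictions = model.predict(test_dataset)
class_labels = model.prepare_prediction_column(predictions)
```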

A Functional API model Example

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers


def MyExampleModel(feature_columns, field_metas, learning_rate=0.01):
    feature_layer_inputs = dict()
    for fm in field_metas:
        feature_layer_inputs[fm.name] = tf.keras.Input(
            shape=(fm.shape), name=fm.name, dtype=fm.dtype)
    feature_layer = tf.keras.layers.DenseFeatures(feature_columns)
    feature_layer_outputs = feature_layer(feature_layer_inputs)
    x = layers.Dense(128, activation='relu')(feature_layer_outputs)
    x = layers.Dense(64, activation='relu')(x)
    pred = layers.Dense(1, activation='sigmoid')(x)
    return keras.Model(inputs=[v for v in feature_layer_inputs.values()],
                       outputs=pred)


def loss(output, labels):
    labels = tf.reshape(labels, [-1])
    return tf.reduce_mean(
        input_tensor=tf.nn.sparse_softmax_cross_entropy_with_logits(
            logits=output, labels=labels
        )
    )


def optimizer(lr=0.1):
    return tf.optimizers.SGD(lr)


def prepare_prediction_column(self, prediction):
    """Return the class label of highest probability."""
    return prediction.argmax(axis=-1)
```
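
Below is a hypothetical sketch of how a caller might wire these functions into a custom training loop; it is not SQLFlow's actual code. `feature_columns`, `field_metas`, and `train_dataset` are assumed to be prepared elsewhere.

```python
import tensorflow as tf

# Assumed: `train_dataset` is a tf.data.Dataset yielding (features_dict, labels).
model = MyExampleModel(feature_columns, field_metas)
opt = optimizer(lr=0.1)

for features, labels in train_dataset:
    with tf.GradientTape() as tape:
        outputs = model(features, training=True)
        loss_value = loss(outputs, labels)
    grads = tape.gradient(loss_value, model.trainable_variables)
    opt.apply_gradients(zip(grads, model.trainable_variables))
```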

Further Reading

We read the following Keras source code files: models.py, network.py, and training.py.