Reference: https://www.tensorflow.org/versions/r0.10/api_docs/python/nn/activation_functions_
1. Smooth nonlinearities:
(tf.nn.)sigmoid, tanh, elu, softplus, softsign
2. Continuous but not everywhere differentiable:
(tf.nn.)relu, relu6
3. Random regularization:
(tf.nn.)dropout (strictly a regularizer rather than an activation, but listed alongside the activation ops in the docs; all three groups are exercised in the sketch after this list)
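A minimal sketch of all three groups, assuming the TF 1.x Session API to match the r0.10 docs referenced above (the input values, keep_prob, and seed are illustrative choices):

import tensorflow as tf

x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])

# 1. Smooth nonlinearities: differentiable everywhere
smooth = [tf.nn.sigmoid(x), tf.nn.tanh(x), tf.nn.elu(x),
          tf.nn.softplus(x), tf.nn.softsign(x)]

# 2. Continuous but not everywhere differentiable: kink at 0 (relu6 also caps at 6)
piecewise = [tf.nn.relu(x), tf.nn.relu6(x)]

# 3. Random regularization: zeroes each element with probability 1 - keep_prob
#    and scales the survivors by 1/keep_prob (r0.10-era signature)
dropped = tf.nn.dropout(x, keep_prob=0.5, seed=42)

with tf.Session() as sess:
    for t in smooth + piecewise + [dropped]:
        print(sess.run(t))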
E.g., for tf.contrib.learn.DNNClassifier, the default activation function in __init__ is tf.nn.relu:
__init__(..., activation_fn=tf.nn.relu, ...)
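A usage sketch, assuming the TF 1.x contrib API (removed in TF 2.x); the feature column name, dimension, and layer sizes here are illustrative assumptions:

import tensorflow as tf

# Illustrative feature column; the name "x" and dimension=4 are assumptions.
feature_columns = [tf.contrib.layers.real_valued_column("x", dimension=4)]

# Override the default tf.nn.relu with a smooth nonlinearity from group 1.
classifier = tf.contrib.learn.DNNClassifier(
    hidden_units=[10, 10],            # assumed layer sizes
    feature_columns=feature_columns,
    n_classes=3,
    activation_fn=tf.nn.tanh)         # omit this to keep the tf.nn.relu default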