Reference: https://www.tensorflow.org/versions/r0.10/api_docs/python/nn/activation_functions_

1. Smooth nonlinearities:
(tf.nn.)sigmoid, tanh, elu, softplus, softsign

2. Continuous but not everywhere differentiable:
(tf.nn.)relu, relu6

3. Random regularization:
(tf.nn.)dropout
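
A minimal sketch exercising one function from each group above, written against the 1.x-era API the r0.10 docs describe (the keep_prob argument to dropout is from that API):

import tensorflow as tf

x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])

smooth = tf.nn.sigmoid(x)                      # smooth: differentiable everywhere
piecewise = tf.nn.relu(x)                      # continuous, but not differentiable at 0
regularized = tf.nn.dropout(x, keep_prob=0.5)  # randomly zeros elements, scales the rest by 1/keep_prob

with tf.Session() as sess:
    print(sess.run([smooth, piecewise, regularized]))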

e.g. in tf.contrib.learn.DNNClassifier, the default activation function in __init__ is relu:

__init__(..., activation_fn=tf.nn.relu, ...)
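
A short sketch of overriding that default; the feature column here is made up just for illustration:

import tensorflow as tf

# hypothetical feature column, just for illustration
feature_columns = [tf.contrib.layers.real_valued_column("x", dimension=4)]

# activation_fn defaults to tf.nn.relu; pass another tf.nn function to override it
classifier = tf.contrib.learn.DNNClassifier(
    hidden_units=[10, 20, 10],
    feature_columns=feature_columns,
    n_classes=3,
    activation_fn=tf.nn.tanh)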

Reference: https://www.tensorflow.org/get_started/get_started

1. Multiple levels of APIs:
(1) lowest level: TensorFlow Core, for when you need fine-grained control over the model
(2) higher levels: built on top of Core and easier to use, e.g. tf.contrib.learn
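
A minimal Core-level sketch, following the graph-then-session pattern that guide describes:

import tensorflow as tf

# Core level: first build a computational graph...
a = tf.constant(3.0)
b = tf.constant(4.0)
total = a + b          # a graph node, not yet a Python number

# ...then run it in a session to get concrete values
with tf.Session() as sess:
    print(sess.run(total))   # 7.0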

2. Tensor: an array of primitive values. Rank: the number of dimensions. Shape: a list giving the number of elements in each dimension.
e.g. [[[1., 2., 3.]], [[7., 8., 9.]]] has rank 3, shape [2,1,3]
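
The same example checked in code (get_shape() reads the static shape; tf.rank evaluates to the rank):

import tensorflow as tf

t = tf.constant([[[1., 2., 3.]], [[7., 8., 9.]]])
print(t.get_shape())             # (2, 1, 3): one entry per dimension

with tf.Session() as sess:
    print(sess.run(tf.rank(t)))  # 3: the number of dimensions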

3. Import library

import tensorflow as tf
