https://github.com/tensorflow/tensorflow/blob/r2.0/tensorflow/python/keras/activations.py
linear(x)
from tensorflow.python.ops import nn
from tensorflow.python.util.tf_export import keras_export


@keras_export('keras.activations.linear')
def linear(x):
  """Linear activation function.

  Arguments:
      x: Input tensor.

  Returns:
      The linear activation: `x`.
  """
  return x
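A quick usage sketch (the input values are illustrative, and it goes through the public tf.keras.activations.linear alias exported above): the linear activation simply returns its input unchanged.

import tensorflow as tf

# The linear (identity) activation returns the input tensor as-is.
x = tf.constant([-3.0, 0.0, 2.5])
y = tf.keras.activations.linear(x)
print(y.numpy())  # [-3.   0.   2.5]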
sigmoid(x)
f(z) = 1 / (1 + e^(-z))
@keras_export('keras.activations.sigmoid')
def sigmoid(x):
  """Sigmoid activation function.

  Applies the sigmoid activation function. The sigmoid function is defined as
  1 divided by (1 + exp(-x)). Its curve is like an "S" and is like a smoothed
  version of the Heaviside (unit step) function. For small values (< -5) the
  sigmoid returns a value close to zero, and for large values (> 5) the result
  of the function gets close to 1.

  Arguments:
      x: Input tensor.

  Returns:
      The sigmoid activation: `1.0 / (1.0 + exp(-x))`.
  """
  return nn.sigmoid(x)
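A quick usage sketch (the input values are illustrative, using the public tf.keras.activations.sigmoid alias) showing the saturation behaviour described in the docstring:

import tensorflow as tf

# Inputs below -5 map close to 0, inputs above 5 map close to 1,
# and 0 maps to exactly 0.5.
x = tf.constant([-10.0, -1.0, 0.0, 1.0, 10.0])
y = tf.keras.activations.sigmoid(x)
print(y.numpy())  # approximately [4.5e-05, 0.269, 0.5, 0.731, 0.99995]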
Finally
That is all of the TensorFlow activation function material collected here; the remaining activations can be found in the source file linked at the top.