tf.layers.ActivationLayer module
- class tf.layers.ActivationLayer.ActivationLayer(*args: Any, **kwargs: Any)
Bases: modelzoo.common.tf.layers.BaseLayer.BaseLayer
Wrapper around the Keras activation layer. Also supports activation="GeLU", activation="relu6", and activation="hard_swish", which are currently missing in keras.layers.Activation v2.2.
- Parameters
activation (Union[str, Callable]) – The function to be applied. This can either be a callable, the string name of a TensorFlow built-in activation, or one of "gelu", "lrelu" (lrelu denotes LeakyReLU), "relu6", or "hard_swish" (see the construction sketch below).
boundary_casting (bool) – If True, outputs the values in half precision and casts the input values up to full precision.
tf_summary (bool) – If True, saves the activations with the summary_layer.
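A minimal construction sketch. The import path is an assumption inferred from the Bases entry above (the modelzoo package must be on your Python path); only the parameters documented here are used.

    import tensorflow as tf
    # Assumed import path, mirroring modelzoo.common.tf.layers.BaseLayer.BaseLayer:
    from modelzoo.common.tf.layers.ActivationLayer import ActivationLayer

    # Build the layer with one of the extended activation names. "lrelu",
    # "relu6", and "hard_swish" are accepted in addition to the stock
    # Keras activation strings.
    act = ActivationLayer(
        activation="gelu",
        boundary_casting=False,  # half-precision outputs when True
        tf_summary=False,        # save activations via summary_layer when True
    )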
- call(inputs, **kwargs)
Apply the activation layer; see the usage example below.
- Parameters
inputs – Arbitrary tensor.
- Returns
A tensor of the same shape as the input.
- Return type
Tensor
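Continuing the construction sketch above, call() is invoked through the layer instance; the activation is applied elementwise, so the output shape always matches the input shape.

    x = tf.random.normal([4, 128])
    y = act(x)                   # dispatches to call(inputs)
    assert y.shape == x.shape    # elementwise op preserves shape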
- static gelu(x)
- static hard_swish(x)
- static relu6(x)
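The static helpers carry no docstrings here; below is a sketch of the standard definitions these names usually denote, written with plain TensorFlow ops. Whether the module uses the erf form or the tanh approximation of GELU is an assumption.

    import tensorflow as tf

    def gelu(x):
        # GELU, erf form: 0.5 * x * (1 + erf(x / sqrt(2)))
        return 0.5 * x * (1.0 + tf.math.erf(x / tf.sqrt(2.0)))

    def relu6(x):
        # ReLU capped at 6: min(max(x, 0), 6)
        return tf.minimum(tf.nn.relu(x), 6.0)

    def hard_swish(x):
        # Hard swish: x * relu6(x + 3) / 6
        return x * relu6(x + 3.0) / 6.0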