Module: tfa.activations

Additional activation functions.

Functions

gelu(...): Gaussian Error Linear Unit.

hardshrink(...): Hard shrink function.

lisht(...): LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function.

mish(...): Mish: A Self Regularized Non-Monotonic Neural Activation Function.

rrelu(...): Randomized leaky rectified linear unit function.

snake(...): Snake activation to learn periodic functions.

softshrink(...): Soft shrink function.

sparsemax(...): Sparsemax activation function.

tanhshrink(...): Tanh shrink function.
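
As a brief orientation, the sketch below shows the two common ways these functions are used, assuming `tensorflow` and `tensorflow-addons` are installed; the tensor values are arbitrary illustrative inputs, not taken from this reference.

```python
import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# Element-wise activations are plain functions applied to tensors.
print(tfa.activations.gelu(x))
print(tfa.activations.mish(x))

# sparsemax maps logits to a probability distribution along the
# last axis; unlike softmax, it can assign exact zeros.
logits = tf.constant([[1.0, 2.0, 3.0]])
print(tfa.activations.sparsemax(logits))

# The functions are also valid Keras layer activations.
layer = tf.keras.layers.Dense(4, activation=tfa.activations.mish)
```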