softplus(value)
  Softplus function.
  Parameters:
    value: float, value to process.
  Returns: float.
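The library itself is Pine Script, but softplus follows the standard definition softplus(x) = ln(1 + eˣ), a smooth approximation of ReLU. A minimal Python sketch of that formula (illustrative only, not the library's source):

```python
import math

def softplus(value: float) -> float:
    # softplus(x) = ln(1 + e^x); smooth, always positive,
    # approaches 0 for large negative x and x for large positive x
    return math.log(1.0 + math.exp(value))
```

For example, softplus(0) = ln(2) ≈ 0.6931.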
softsign(value)
  Softsign function.
  Parameters:
    value: float, value to process.
  Returns: float.
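Softsign is the standard bounded activation x / (1 + |x|), which maps any real input into (-1, 1). A Python sketch of the formula, for illustration:

```python
def softsign(value: float) -> float:
    # softsign(x) = x / (1 + |x|); odd function, bounded in (-1, 1),
    # with softer saturation than tanh
    return value / (1.0 + abs(value))
```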
elu(value, alpha)
  Exponential Linear Unit (ELU) function.
  Parameters:
    value: float, value to process.
    alpha: float, default=1.0, predefined constant; controls the value to which an ELU saturates for negative net inputs.
  Returns: float.
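The ELU formula is piecewise: the identity for positive inputs, and alpha · (eˣ − 1) for non-positive inputs, so negative outputs saturate toward −alpha. A Python sketch under that standard definition:

```python
import math

def elu(value: float, alpha: float = 1.0) -> float:
    # ELU(x) = x              for x > 0
    #        = alpha*(e^x - 1) for x <= 0  (saturates toward -alpha)
    return value if value > 0 else alpha * (math.exp(value) - 1.0)
```

For strongly negative inputs, elu(x) approaches −alpha, which is why alpha is described as controlling the saturation value.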
selu(value, alpha, scale)
  Scaled Exponential Linear Unit (SELU) function.
  Parameters:
    value: float, value to process.
    alpha: float, default=1.67326324, predefined constant; controls the value to which a SELU saturates for negative net inputs.
    scale: float, default=1.05070098, predefined constant.
  Returns: float.
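SELU is simply scale · ELU(x, alpha) with the two fixed self-normalizing constants given above as defaults. A Python sketch of the standard formula:

```python
import math

def selu(value: float,
         alpha: float = 1.67326324,
         scale: float = 1.05070098) -> float:
    # SELU(x) = scale * x               for x > 0
    #         = scale * alpha*(e^x - 1) for x <= 0
    return scale * value if value > 0 else scale * alpha * (math.exp(value) - 1.0)
```

With the default constants, negative inputs saturate toward −scale · alpha ≈ −1.758.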
exponential(value)
  Exponential function; a thin wrapper around the built-in math.exp().
  Parameters:
    value: float, value to process.
  Returns: float.
function(name, value, alpha, scale)
  Evaluates the activation function selected by name.
  Parameters:
    name: string, name of the activation function.
    value: float, value to process.
    alpha: float, default=na, passed through if the selected function requires it.
    scale: float, default=na, passed through if the selected function requires it.
  Returns: float.
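A name-based dispatcher like function() can be sketched in Python as follows. The name strings, the treatment of na as None, and the per-function fallback constants are assumptions for illustration; the Pine library's exact accepted names may differ.

```python
import math

def act(name: str, value: float,
        alpha: float = None, scale: float = None) -> float:
    # Hypothetical dispatcher mirroring function(): na defaults become None,
    # and each branch falls back to its conventional constant when unset.
    if name == "softplus":
        return math.log(1.0 + math.exp(value))
    if name == "softsign":
        return value / (1.0 + abs(value))
    if name == "elu":
        a = 1.0 if alpha is None else alpha
        return value if value > 0 else a * (math.exp(value) - 1.0)
    if name == "selu":
        a = 1.67326324 if alpha is None else alpha
        s = 1.05070098 if scale is None else scale
        return s * value if value > 0 else s * a * (math.exp(value) - 1.0)
    if name == "exponential":
        return math.exp(value)
    raise ValueError(f"unknown activation: {name}")
```

Usage: act("elu", -1.0) applies ELU with its default alpha, while act("elu", -1.0, alpha=0.5) overrides it.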
derivative(name, value, alpha, scale)
  Evaluates the derivative of the activation function selected by name.
  Parameters:
    name: string, name of the activation function.
    value: float, value to process.
    alpha: float, default=na, passed through if the selected function requires it.
    scale: float, default=na, passed through if the selected function requires it.
  Returns: float.
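The derivatives of these activations have simple closed forms: softplus′(x) is the sigmoid 1/(1 + e⁻ˣ), softsign′(x) = 1/(1 + |x|)², ELU′(x) is 1 for x > 0 and alpha · eˣ otherwise, and SELU′ is the same scaled by scale. A Python sketch of those standard formulas (illustrative; not necessarily the library's exact branch structure):

```python
import math

def act_derivative(name: str, value: float,
                   alpha: float = None, scale: float = None) -> float:
    # Closed-form derivatives of the activations defined above.
    if name == "softplus":
        return 1.0 / (1.0 + math.exp(-value))   # sigmoid(x)
    if name == "softsign":
        return 1.0 / (1.0 + abs(value)) ** 2
    if name == "elu":
        a = 1.0 if alpha is None else alpha
        return 1.0 if value > 0 else a * math.exp(value)
    if name == "selu":
        a = 1.67326324 if alpha is None else alpha
        s = 1.05070098 if scale is None else scale
        return s if value > 0 else s * a * math.exp(value)
    if name == "exponential":
        return math.exp(value)                  # d/dx e^x = e^x
    raise ValueError(f"unknown activation: {name}")
```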
In true TradingView spirit, the author has published this Pine code as an open-source library so that other Pine programmers from our community can reuse it. Cheers to the author! You may use this library privately or in other open-source publications, but reuse of this code in a publication is governed by House rules.