Package dk.alexandra.fresco.stat.mlp.activationfunction

  • Class Summary 
    Relu
      Compute the rectified linear function f(x) = x if x > 0 and f(x) = 0 otherwise.
    ReluDerivative
      Compute the derivative of the Relu function, i.e. f'(x) = 1 if x > 0 and f'(x) = 0 otherwise.
    Sigmoid
      Compute the sigmoid (logistic) function f(x) = 1 / (1 + e^(-x)).
    SigmoidDerivative
      Compute the derivative of the Sigmoid function f'(x) given the function value y = f(x) at that point, i.e. f'(x) = y(1 - y).
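    For reference, a minimal clear-text sketch of the four formulas above, written in plain Java on doubles. This is an illustration only, assuming nothing about the classes themselves, which are built as secure computations in FRESCO and do not expose this API:

      // Clear-text reference for the four formulas listed above (illustration only).
      final class ClearActivationFormulas {

        // Relu: f(x) = x if x > 0, otherwise 0.
        static double relu(double x) {
          return x > 0 ? x : 0.0;
        }

        // ReluDerivative: f'(x) = 1 if x > 0, otherwise 0.
        static double reluDerivative(double x) {
          return x > 0 ? 1.0 : 0.0;
        }

        // Sigmoid: f(x) = 1 / (1 + e^(-x)).
        static double sigmoid(double x) {
          return 1.0 / (1.0 + Math.exp(-x));
        }

        // SigmoidDerivative: f'(x) = y * (1 - y), given the function value y = f(x).
        static double sigmoidDerivative(double y) {
          return y * (1.0 - y);
        }
      }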
  • Enum Summary 
    ActivationFunction
      This enum represents the available activation functions f: R^n → R^n for use with neural networks.
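    A hypothetical clear-text analogue of such an enum, pairing each activation with its derivative. The names, constants and signatures below are illustrative assumptions, not the actual ActivationFunction API; the library's enum is described as mapping R^n to R^n, whereas this stand-in applies a scalar function that would be used elementwise:

      import java.util.function.DoubleUnaryOperator;

      // Hypothetical stand-in for an activation-function enum on clear doubles.
      enum ClearActivation {
        RELU(x -> x > 0 ? x : 0.0,
             x -> x > 0 ? 1.0 : 0.0),
        SIGMOID(x -> 1.0 / (1.0 + Math.exp(-x)),
                x -> {
                  double y = 1.0 / (1.0 + Math.exp(-x));
                  return y * (1.0 - y);
                });

        private final DoubleUnaryOperator function;
        private final DoubleUnaryOperator derivative;

        ClearActivation(DoubleUnaryOperator function, DoubleUnaryOperator derivative) {
          this.function = function;
          this.derivative = derivative;
        }

        // Evaluate the activation at x.
        double apply(double x) {
          return function.applyAsDouble(x);
        }

        // Evaluate the derivative of the activation at x.
        double differentiate(double x) {
          return derivative.applyAsDouble(x);
        }
      }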