mlpack  3.0.4
ELU< InputDataType, OutputDataType > Class Template Reference

The ELU activation function, defined by. More...

Public Member Functions

 ELU ()
 Create the ELU object. More...
 
 ELU (const double alpha)
 Create the ELU object using the specified parameter. More...
 
double const & Alpha () const
 Get the non-zero gradient. More...
 
double & Alpha ()
 Modify the non-zero gradient. More...
 
template<typename DataType >
void Backward (const DataType &&input, DataType &&gy, DataType &&g)
 Ordinary feed backward pass of a neural network, calculating the function f(x) by propagating x backwards through f. More...
 
OutputDataType const & Delta () const
 Get the delta. More...
 
OutputDataType & Delta ()
 Modify the delta. More...
 
template<typename InputType , typename OutputType >
void Forward (const InputType &&input, OutputType &&output)
 Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. More...
 
InputDataType const & InputParameter () const
 Get the input parameter. More...
 
InputDataType & InputParameter ()
 Modify the input parameter. More...
 
double const & Lambda () const
 Get the lambda parameter. More...
 
OutputDataType const & OutputParameter () const
 Get the output parameter. More...
 
OutputDataType & OutputParameter ()
 Modify the output parameter. More...
 
template<typename Archive >
void serialize (Archive &ar, const unsigned int)
 Serialize the layer. More...
 

Detailed Description

template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
class mlpack::ann::ELU< InputDataType, OutputDataType >

The ELU activation function, defined by.

\begin{eqnarray*} f(x) &=& \left\{ \begin{array}{lr} x & : x > 0 \\ \alpha(e^x - 1) & : x \le 0 \end{array} \right. \\ f'(x) &=& \left\{ \begin{array}{lr} 1 & : x > 0 \\ y + \alpha & : x \le 0 \end{array} \right. \end{eqnarray*}
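The piecewise definition above can be sketched in plain C++ (a standalone illustration of the math only, not the mlpack implementation, which operates on Armadillo matrices):

```cpp
#include <cassert>
#include <cmath>

// ELU activation: f(x) = x for x > 0, alpha * (exp(x) - 1) otherwise.
double eluForward(double x, double alpha)
{
  return x > 0 ? x : alpha * (std::exp(x) - 1.0);
}

// ELU derivative expressed through the forward output y = f(x):
// f'(x) = 1 for x > 0, and y + alpha otherwise
// (since alpha * (exp(x) - 1) + alpha = alpha * exp(x) = f'(x)).
double eluDeriv(double x, double y, double alpha)
{
  return x > 0 ? 1.0 : y + alpha;
}
```

Expressing the derivative through y is why the backward pass below reuses the forward results instead of recomputing the exponential.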

For more information, read the following paper:

@article{Clevert2015,
  author  = {Djork{-}Arn{\'{e}} Clevert and Thomas Unterthiner and
             Sepp Hochreiter},
  title   = {Fast and Accurate Deep Network Learning by Exponential Linear
             Units (ELUs)},
  journal = {CoRR},
  year    = {2015}
}

The SELU activation function is defined by

\begin{eqnarray*} f(x) &=& \left\{ \begin{array}{lr} \lambda * x & : x > 0 \\ \lambda * \alpha(e^x - 1) & : x \le 0 \end{array} \right. \\ f'(x) &=& \left\{ \begin{array}{lr} \lambda & : x > 0 \\ \lambda * (y + \alpha) & : x \le 0 \end{array} \right. \end{eqnarray*}
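The same sketch extends to SELU; the fixed constants λ ≈ 1.0507 and α ≈ 1.6733 are the values derived in the Klambauer et al. paper (again a standalone illustration, not the mlpack implementation):

```cpp
#include <cassert>
#include <cmath>

// SELU constants from Klambauer et al. (2017).
const double kLambda = 1.0507009873554805;
const double kAlpha  = 1.6732632423543772;

// SELU activation: lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise.
double seluForward(double x)
{
  return x > 0 ? kLambda * x : kLambda * kAlpha * (std::exp(x) - 1.0);
}

// SELU derivative via the scaled output y = f(x): lambda for x > 0,
// y + lambda * alpha otherwise (both forms equal lambda * alpha * exp(x)
// for x <= 0; the formula above writes it with the unscaled ELU output).
double seluDeriv(double x, double y)
{
  return x > 0 ? kLambda : y + kLambda * kAlpha;
}
```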

For more information, read the following paper:

@article{Klambauer2017,
  author  = {Gunter Klambauer and Thomas Unterthiner and Andreas Mayr},
  title   = {Self-Normalizing Neural Networks},
  journal = {Advances in Neural Information Processing Systems},
  year    = {2017}
}
Note
Make sure to use the SELU activation function with normalized inputs and weights initialized with Lecun normal initialization.
Template Parameters
InputDataType: Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
OutputDataType: Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Definition at line 105 of file elu.hpp.

Constructor & Destructor Documentation

ELU ( )

Create the ELU object.

NOTE: Use this constructor for the SELU activation function.

ELU ( const double  alpha)

Create the ELU object using the specified parameter.

The non-zero gradient for negative inputs can be adjusted by specifying the ELU hyperparameter alpha (alpha > 0).

Note
Use this constructor for the ELU activation function.
Parameters
alpha: Scale parameter for the negative factor.

Member Function Documentation

double const& Alpha ( ) const
inline

Get the non-zero gradient.

Definition at line 163 of file elu.hpp.

double& Alpha ( )
inline

Modify the non-zero gradient.

Definition at line 165 of file elu.hpp.

void Backward ( const DataType &&  input,
DataType &&  gy,
DataType &&  g 
)

Ordinary feed-backward pass of a neural network, calculating the gradient of the function f(x) by propagating x backwards through f, using the results from the feed-forward pass.

Parameters
input: The propagated input activation.
gy: The backpropagated error.
g: The calculated gradient.
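The backward pass scales each backpropagated error by the ELU derivative, reusing the forward output so no exponential is recomputed. A minimal standalone sketch, with std::vector standing in for the Armadillo types the real layer uses:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Backward pass sketch: g[i] = gy[i] * f'(x[i]), where the derivative for
// x <= 0 is recovered from the stored forward output y[i] as y[i] + alpha
// (equal to alpha * exp(x[i])).
std::vector<double> eluBackward(const std::vector<double>& x,
                                const std::vector<double>& y,
                                const std::vector<double>& gy,
                                double alpha)
{
  std::vector<double> g(x.size());
  for (std::size_t i = 0; i < x.size(); ++i)
    g[i] = gy[i] * (x[i] > 0 ? 1.0 : y[i] + alpha);
  return g;
}
```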
OutputDataType const& Delta ( ) const
inline

Get the delta.

Definition at line 158 of file elu.hpp.

OutputDataType& Delta ( )
inline

Modify the delta.

Definition at line 160 of file elu.hpp.

void Forward ( const InputType &&  input,
OutputType &&  output 
)

Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

Parameters
input: Input data used for evaluating the specified function.
output: Resulting output activation.
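The forward pass is simply the elementwise application of the activation above. A standalone sketch (std::vector instead of the Armadillo matrix the real layer uses):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Forward pass sketch: apply ELU elementwise to the input activation.
std::vector<double> eluForwardPass(const std::vector<double>& input,
                                   double alpha)
{
  std::vector<double> output(input.size());
  for (std::size_t i = 0; i < input.size(); ++i)
    output[i] = input[i] > 0 ? input[i]
                             : alpha * (std::exp(input[i]) - 1.0);
  return output;
}
```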
InputDataType const& InputParameter ( ) const
inline

Get the input parameter.

Definition at line 148 of file elu.hpp.

InputDataType& InputParameter ( )
inline

Modify the input parameter.

Definition at line 150 of file elu.hpp.

double const& Lambda ( ) const
inline

Get the lambda parameter.

Definition at line 168 of file elu.hpp.

OutputDataType const& OutputParameter ( ) const
inline

Get the output parameter.

Definition at line 153 of file elu.hpp.

OutputDataType& OutputParameter ( )
inline

Modify the output parameter.

Definition at line 155 of file elu.hpp.

void serialize ( Archive &  ar,
const unsigned  int 
)

Serialize the layer.


The documentation for this class was generated from the following file: elu.hpp