# Simple custom layer example: Antirectifier

**Author:** fchollet<br>
**Date created:** 2016/01/06<br>
**Last modified:** 2023/11/20<br>
**Description:** Demonstration of custom layer creation.
## Introduction
This example shows how to create custom layers, using the Antirectifier layer (originally proposed as a Keras example script in January 2016), an alternative to ReLU. Instead of zeroing out the negative part of the input, it splits the input into its negative and positive parts and returns the concatenation of their absolute values. This avoids loss of information, at the cost of an increase in dimensionality. To undo the dimensionality increase, we linearly combine the features back to a space of the original size.
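The core transformation can be sketched in NumPy (the helper name `antirectifier` is hypothetical; the actual example implements this as a Keras layer with a trainable projection back to the original size):

```python
import numpy as np

def antirectifier(x):
    """Sketch of the Antirectifier non-linearity, without the trainable
    linear projection that restores the original feature size.

    Instead of zeroing out negatives like ReLU, it concatenates the
    positive part and the absolute value of the negative part, doubling
    the feature dimension while preserving all magnitude information.
    """
    x = x - x.mean(axis=-1, keepdims=True)  # center features, as in the example
    pos = np.maximum(x, 0.0)                # positive part
    neg = np.maximum(-x, 0.0)               # absolute value of the negative part
    return np.concatenate([pos, neg], axis=-1)

x = np.array([[1.0, -2.0, 3.0, -4.0]])
y = antirectifier(x)  # feature dimension doubles: (1, 4) -> (1, 8)
```

Note that `y[..., :4] - y[..., 4:]` reconstructs the centered input exactly, which is the sense in which no information is lost.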
## Setup
## The Antirectifier layer
To implement a custom layer:

- Create the state variables via `add_weight()` in `__init__` or `build()`. Similarly, you can also create sublayers.
- Implement the `call()` method, taking the layer's input tensor(s) and returning the output tensor(s).
- Optionally, you can also enable serialization by implementing `get_config()`, which returns a configuration dictionary.
See also the guide Making new layers and models via subclassing.
## Let's test-drive it on MNIST
Final test loss and accuracy from `model.evaluate`:

```
[0.19070196151733398, 0.9740999937057495]
```