FReLU (Funnel Activation) is an activation function in the Rectified Linear Unit (ReLU) family, introduced at ECCV 2020. FReLU extends ReLU and PReLU to a 2D activation by adding a spatial condition, at negligible computational overhead.
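Concretely, the funnel activation computes `f(x) = max(x, T(x))`, where the spatial condition `T(x)` is a per-channel (depthwise) convolution followed by batch normalization, as described in the paper. Here is a minimal PyTorch sketch of that idea; the class name, the default 3×3 window, and the bias-free convolution are illustrative choices, not a definitive reference implementation:

```python
import torch
import torch.nn as nn

class FReLU(nn.Module):
    """Funnel activation: f(x) = max(x, T(x)), where T is a spatial
    condition -- a k x k depthwise convolution followed by BatchNorm."""

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        # groups=channels makes the convolution depthwise: each channel
        # is filtered independently, keeping the overhead small.
        self.conv = nn.Conv2d(
            channels, channels, kernel_size,
            padding=kernel_size // 2, groups=channels, bias=False,
        )
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x):
        # Element-wise max between the input and its spatial condition.
        return torch.max(x, self.bn(self.conv(x)))

# Quick shape check on a dummy feature map.
x = torch.randn(2, 64, 32, 32)
print(FReLU(64)(x).shape)  # torch.Size([2, 64, 32, 32])
```

In a ResNet, a module like this can drop in wherever a plain `nn.ReLU` would normally follow a convolution, which is how we'll use it below.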
In this notebook we'll implement the FReLU activation function in PyTorch, then train and compare its performance across several ResNet architectures.
For a more detailed breakdown of the theory behind the code, check out the full tutorial on the blog.