Project Details

Funnel Activation (FReLU) for Visual Recognition

See the novel FReLU activation function in action with PyTorch

By The Gradient Team

Description

FReLU (Funnel Activation) is a novel activation function in the Rectified Linear Unit (ReLU) family, introduced at ECCV 2020. FReLU extends ReLU and PReLU into a 2D activation by adding a spatial condition at negligible computational overhead.
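Concretely, FReLU computes f(x) = max(x, T(x)), where T(x) (the "funnel condition") is a per-channel depthwise convolution over a local window. A minimal sketch in PyTorch might look like the following; the 3x3 window and the batch-norm after the depthwise convolution follow the paper's described setup, but treat this as an illustrative implementation rather than the exact code used in the notebook:

```python
import torch
import torch.nn as nn


class FReLU(nn.Module):
    """Funnel activation: f(x) = max(x, T(x)).

    T(x) is a depthwise 3x3 convolution (groups=channels), so each
    channel gathers its own local spatial context, followed by
    batch normalization.
    """

    def __init__(self, channels: int):
        super().__init__()
        # Depthwise conv: groups=channels applies one 3x3 filter per channel.
        self.conv = nn.Conv2d(channels, channels, kernel_size=3,
                              padding=1, groups=channels, bias=False)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Element-wise max between the input and its spatial condition.
        return torch.max(x, self.bn(self.conv(x)))
```

Dropping this module in wherever a ResNet block would normally apply `nn.ReLU` gives the FReLU variant of the architecture; the output shape matches the input, and because of the element-wise max, the output is never smaller than the input.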

In this notebook we'll implement the FReLU activation function in PyTorch, then train and evaluate several ResNet architectures that use it.

For a more detailed breakdown of the theory behind the code, check out the full tutorial on the blog.