Parametric ReLU (PReLU)

Parametric ReLU (PReLU) is another alternative to the standard ReLU unit designed to counter the dying ReLU problem. It extends the Leaky ReLU unit by making the slope of the negative part a learnable parameter a, i.e. f(x) = x for x ≥ 0 and f(x) = a·x for x < 0. The slope a is updated using back-propagation and SGD like any other parameter.
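
As a rough illustration of what the layer computes, here is a minimal sketch of a PReLU module with the slope stored as a learnable parameter (the class name and initial value are illustrative, not part of PyTorch's API):

import torch
import torch.nn as nn

class SimplePReLU(nn.Module):
    def __init__(self, init_alpha=0.25):
        super().__init__()
        # The negative slope is an nn.Parameter, so it receives
        # gradients and is updated by the optimizer like any weight.
        self.alpha = nn.Parameter(torch.tensor(init_alpha))

    def forward(self, x):
        # f(x) = x for x >= 0, alpha * x for x < 0
        return torch.where(x >= 0, x, self.alpha * x)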

The PReLU function is the identity for positive inputs and a line through the origin with slope a for negative inputs, so its shape matches that of Leaky ReLU but with a trainable negative slope.

PyTorch Usage

import torch
import torch.nn as nn

# Single learnable slope shared across all inputs (default)
layer = nn.PReLU()
input = torch.randn(2)
output = layer(input)
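
PReLU can also learn one slope per channel by passing num_parameters; a brief sketch follows (the channel count and tensor shape here are just illustrative):

# One learnable slope per channel, e.g. after a conv layer with 16 channels
per_channel = nn.PReLU(num_parameters=16, init=0.25)
x = torch.randn(4, 16, 8, 8)   # (batch, channels, height, width)
y = per_channel(x)
print(per_channel.weight.shape)  # torch.Size([16]), one slope per channel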

Refer to torch.nn.PReLU for more details.