# ReLU

A Rectified Linear Unit (ReLU for short) is an operation that takes a tensor as input and outputs a tensor of the same size. It applies $\max(0, x)$ to each element of the input tensor, so negative entries become zero and non-negative entries pass through unchanged.
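The element-wise operation can be sketched in plain Python (without PyTorch), here over a list of floats standing in for a flat tensor:

```python
def relu(values):
    # Element-wise max(0, x): negatives map to 0.0, non-negatives pass through.
    return [max(0.0, v) for v in values]

print(relu([-2.0, -0.5, 0.0, 1.5, 3.0]))  # [0.0, 0.0, 0.0, 1.5, 3.0]
```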

## PyTorch Usage

```python
>>> import torch
>>> import torch.nn as nn
>>> m = nn.ReLU()
>>> input = torch.randn(128, 20)
>>> output = m(input)
>>> print(output.size())
torch.Size([128, 20])
```

See `torch.nn.ReLU` and Wikipedia for more details.