Unleashing the Power of the ReLU Activation Function in Neural Networks