Final answer:
1) ReLU. The activation function to implement is ReLU, which is commonly used in the hidden layers of neural networks.
Step-by-step explanation:
The activation function to implement is ReLU (Rectified Linear Unit), which is commonly used in the hidden layers of neural networks. It is defined as f(x) = max(0, x), where x is the input to the function: ReLU returns 0 for negative inputs and the input value unchanged for positive inputs.
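For example, applied elementwise to the inputs (-2, 0, 3), ReLU gives (0, 0, 3).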
ReLU forward pass:
- Initialize an output matrix of the same size as the input matrix.
- For each element in the input matrix, if the value is positive, assign that value to the corresponding element in the output matrix. Otherwise, assign 0.
- Return the output matrix as the result of the forward pass (a minimal sketch follows this list).
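A minimal NumPy sketch of these steps; the function name relu_forward and the returned cache (the saved input, reused by the backward pass) are illustrative choices, not something given in the question:

```python
import numpy as np

def relu_forward(x):
    """ReLU forward pass: elementwise max(0, x).

    Returns the output plus a cache of the input, which the
    backward pass needs to rebuild the 0/1 mask.
    """
    out = np.maximum(0, x)  # copies positive entries, zeroes out the rest
    cache = x
    return out, cache
```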
ReLU backward pass:
- Initialize a gradient matrix of the same size as the input matrix.
- For each element in the input matrix, if the value is positive, assign 1 to the corresponding element in the gradient matrix; otherwise, assign 0 (by convention, the gradient at exactly 0 is taken to be 0).
- Multiply this gradient matrix element-wise with the incoming gradient from the next layer.
- Return the resulting matrix as the gradient with respect to the input (a minimal sketch follows this list).
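A minimal NumPy sketch of the backward pass under the same assumptions; dout is the incoming gradient from the next layer, and cache is the input saved by the relu_forward sketch above:

```python
def relu_backward(dout, cache):
    """ReLU backward pass.

    dout  -- incoming gradient from the next layer, same shape as the input
    cache -- the input x saved during the forward pass
    """
    x = cache
    dx = dout * (x > 0)  # mask is 1 where x > 0, 0 elsewhere (including x == 0)
    return dx
```

For instance, with x = np.array([-1.0, 2.0, 0.5]) the forward pass yields [0.0, 2.0, 0.5], and relu_backward(np.ones_like(x), x) returns [0.0, 1.0, 1.0].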