In this question, you will implement the layers needed for basic classification neural networks. For each part, you will be asked to 1) derive the gradients and 2) write the matching code. Keep in mind that your solution to all layers must operate on mini-batches of data and should not use loops to iterate over the training points individually. 3.1 Activation Functions: First, you will implement an activation function in ____. You will implement the forward and backward passes for the ReLU activation function, commonly used in the hidden layers of neural networks. What is the activation function you need to implement?

1) ReLU
2) Sigmoid
3) Tanh
4) Softmax

1 Answer

Final answer:

1) ReLU. The activation function to implement is ReLU, which is commonly used in the hidden layers of neural networks.

Step-by-step explanation:

The activation function to implement is ReLU (Rectified Linear Unit), commonly used in the hidden layers of neural networks. It is defined as f(x) = max(0, x), where x is the input: ReLU returns 0 for negative inputs and passes positive inputs through unchanged. For example, applied element-wise to the vector [-2, 0, 3], it produces [0, 0, 3]. Since the question requires operating on mini-batches without looping over individual training points, both passes below should be vectorized element-wise operations.

ReLU forward pass:

  1. Initialize an output matrix of the same size as the input matrix.
  2. Set each element of the output to the corresponding input value if that value is positive, and to 0 otherwise (as a single vectorized operation rather than a Python loop, per the mini-batch requirement).
  3. Return the output matrix as the result of the forward pass (see the sketch below).
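
A minimal NumPy sketch of these steps, under the common convention that a layer returns both its output and a cache for the backward pass (the names relu_forward and cache, and the (out, cache) return signature, are illustrative assumptions, not given by the question):

```python
import numpy as np

def relu_forward(x):
    """ReLU forward pass on a mini-batch.

    x: array of shape (batch_size, num_features) -- or any shape,
       since ReLU is applied element-wise.
    Returns the activated output and a cache (the input) that the
    backward pass will need.
    """
    out = np.maximum(0, x)  # element-wise max(0, x); fully vectorized, no loops
    cache = x               # store the input to recover the mask in the backward pass
    return out, cache
```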

ReLU backward pass:

  1. Initialize a gradient matrix of the same size as the input matrix.
  2. For each element of the input matrix, set the corresponding element of the gradient matrix to 1 if the value is positive and 0 otherwise (by convention, the subgradient at exactly 0 is taken to be 0); this is the local derivative f'(x).
  3. Multiply the gradient matrix element-wise by the incoming gradient from the next layer, as the chain rule requires.
  4. Return the result as the gradient with respect to the input (see the sketch below).
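
A matching NumPy sketch of the backward pass, again assuming the hypothetical relu_forward/relu_backward naming and the cache convention from the sketch above:

```python
def relu_backward(dout, cache):
    """ReLU backward pass.

    dout:  upstream gradient from the next layer, same shape as the input.
    cache: the input x saved by relu_forward.
    Returns dx, the gradient of the loss with respect to the input.
    """
    x = cache
    dx = dout * (x > 0)  # local derivative is 1 where x > 0, else 0
    return dx
```

As a quick sanity check: for x = [[-1.0, 2.0]] and dout = [[5.0, 5.0]], relu_backward returns [[0.0, 5.0]], since the gradient is zeroed wherever the forward input was non-positive.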