The following question asks about the neural network below, where we set w₀ = -5, w₁ = 2, w₂ = -1, and w₃ = 3. x₁, x₂, and x₃ represent input neurons, and y represents the output neuron. What value will this network compute for y given inputs x₁ = 3, x₂ = 2, and x₃ = 4 if we use a step activation function? What if we use a ReLU activation function?

a) 0 for step activation function, 0 for ReLU activation function
b) 0 for step activation function, 1 for ReLU activation function
c) 1 for step activation function, 0 for ReLU activation function
d) 1 for step activation function, 1 for ReLU activation function

1 Answer


Final answer:

The value of y in the given neural network is 1 with a step activation function and 11 with a ReLU activation function.

Step-by-step explanation:

To compute the value of y using a step activation function, we calculate the weighted sum of the inputs and apply the step function to the result. The weighted sum is w₀ + w₁·x₁ + w₂·x₂ + w₃·x₃ = -5 + 2·3 + (-1)·2 + 3·4 = -5 + 6 - 2 + 12 = 11. Since the step function returns 0 for values less than or equal to 0 and 1 for values greater than 0, the output y is 1.

If we use a ReLU (Rectified Linear Unit) activation function, the weighted sum is calculated the same way. The ReLU function returns the input itself for positive inputs and 0 for non-positive inputs, so the output y is 11 because the weighted sum is positive.
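As a quick check, the same computation can be reproduced in a few lines of Python (a minimal sketch; the variable names simply mirror the weights and inputs given in the question):

    # Weights and inputs from the question
    w0, w1, w2, w3 = -5, 2, -1, 3
    x1, x2, x3 = 3, 2, 4

    # Weighted sum: w0 + w1*x1 + w2*x2 + w3*x3
    z = w0 + w1 * x1 + w2 * x2 + w3 * x3   # -5 + 6 - 2 + 12 = 11

    # Step activation: 1 for positive values, 0 otherwise
    step = 1 if z > 0 else 0

    # ReLU activation: the value itself if positive, 0 otherwise
    relu = max(0, z)

    print(z, step, relu)   # prints: 11 1 11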

answered by Kshitij Godara (8.1k points)