Using the simple neural network model described below, develop your own function that applies backpropagation to update the weights in the model.

A neural network with 2 inputs (A, B), 3 hidden-layer neurons (M, N, O), and 3 output neurons (X, Y, Z). Instructions:

- Declare a function named "CustomBPweight()" that accepts 5 inputs:
  - a target output list named "TargetOutput" (given)
  - a current output list named "CurrentOutput" (given)
  - a neuron output list named "NeuronOutput" (given)
  - a current weights list named "CurrentWeights" (estimate them)
  - a learning rate variable named "LearningRate" (given)
- The function "CustomBPweight()" should return a list containing all updated weights; name this list "UpdateWeights".
- For all given data, every element value is between 0 and 1.
- Identify the total number of weights according to the given model.

1 Answer


Final answer:

To create a function called CustomBPweight that performs weight updates using backpropagation for a neural network, you need to calculate gradients and update all 15 connection weights using gradient descent.

Step-by-step explanation:

To address the question, one must understand how weights are updated through gradient descent. The given model has 2 inputs, 3 hidden neurons, and 3 output neurons, and each connection between neurons carries one weight. The total is therefore (2 inputs × 3 hidden neurons) + (3 hidden neurons × 3 output neurons) = 6 + 9 = 15 weights to update.
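The count can be verified directly (no bias weights are mentioned in the question, so none are counted here):

```python
# Connections in the 2-3-3 network: input->hidden plus hidden->output
n_inputs, n_hidden, n_outputs = 2, 3, 3
total_weights = n_inputs * n_hidden + n_hidden * n_outputs
print(total_weights)  # 15
```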

The CustomBPweight() function would compute the error between the current output and the target output, then backpropagate that error to obtain a gradient for every weight. Since the network has 3 layers (input, hidden, and output), the function must calculate the gradient of the loss with respect to each neuron's output, and then update each weight by subtracting the product of its gradient and the learning rate from its current value.
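The per-weight update described above is plain gradient descent. As a one-line sketch (the function and variable names here are illustrative, not part of the question):

```python
def update_weight(w, gradient, learning_rate):
    # New weight = old weight minus (learning rate * gradient of the loss w.r.t. w)
    return w - learning_rate * gradient

print(update_weight(0.5, 0.2, 0.1))  # approximately 0.48
```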

Below is a high-level conceptual representation of the function:

def CustomBPweight(TargetOutput, CurrentOutput, NeuronOutput, CurrentWeights, LearningRate):
    # 1. Calculate error gradients (deltas) for the output layer
    # 2. Backpropagate the error to the hidden layer
    # 3. Update the weights for the input-to-hidden layer
    # 4. Update the weights for the hidden-to-output layer
    # 5. Return the updated weights in a list
    UpdateWeights = []
    # Logic for gradient descent and weight updates goes here
    return UpdateWeights
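One possible way to fill in that skeleton is shown below. The question does not specify the activation function or how NeuronOutput and CurrentWeights are ordered, so this sketch assumes a logistic-sigmoid activation (derivative o * (1 - o) at an output value o), that NeuronOutput holds the two input values A, B followed by the three hidden outputs M, N, O, and that CurrentWeights stores the 6 input-to-hidden weights first (grouped by input) and then the 9 hidden-to-output weights (grouped by hidden neuron):

```python
def CustomBPweight(TargetOutput, CurrentOutput, NeuronOutput, CurrentWeights, LearningRate):
    # Assumed layouts (not stated in the question):
    #   NeuronOutput   = [A, B, M, N, O]
    #   CurrentWeights = [A->M, A->N, A->O, B->M, B->N, B->O,
    #                     M->X, M->Y, M->Z, N->X, N->Y, N->Z, O->X, O->Y, O->Z]
    inputs = NeuronOutput[:2]
    hidden = NeuronOutput[2:]

    # Output-layer deltas: (current - target) * sigmoid'(output)
    delta_out = [(CurrentOutput[k] - TargetOutput[k])
                 * CurrentOutput[k] * (1 - CurrentOutput[k])
                 for k in range(3)]

    # Hidden-layer deltas: propagate output deltas back through hidden->output weights
    delta_hid = []
    for j in range(3):
        downstream = sum(delta_out[k] * CurrentWeights[6 + 3 * j + k] for k in range(3))
        delta_hid.append(downstream * hidden[j] * (1 - hidden[j]))

    UpdateWeights = []
    # Input->hidden weights: w -= LearningRate * delta_hidden * input_value
    for i in range(2):
        for j in range(3):
            w = CurrentWeights[3 * i + j]
            UpdateWeights.append(w - LearningRate * delta_hid[j] * inputs[i])
    # Hidden->output weights: w -= LearningRate * delta_output * hidden_output
    for j in range(3):
        for k in range(3):
            w = CurrentWeights[6 + 3 * j + k]
            UpdateWeights.append(w - LearningRate * delta_out[k] * hidden[j])
    return UpdateWeights
```

As a sanity check, when TargetOutput equals CurrentOutput the error is zero, every delta is zero, and the function returns the weights unchanged.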
answered by Dotixx (8.1k points)