Suppose you use Batch Gradient Descent to train a neural network and you plot the training error at every epoch. If you notice that the training error consistently goes up, what is likely going on? How can you fix this?

1 Answer


Answer:

The most likely cause is that the learning rate is too high.

Step-by-step explanation:

Batch Gradient Descent uses the entire training set for every update, so the training error should decrease at every epoch. If it consistently goes up instead, the learning rate is almost certainly too high: each step overshoots the minimum and the error diverges. The fix is to reduce the learning rate (and, if the error still climbs afterwards, to check for a bug such as corrupted data or a wrong gradient sign).
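
This effect is easy to reproduce. Below is a minimal sketch (NumPy, synthetic linear-regression data; the learning-rate values are made up for illustration) showing the same full-batch update diverging when the learning rate is too high and converging once it is lowered:

```python
import numpy as np

rng = np.random.default_rng(42)
X = np.c_[np.ones(100), rng.uniform(-1, 1, 100)]   # bias column + 1 feature
y = 4 + 3 * X[:, 1] + rng.normal(0, 0.1, 100)      # true weights: 4 and 3

def batch_gd(lr, epochs=50):
    """Full-batch gradient descent on MSE; returns the loss at each epoch."""
    w = np.zeros(2)
    losses = []
    for _ in range(epochs):
        error = X @ w - y
        losses.append(np.mean(error ** 2))          # training error this epoch
        w -= lr * (2 / len(X)) * X.T @ error        # one step uses ALL samples
    return losses

for lr in (2.5, 0.1):                               # too high vs. reasonable
    losses = batch_gd(lr)
    trend = "rises" if losses[-1] > losses[0] else "falls"
    print(f"lr={lr}: training error {trend} ({losses[0]:.3f} -> {losses[-1]:.3g})")
```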

  • Double-check that the curve you are plotting really is the training error; a rising validation error alongside a falling training error points to a different problem.
  • In that case the model is overfitting the training set, and the fix is not a smaller learning rate but early stopping: stop training once the validation error stops improving, as sketched below.
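
Below is a minimal early-stopping sketch for that second case (the patience value of 5 and the simple linear model are assumptions chosen so the snippet runs on its own): it keeps the weights from the epoch with the lowest validation error and stops once that error has not improved for `patience` consecutive epochs.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.c_[np.ones(30), rng.uniform(-1, 1, 30)]       # bias column + 1 feature
y = 4 + 3 * X[:, 1] + rng.normal(0, 0.5, 30)
X_train, y_train, X_val, y_val = X[:20], y[:20], X[20:], y[20:]

w = np.zeros(2)
best_w, best_val = w.copy(), np.inf
patience, bad_epochs = 5, 0
for epoch in range(1000):
    grad = (2 / len(X_train)) * X_train.T @ (X_train @ w - y_train)
    w -= 0.1 * grad                                  # full-batch step
    val_mse = np.mean((X_val @ w - y_val) ** 2)      # monitor validation error
    if val_mse < best_val:                           # new best: remember weights
        best_val, best_w, bad_epochs = val_mse, w.copy(), 0
    else:                                            # no improvement this epoch
        bad_epochs += 1
        if bad_epochs >= patience:
            print(f"stopping early at epoch {epoch}; best val MSE {best_val:.4f}")
            break
w = best_w                                           # roll back to the best model
```

In a real framework you would reach for the built-in equivalent (for example, Keras's EarlyStopping callback), but the logic is the same as above.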