Final answer:
Neural Networks place no limit on the number of hidden layers, can solve nonlinear problems, and do not always require an activation function in every layer. Statement A, however, is false: the number of neurons in each layer can significantly affect performance.
Step-by-step explanation:
When considering the true statements about Neural Networks, we can assess the given options:
- Performance and Neuron Count: It is not accurate to say that the performance of Neural Networks is unrelated to the number of neurons in each layer. The architecture, including the width (neurons per layer) and depth of the network, can significantly impact performance; too few neurons may underfit, while too many may overfit or waste computation.
- Hidden Layers: Neural Networks indeed do not limit the number of hidden layers. Theoretically, you can have as many hidden layers as needed for the complexity of the problem.
- Nonlinear Problems: Neural Networks are particularly known for their ability to solve nonlinear problems, which is one of their strengths in various complex domains such as image and speech recognition.
- Activation Functions: Activation functions are crucial in Neural Networks, as they introduce non-linearity; without them, stacked layers collapse into a single linear transformation. However, not every layer requires one: the input layer typically applies no activation, and the output layer of a regression network often uses a linear (identity) output.
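The points above can be illustrated with a tiny network that computes XOR, a classic nonlinear problem that no single linear layer can solve. This is a hand-built sketch with fixed, hand-picked weights and a step activation (not trained weights), chosen purely to show why a hidden layer plus a nonlinear activation is needed:

```python
def step(z):
    """Threshold activation: the non-linearity that makes XOR solvable."""
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    """A 2-2-1 network computing XOR with hand-picked weights.

    Without step(), each layer would be purely linear, and the whole
    network would collapse into one linear function, which cannot
    represent XOR.
    """
    # Hidden layer (just one here; nothing limits how many could be stacked)
    h1 = step(1.0 * x1 + 1.0 * x2 - 0.5)  # behaves like OR
    h2 = step(1.0 * x1 + 1.0 * x2 - 1.5)  # behaves like AND
    # Output layer: fires when OR is true but AND is not, i.e. XOR
    return step(1.0 * h1 - 1.0 * h2 - 0.5)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))
```

Running the loop prints the XOR truth table (0, 1, 1, 0), confirming that one hidden layer with a nonlinear activation suffices for this nonlinear problem.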
Therefore, the true statements about Neural Networks from the given options are:
- B. Neural Networks do not limit the number of hidden layers.
- C. Neural Networks solve nonlinear problems.
- D. Neural Networks do not always require activation functions in each layer.