DL - a subfield of ML that involves building and training artificial neural networks with many layers. The term "deep" refers to the depth of the network: stacking multiple layers of interconnected nodes lets the model learn complex patterns and relationships in the data (a minimal sketch of such a stack follows the list below).
- Difference from classic ML
- Pros and cons
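As a rough illustration of what "multiple layers" means in practice, here is a minimal sketch of a small deep network, assuming PyTorch; the layer sizes (784, 256, 64, 10) are arbitrary and stand in for a flattened 28x28 input and 10 classes.

```python
# Minimal sketch of a "deep" network: several fully connected layers
# stacked so the output of each layer feeds the next (assumes PyTorch).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),  # layer 1: 784 inputs -> 256 hidden units
    nn.ReLU(),
    nn.Linear(256, 64),   # layer 2: 256 -> 64
    nn.ReLU(),
    nn.Linear(64, 10),    # output layer: 64 -> 10 class scores
)

x = torch.randn(32, 784)  # a batch of 32 flattened 28x28 inputs
logits = model(x)         # shape: (32, 10)
print(logits.shape)
```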
FULLY CONNECTED LAYER
Each neuron is connected to every neuron in the previous layer. In other words, all the outputs of the previous layer are connected to the inputs of the current layer.
In a fully connected layer, each neuron computes a weighted sum of the inputs from the previous layer, adds a bias term, and applies an activation function to produce the output. The weights and biases of the neurons are learned during the training process, using techniques such as backpropagation.
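A minimal from-scratch sketch of that computation (assuming NumPy; the layer sizes and the ReLU activation are arbitrary choices for illustration):

```python
# One fully connected layer: weighted sum of the inputs, plus a bias,
# passed through an activation function (ReLU here). Assumes NumPy.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 4, 3

W = rng.standard_normal((n_in, n_out))  # weights, learned during training
b = np.zeros(n_out)                     # biases, learned during training

def fully_connected(x, W, b):
    z = x @ W + b            # weighted sum + bias for every output neuron
    return np.maximum(z, 0)  # activation function (ReLU)

x = rng.standard_normal((2, n_in))  # a batch of 2 input vectors
y = fully_connected(x, W, b)
print(y.shape)  # (2, 3): each output neuron sees every input
```

During training, backpropagation computes the gradients of the loss with respect to W and b, and an optimizer (e.g. gradient descent) updates them.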
Tricky questions about it:
- What is the difference between a fully connected layer and a convolutional layer?
- Why is the number of parameters in a fully connected layer often much larger than in a convolutional layer?
- Can a fully connected layer be used for image classification?
- How does the number of parameters in a fully connected layer relate to the number of input and output neurons? (see the worked check after this list)
- Is it possible to have a fully connected layer with a different number of input and output neurons?
- How does regularization affect the weights in a fully connected layer?
- What happens when the input to a fully connected layer is very large or very small compared to the weights?
- Can a fully connected layer be used in a recurrent neural network? If so, how?
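For the parameter-count question above, a quick worked check (a minimal sketch assuming PyTorch; the 128 -> 10 sizes are arbitrary): a fully connected layer with n_in inputs and n_out outputs has n_in * n_out weights plus n_out biases.

```python
# Parameter count of a fully connected layer = n_in * n_out weights + n_out biases.
# Assumes PyTorch; the sizes 128 and 10 are arbitrary.
import torch.nn as nn

n_in, n_out = 128, 10
layer = nn.Linear(n_in, n_out)

expected = n_in * n_out + n_out                      # 128 * 10 + 10 = 1290
actual = sum(p.numel() for p in layer.parameters())  # weight matrix + bias vector
print(expected, actual)  # 1290 1290
```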
ACTIVATION FUNCTIONS