Is there an intuitive explanation for why some neural networks have more than one fully connected layer?
I have searched online, but am still not satisfied with answers like this and this.
My intuition is that fully connected layers are completely linear. That means no matter how many FC layers are used, the expressiveness is always limited to linear combinations of the previous layer's outputs. But mathematically, a single FC layer should already be able to learn weights that produce exactly the same behavior. Then why do we need more? Did I miss something here?
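Spelling out the collapse described above (biases omitted for brevity; with biases the composition is still affine, so the same argument goes through): for weight matrices $W_1$ and $W_2$,

$$W_2 (W_1 x) = (W_2 W_1)\,x = W x, \qquad W := W_2 W_1,$$

so a single fully connected layer with weights $W$ can reproduce any stack of purely linear layers.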
There is a nonlinear activation function in between these fully connected layers, so the resulting function is not simply a linear combination of the nodes in the previous layer.
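A minimal NumPy sketch of the distinction (the layer sizes and the choice of ReLU are illustrative assumptions, not anything specified above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two fully connected layers, represented by their weight matrices.
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))
x = rng.standard_normal(3)

# Without an activation in between, the stack collapses to one layer:
# W2 @ (W1 @ x) equals (W2 @ W1) @ x for every input x.
stacked_linear = W2 @ (W1 @ x)
single_layer = (W2 @ W1) @ x
print(np.allclose(stacked_linear, single_layer))  # True

# With a ReLU in between, the composition is no longer linear, so no
# single weight matrix reproduces it for all inputs.
relu = lambda z: np.maximum(z, 0.0)
stacked_nonlinear = W2 @ relu(W1 @ x)
print(np.allclose(stacked_nonlinear, single_layer))  # False in general
```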