Right way to fine-tune: train a fully connected layer as a separate step

I'm fine-tuning CaffeNet and it works really well, but then I read this in a Keras blog entry on fine-tuning (they use a pre-trained VGG16 model):
"in order to perform fine-tuning, all layers should start with properly trained weights:
for instance you should not slap a randomly initialized fully-connected network on top of a pre-trained convolutional base.
This is because the large gradient updates triggered by the randomly initialized weights would wreck the learned weights in the convolutional base.
In our case this is why we first train the top-level classifier, and only then start fine-tuning convolutional weights alongside it."
So, as a separate step in fine-tuning, they save the output of the last layer before the fully connected layers (the "bottleneck features"), train a small fully-connected model on those features, and only then put the newly trained fully connected model on top of the whole net and train the last convolutional block alongside it.
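
For concreteness, here is a minimal Keras sketch of that two-step procedure, assuming a binary classification task with VGG16 as the convolutional base; `x_train`/`y_train` are hypothetical placeholder arrays, and the layer index used for freezing assumes VGG16 without its top:

```python
import numpy as np
from keras.applications import VGG16
from keras.models import Sequential, Model
from keras.layers import Flatten, Dense, Dropout
from keras.optimizers import SGD

# Hypothetical stand-in data; replace with your real dataset.
x_train = np.random.rand(32, 224, 224, 3).astype('float32')
y_train = np.random.randint(0, 2, size=(32, 1))

# Pre-trained convolutional base without the original classifier.
conv_base = VGG16(weights='imagenet', include_top=False,
                  input_shape=(224, 224, 3))

# Step 1: run the images through the frozen base once to get the
# "bottleneck features", then train a small fully-connected model on them.
bottleneck_features = conv_base.predict(x_train)

top_model = Sequential()
top_model.add(Flatten(input_shape=bottleneck_features.shape[1:]))
top_model.add(Dense(256, activation='relu'))
top_model.add(Dropout(0.5))
top_model.add(Dense(1, activation='sigmoid'))
top_model.compile(optimizer='rmsprop', loss='binary_crossentropy',
                  metrics=['accuracy'])
top_model.fit(bottleneck_features, y_train, epochs=10, batch_size=32)

# Step 2: put the now-trained top model on the conv base, freeze everything
# before the last convolutional block, and fine-tune the rest with a low
# learning rate so the pre-trained weights are only nudged, not wrecked.
model = Model(inputs=conv_base.input,
              outputs=top_model(conv_base.output))
for layer in conv_base.layers[:15]:   # layers before block5 in VGG16
    layer.trainable = False
model.compile(optimizer=SGD(lr=1e-4, momentum=0.9),
              loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=10, batch_size=32)
```

The key point the blog makes is visible in step 2: because the top model already has sensible weights by the time it is attached, its gradients are small and the last convolutional block can be fine-tuned without being wrecked.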