Jeremy did a more in-depth run-through of the ConvNet pipeline in Excel.
Highlights
- Architectures that make use of fully connected layers, like VGG, have a lot of weights, which makes them prone to overfitting and slow to train.
- The fast.ai library automatically switches into multi-label mode for multi-label classification problems by changing the final activation from softmax to sigmoid (see the sketch after this list).
- Using differential learning rates applies your array of learning rates across different layer groups, which are handled automatically by the fast.ai library. The last learning rate is applied to your fully connected layers, while the remaining n-1 learning rates are spread over the earlier layer groups in your network (see the fine-tuning sketch after this list).
- data.resize() reduces the input image sizes, which helps to speed up your training, especially if you're working with large images.
- You can define custom metrics within the fast.ai library
- When using a pre-trained network, you typically want to fine-tune your newly added fully connected layers first, before you unfreeze the pre-trained weights in your convolutional layers.
- learn.summary() shows you the model architecture summary
- add_datepart() expands a date column into useful features (year, month, day of week, and so on) when working with time series data (see the pandas sketch after this list).
- pandas' feather format is a really efficient way to dump a DataFrame to disk in binary form.
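To make the softmax-versus-sigmoid point concrete, here is a minimal PyTorch sketch (not the fast.ai internals): softmax forces the class probabilities to compete and sum to 1, which only makes sense when exactly one label is correct, while sigmoid scores each class independently, so several labels can be "on" at once.

```python
import torch

logits = torch.tensor([[2.0, 1.5, -3.0]])     # raw scores for 3 classes on one image

softmax_probs = torch.softmax(logits, dim=1)  # probabilities compete and sum to 1
sigmoid_probs = torch.sigmoid(logits)         # each class is judged on its own

print(softmax_probs)  # roughly [[0.62, 0.38, 0.004]] -- only one label can dominate
print(sigmoid_probs)  # roughly [[0.88, 0.82, 0.05]]  -- two labels can both be likely
```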
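The differential learning rate, custom metric, resize, and unfreeze highlights all fit into one fine-tuning workflow. The sketch below is written against the 0.7-era fast.ai API used in the course (ConvLearner, tfms_from_model, ImageClassifierData); the data path, folder and CSV names, learning rates, and the f2 threshold are assumptions for illustration, and newer fast.ai releases use different names.

```python
import numpy as np
from sklearn.metrics import fbeta_score
from fastai.conv_learner import *   # 0.7-era fast.ai import style

PATH = 'data/planet/'               # hypothetical data directory
sz = 64

def f2(preds, targs):
    # Custom metric: F-beta with beta=2 on thresholded predictions
    # (the 0.2 threshold is just an assumption for this sketch).
    return fbeta_score(targs, preds > 0.2, beta=2, average='samples')

tfms = tfms_from_model(resnet34, sz)
data = ImageClassifierData.from_csv(PATH, 'train-jpg', f'{PATH}train_v2.csv',
                                    tfms=tfms, suffix='.jpg')
data = data.resize(int(sz * 1.3), 'tmp')   # pre-shrink large images to speed up training

learn = ConvLearner.pretrained(resnet34, data, metrics=[f2])
learn.summary()                            # inspect the model architecture

# Step 1: train only the new fully connected head; the pre-trained conv layers stay frozen.
learn.fit(0.2, 3, cycle_len=1, cycle_mult=2)

# Step 2: unfreeze and train everything with differential learning rates, one per
# layer group: earliest conv layers get the smallest rate, the head gets the largest.
learn.unfreeze()
lrs = np.array([0.2 / 9, 0.2 / 3, 0.2])
learn.fit(lrs, 3, cycle_len=1, cycle_mult=2)
```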
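The last two highlights combine naturally in a short pandas sketch. add_datepart comes from the 0.7-era fastai.structured module (treat the exact import path as an assumption); to_feather and read_feather are standard pandas methods and need pyarrow installed. The DataFrame and file name below are made up for illustration.

```python
import pandas as pd
from fastai.structured import add_datepart   # assumption: 0.7-era fast.ai module path

df = pd.DataFrame({'saledate': pd.to_datetime(['2011-01-31', '2011-02-28']),
                   'price': [10000, 12000]})

# Expand the date column in place into features such as saleYear, saleMonth,
# saleDayofweek and saleElapsed, dropping the raw date column.
add_datepart(df, 'saledate')

# Dump the processed frame to disk in the fast binary feather format and read it back.
df.to_feather('bulldozers-processed.feather')
df = pd.read_feather('bulldozers-processed.feather')
```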
Some Useful links
- CurlWget chrome extension
- FileLink
- Intuitive Understanding of ConvNets by Otavio Good
- Entity Embeddings of Categorical Variables