Some interesting excerpts of this
…
As glimpses of meta-learning, I was especially fascinated by Ng’s lectures and labs for:
- Face Recognition, reusing a pre-trained model to ‘transfer’ its weights to a new application.
- Neural Style Transfer, tuning the cost function to balance content against style activations (a minimal cost sketch follows this list).
- Jazz Solo, tricking a trained model into generating a likely sequence by feeding each sampled output back in as the next input.
- Debiasing Word Vectors, detecting gender bias with analogies and correcting it by projection (see the second sketch after this list).
- Language Translation, enhanced by managing how much attention the model places on each input node.
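
To make the content/style balance concrete, here is a minimal NumPy sketch of the kind of total cost that lab optimizes. The activation names `a_C`, `a_S`, `a_G` and the default weights `alpha=10, beta=40` are illustrative assumptions, not the lab’s exact code:

```python
import numpy as np

def content_cost(a_C, a_G):
    # Mean squared difference between the content image's and the
    # generated image's activations at one chosen layer.
    return np.mean((a_C - a_G) ** 2)

def gram_matrix(a):
    # Channel-to-channel correlations at a layer capture "style".
    a = a.reshape(-1, a.shape[-1])   # (pixels, channels)
    return a.T @ a

def style_cost(a_S, a_G):
    return np.mean((gram_matrix(a_S) - gram_matrix(a_G)) ** 2)

def total_cost(a_C, a_S, a_G, alpha=10.0, beta=40.0):
    # Tuning alpha against beta is exactly the content/style balancing act.
    return alpha * content_cost(a_C, a_G) + beta * style_cost(a_S, a_G)

# e.g. with random 8x8x3 "activations":
a_C, a_S, a_G = (np.random.rand(8, 8, 3) for _ in range(3))
print(total_cost(a_C, a_S, a_G))
```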
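
And for the debiasing lab, the core ‘neutralize’ step is a one-line projection: remove the component of a word vector that lies along the bias direction. A toy sketch with made-up 4-d embeddings (hypothetical values; real GloVe vectors have 50 to 300 dimensions):

```python
import numpy as np

def neutralize(word_vec, bias_axis):
    # Subtract the projection of the embedding onto the bias direction
    # (e.g. e_woman - e_man), leaving the other components untouched.
    return word_vec - (word_vec @ bias_axis) / (bias_axis @ bias_axis) * bias_axis

# Hypothetical 4-d embeddings, for illustration only.
e_man   = np.array([ 1.0, 0.0, 0.2, 0.0])
e_woman = np.array([-1.0, 0.0, 0.2, 0.0])
bias_axis = e_woman - e_man
e_doctor  = np.array([0.3, 0.5, 0.1, 0.2])

print(neutralize(e_doctor, bias_axis) @ bias_axis)  # ~0.0: bias component removed
```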
My fascination has motivated me to learn about various meta-learning approaches, such as:
- AlphaGo Zero, demonstrating that a good simulation of the problem domain (like the game of Go) can be a surrogate for labeled data, generating its own training examples through self-play reinforcement learning.
- Capsule Networks (CapsNet), introduced by Hinton to correct flaws in image classifiers, such as pooling layers discarding the spatial relationships between features.
- CoDeepNEAT, optimizing a DNN’s topology, components, and hyperparameters via evolutionary (genetic) algorithms; a toy evolution loop is sketched below.
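
As a flavor of what ‘evolving’ hyperparameters means, here is a toy, mutation-only evolution loop. The config keys and the placeholder fitness are my own illustrative stand-ins, far simpler than CoDeepNEAT’s co-evolution of modules and blueprints:

```python
import random

def fitness(config):
    # Placeholder: in practice you would train a network built from
    # `config` and return its validation accuracy.
    return -abs(config["layers"] - 6) - abs(config["units"] - 128) / 32.0

def mutate(config):
    # Randomly perturb one hyperparameter of a copied config.
    child = dict(config)
    key = random.choice(list(child))
    child[key] = max(1, int(child[key] * random.uniform(0.5, 2.0)))
    return child

population = [{"layers": random.randint(1, 12),
               "units": random.choice([32, 64, 256, 512])}
              for _ in range(8)]

for generation in range(20):
    population.sort(key=fitness, reverse=True)
    survivors = population[:4]                    # keep the fittest half
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(4)]  # refill by mutation

print(max(population, key=fitness))               # best evolved configuration
```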
The insight is that one should eagerly explore the new secret sauce for DL: meta-learning.