Author: samuel.cheng@ou.edu
Pretty neat idea
https://vpr-norman.ou.edu/FY18-Federal-Research-Budgets
Interviews of deep learning icons by Andrew Ng (himself an icon, of course). Highly recommended, especially the interviews with the big three (Hinton, Bengio, LeCun).
Some interesting excerpts from this
…
As glimpses of meta-learning, I was especially fascinated by Ng's lectures and labs on:
- Face Recognition, reusing a pre-trained model by 'transferring' its weights to a new application (see the first sketch after this list).
- Neural Style Transfer, tuning the cost function to balance content activations against style activations (see the cost formula after this list).
- Jazz Solo, coaxing a pre-trained model into generating a likely sequence of input data.
- Debiasing Word Vectors, detecting and correcting gender bias with analogies.
- Language Translation, improving results by managing the attention placed on the network's nodes.
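For the face-recognition bullet above, here is a minimal transfer-learning sketch. It is my own illustration, not the actual lab code: I assume TensorFlow/Keras and use MobileNetV2 as a stand-in backbone (Ng's lab builds on a FaceNet-style network), with a hypothetical binary head.

```python
import tensorflow as tf

# Load a pre-trained backbone without its classification head and freeze it,
# so its 'transferred' weights are reused as-is on the new task.
base = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights="imagenet")
base.trainable = False

# Stack a small task-specific head; only these weights will be trained.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # hypothetical binary output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(new_task_dataset, epochs=5)  # new_task_dataset is assumed, not shown
```

Freezing the base means gradients only update the new head, which is why the transfer works even with little data for the new application.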
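And for the style-transfer bullet, the cost being balanced is the standard weighted sum from Gatys et al., as taught in Ng's course:

```latex
J(G) = \alpha \, J_{\text{content}}(C, G) + \beta \, J_{\text{style}}(S, G)
```

Here G is the generated image, J_content compares hidden-layer activations of G against the content image C, J_style compares Gram matrices of activations against the style image S, and the weights alpha and beta set the content-versus-style balance.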
My fascination has motivated me to learn about various meta-learning approaches, such as:
- AlphaGo Zero, demonstrating that a good simulation of the problem domain (like the game of Go) can be a surrogate for labeled data, generating its own training examples through self-play reinforcement learning.
- Capsule Networks (CapsNet), introduced by Hinton to correct flaws in conventional image classifiers.
- CoDeepNEAT, optimizing DNN topology, components, and hyperparameters via genetic evolution algorithms (a toy sketch follows this list).
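To make the last bullet concrete, here is a toy sketch of the evolutionary idea, heavily simplified from what CoDeepNEAT actually does (no co-evolution of modules and blueprints); the genome fields and fitness function are made up for illustration:

```python
import random

def fitness(genome):
    # Stand-in for training a DNN with these hyperparameters and returning
    # validation accuracy; a made-up score so the sketch runs on its own.
    return -abs(genome["lr"] - 0.01) - 0.01 * abs(genome["layers"] - 4)

def mutate(genome):
    # Randomly perturb a genome to produce a child.
    child = dict(genome)
    child["lr"] = max(1e-5, child["lr"] * random.uniform(0.5, 2.0))
    child["layers"] = max(1, child["layers"] + random.choice([-1, 0, 1]))
    return child

# Evolve a population of hyperparameter genomes: select the fittest, mutate.
population = [{"lr": random.uniform(1e-4, 1e-1), "layers": random.randint(1, 8)}
              for _ in range(20)]
for generation in range(30):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

print(max(population, key=fitness))  # best genome found
```

The real algorithm evolves network modules and assembly blueprints in separate populations, but the select-mutate-evaluate loop has the same shape.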
The insight is that one should eagerly explore the new secret sauce for DL — meta-learning.
A nice compilation
A nice summary