Deep Issues Lurking Under Deep Learning:

Some interesting excerpts from this:

As glimpses of meta-learning, I was especially fascinated by Ng’s lectures and labs on:

  • Face Recognition, reusing a pre-trained model by ‘transferring’ its weights to a new application (a minimal sketch follows this list).
  • Neural Style Transfer, tweaking the cost function to balance content activations against style activations (see the cost sketch after this list).
  • Jazz Solo, coaxing a pre-trained sequence model into generating a likely sequence by feeding its outputs back in as inputs.
  • Debiasing Word Vectors, detecting and correcting gender bias with analogies.
  • Language Translation, enhanced by an attention mechanism that controls how much weight is placed on each input node.
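
To make the transfer-learning point concrete, here is a minimal Keras sketch. The base network, input size, and new head are placeholder choices of mine, not the course lab’s exact setup; the point is simply that the pre-trained weights are frozen and only a small new head is trained for the new application.

    # Minimal transfer-learning sketch (illustrative; not the lab's exact code).
    import tensorflow as tf

    # Reuse a pre-trained backbone and freeze ('transfer') its weights.
    base = tf.keras.applications.MobileNetV2(
        input_shape=(160, 160, 3), include_top=False, weights="imagenet")
    base.trainable = False

    # Only this new head is trained for the new application.
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(10, activation="softmax"),  # 10 classes is a placeholder
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")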

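The style-transfer item boils down to one weighted cost: total cost = alpha * content cost + beta * style cost, both computed from activations of a pre-trained CNN. Below is a rough numpy sketch of that idea; the Gram-matrix style cost and the alpha/beta values are illustrative assumptions, not the lab’s exact code.

    import numpy as np

    def content_cost(a_content, a_generated):
        # How far the generated image's activations drift from the content image's
        return np.mean((a_content - a_generated) ** 2)

    def gram(a):
        # Style is captured by correlations between channels (Gram matrix)
        a = a.reshape(-1, a.shape[-1])        # (H*W, C)
        return a.T @ a / a.shape[0]

    def style_cost(a_style, a_generated):
        return np.mean((gram(a_style) - gram(a_generated)) ** 2)

    def total_cost(a_c, a_s, a_g, alpha=10.0, beta=40.0):
        # alpha/beta trade off looking like the content vs. adopting the style
        return alpha * content_cost(a_c, a_g) + beta * style_cost(a_s, a_g)
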
My fascination has motivated me to learn about various meta-learning approaches, such as:

  • AlphaGo Zero, demonstrating that a good simulation of the problem domain (like the game of Go) can be a surrogate for labeled data, with self-play generating its own training examples.
  • Capsule Networks (CapsNet), introduced by Hinton to correct flaws in image classifiers, such as CNNs’ difficulty in modeling spatial relationships between parts.
  • CoDeepNEAT, optimizing DNN topology, components, and hyperparameters via evolutionary algorithms (a toy sketch of the idea follows this list).
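
To give a flavor of the last item, here is a toy sketch of evolving hyperparameters. This is not CoDeepNEAT itself (which co-evolves network modules and blueprints); the config keys and the fitness function are assumptions purely for illustration.

    import random

    def random_config():
        return {"layers": random.choice([2, 3, 4]),
                "units": random.choice([32, 64, 128]),
                "lr": 10 ** random.uniform(-4, -2)}

    def mutate(cfg):
        # Copy a parent and perturb one of its hyperparameters
        child = dict(cfg)
        key = random.choice(list(child))
        child[key] = random_config()[key]
        return child

    def evolve(fitness, generations=10, pop_size=8):
        # 'fitness' is assumed to train/evaluate a model for a given config
        # and return a validation score (higher is better).
        population = [random_config() for _ in range(pop_size)]
        for _ in range(generations):
            parents = sorted(population, key=fitness, reverse=True)[: pop_size // 2]
            children = [mutate(random.choice(parents))
                        for _ in range(pop_size - len(parents))]
            population = parents + children
        return max(population, key=fitness)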

The insight is that one should eagerly explore the new secret sauce for DL — meta-learning.

YOLOv3

YOLOv3 is out. See paper here and code here. The YOLO paper series is always fun to read. 🙂 Some highlights are:

  • It is less accurate than RetinaNet, but 3.8 times faster
  • It is comparable to SSD, but 3 times faster
  • One interesting thing mentioned in the paper is that focal loss doesn’t seem to help (a minimal sketch of focal loss follows below)
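
For reference, here is a minimal numpy sketch of focal loss from the RetinaNet paper, the loss the YOLOv3 report says it tried without a clear gain. The gamma and alpha values below are the commonly cited defaults, not whatever settings the YOLOv3 authors used.

    import numpy as np

    def focal_loss(p, y, gamma=2.0, alpha=0.25):
        # p: predicted probability of the positive class, y: label in {0, 1}
        p_t = np.where(y == 1, p, 1.0 - p)
        alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
        # The (1 - p_t)^gamma factor down-weights easy, well-classified examples
        return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t + 1e-12)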