Knowledge Distillation for the Automotive Domain

Master Thesis

Motivation and Goals

With the increasing use of neural network models with sophisticated architectures to solve complex problems in the automotive industry, it becomes difficult to deploy these models directly on the vehicle's onboard computer.

With that in mind, the objective of this work is to explore techniques and strategies such as pruning, quantization, and knowledge distillation to reduce the size of the state-of-the-art models used to build virtual sensors, while maintaining the performance of the original model.
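For illustration only, the sketch below shows the basic knowledge distillation idea in PyTorch: a compact student network is trained against both the ground-truth labels and the softened predictions of a larger teacher. The model architectures, data, temperature, and weighting values are placeholder assumptions, not part of the thesis proposal itself.

    # Minimal knowledge distillation sketch (placeholder models and random data).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    teacher = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 4))  # large model (placeholder)
    student = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 4))    # compact model (placeholder)

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft targets: KL divergence between the softened teacher and student distributions.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        # Hard targets: standard cross-entropy against the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    # Toy training step, just to show how the loss is wired together.
    x = torch.randn(32, 16)
    y = torch.randint(0, 4, (32,))
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

    with torch.no_grad():
        teacher_logits = teacher(x)  # teacher predictions are treated as fixed targets
    loss = distillation_loss(student(x), teacher_logits, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

In the actual thesis, the teacher would be a state-of-the-art virtual sensor model and the student a model small enough for the vehicle's onboard computer.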

Requirements

  • Currently studying for a master's degree in informatics or an equivalent program.
  • Knowledge of deep learning models.
  • Experience with the Python programming language.
  • Familiarity with at least one deep learning framework, such as TensorFlow, Keras, or PyTorch.
  • An independent, structured, and responsible way of working.