This paper presents a novel knowledge-distillation-based model-compression framework built on a student ensemble. It distills simultaneously learnt ensemble knowledge onto each of the compressed student models. Each model learns …
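The ensemble scheme itself is not detailed in this excerpt, but the core distillation signal it builds on is the standard soft-target loss: a KL divergence between temperature-softened teacher and student distributions. A minimal NumPy sketch, with function names of my own choosing:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as is conventional in knowledge distillation."""
    p = softmax(teacher_logits, T)   # teacher's soft targets
    q = softmax(student_logits, T)   # student's predictions
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)
```

In an ensemble setting, each student would add this term (against the teacher or against the ensemble's averaged predictions) to its usual cross-entropy loss.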
Convolutional neural networks (CNNs) can learn robust representations under different regularization methods and activations, since convolutional layers are spatially correlated. Based on this property, a large variety of regional dropout …
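"Regional dropout" here refers to masking out contiguous spatial regions of the input rather than individual units; Cutout is the canonical example. A minimal sketch of that idea in NumPy (the function name and defaults are assumptions for illustration):

```python
import numpy as np

def cutout(image, size, rng=None):
    """Regional dropout: zero a random square patch of side ~`size`
    centred at a uniformly sampled pixel (Cutout-style augmentation)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    img = image.copy()                      # leave the input untouched
    h, w = img.shape[:2]
    cy, cx = rng.integers(0, h), rng.integers(0, w)
    y0, y1 = max(0, cy - size // 2), min(h, cy + size // 2)
    x0, x1 = max(0, cx - size // 2), min(w, cx + size // 2)
    img[y0:y1, x0:x1] = 0                   # patch may be clipped at borders
    return img
```

Because adjacent pixels are highly correlated, dropping whole regions forces the network to rely on broader context instead of any single local cue.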
Artificial Intelligence based iOS application for detecting driver drowsiness
Research on path forecasting for a self-driving vehicle using the CARLA simulator
Research on infusing an attention mechanism into combined deep audio-visual models