Date and time: Wednesday 16 June 2021, 1 pm Melbourne time (11 am Perth time).
Title: Are SVMs still of interest when meeting deep neural networks?
Deep neural networks (DNNs) have become a standard recipe for creating state-of-the-art solutions. As a result, models such as Support Vector Machines (SVMs) have been overlooked. While the results from DNNs are encouraging, DNNs come with a large number of parameters and the overhead of long training and inference times. SVMs have excellent properties such as convexity, good generalisation and efficiency. In this talk, I will highlight some of our recent techniques to enhance SVMs with an automatic pipeline that exploits the context of the learning problem. Experimental results show that SVMs with the pipeline are more efficient, while producing better results than the common usage of SVMs. Additionally, we conduct a case study of our solution on a popular sentiment analysis problem, the aspect term sentiment analysis (ATSA) task. The study shows that our SVM-based solution can achieve predictive accuracy competitive with DNN-based (and even most BERT-based) approaches. Furthermore, our solution is about 40 times faster in inference and has 100 times fewer parameters than the models using BERT.
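To make the comparison concrete, the snippet below is a minimal sketch of the kind of SVM text-classification setup the abstract refers to: TF-IDF features feeding a linear SVM. It is not the automatic pipeline from the talk, and the tiny dataset is hypothetical; it only illustrates why such models are cheap to train and run (convex objective, a single weight vector per class) compared with BERT-sized models.

```python
# Illustrative sketch only: a linear SVM sentiment classifier with scikit-learn.
# This is NOT the pipeline presented in the talk; the data is made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Tiny toy sentiment dataset (hypothetical examples).
texts = [
    "the battery life is great",
    "excellent screen and fast performance",
    "the keyboard feels terrible",
    "awful battery and noisy fan",
]
labels = ["positive", "positive", "negative", "negative"]

# TF-IDF features + linear SVM: convex training objective, fast inference,
# and far fewer parameters than a transformer-based model.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(texts, labels)

print(model.predict(["excellent screen and fast performance"])[0])
```

In practice the "context of the learning problem" mentioned in the abstract would drive choices such as the feature representation and hyperparameters automatically, rather than fixing them by hand as done here.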
Dr. Zeyi Wen is a Lecturer in Computer Science at The University of Western Australia (UWA). Before joining UWA, he was a research fellow at the National University of Singapore from 2017 to 2019 and at The University of Melbourne from 2015 to 2016, after completing his PhD at The University of Melbourne in 2015. Dr. Wen is a winner of the 2019 IEEE Transactions on Parallel and Distributed Systems (TPDS) Best Paper Award. His research work has also led to open-source machine learning systems including ThunderGBM and ThunderSVM. His areas of research include machine learning systems, high-performance computing and data mining.