Anant Agarwal

Graduate Student
North Carolina State University

"Stay Hungry, Saty Foolish" - Steve Jobs





Work Experience

VMware Inc., Palo Alto, USA

Member of Technical Staff, Intern, High Availability Team

Summer 2014

Project Description - Working with the High Availability team on its distributed virtual machine availability services for vSphere.

North Carolina State University

Teaching Assistant, Introduction to Artificial Intelligence (CSC 411)

Fall 2013

Description - Worked as a teaching assistant under Dr. Robert St. Amant for his undergraduate course "Introduction to Artificial Intelligence" (CSC 411).

Indian Institute of Technology, Delhi

Research Intern working with Dr. K. K. Biswas

Summer 2012

Project Description -

A Hybrid Random Fourier Features for Large Kernel Machines - Designed and implemented a modified version of the Localized Linear Support Vector Machine (SVM) that is faster and more accurate than standard libraries. The modified SVM first clusters the data using the k-NN and k-means clustering algorithms, then applies a Sequential Minimal Optimization (SMO) based SVM to each cluster, deriving a consensus SVM model that is later used for classification and prediction. Random Fourier Features were used for feature selection and optimization. In experiments against available libraries such as LIBSVM and LLSVM on standard benchmark datasets, the system reduced classification time by about 10-20% and achieved a marginal improvement in classification accuracy as well. [Report]
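A minimal sketch of the per-cluster SVM idea described above, written with scikit-learn purely for illustration: k-means partitions the training data, Random Fourier Features (RBFSampler) approximate the kernel, and a linear SVM is trained on each cluster, with test points routed to their nearest cluster's model. The parameter values, dataset, and fallback handling here are illustrative assumptions, not the configuration used in the project.

```python
# Illustrative sketch (not the original project code): per-cluster linear SVMs
# trained on Random Fourier Features, with k-means routing at prediction time.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.kernel_approximation import RBFSampler
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Approximate the RBF kernel with Random Fourier Features (gamma is an assumption).
rff = RBFSampler(gamma=0.1, n_components=200, random_state=0)
Z_train = rff.fit_transform(X_train)
Z_test = rff.transform(X_test)

# Partition the training data into clusters (k chosen arbitrarily here).
k = 4
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_train)
cluster_train = kmeans.labels_
cluster_test = kmeans.predict(X_test)

# Train one linear SVM per cluster on the RFF-transformed data.
models = {}
for c in range(k):
    idx = np.where(cluster_train == c)[0]
    if len(np.unique(y_train[idx])) < 2:
        # Degenerate cluster with a single class: predict its majority class.
        models[c] = int(np.bincount(y_train[idx]).argmax())
        continue
    models[c] = LinearSVC(C=1.0).fit(Z_train[idx], y_train[idx])

# Predict by routing each test point to its nearest cluster's model.
y_pred = np.empty_like(y_test)
for c in range(k):
    idx = np.where(cluster_test == c)[0]
    if len(idx) == 0:
        continue
    m = models[c]
    y_pred[idx] = m if isinstance(m, int) else m.predict(Z_test[idx])

print("accuracy:", (y_pred == y_test).mean())
```

Training many small linear SVMs on RFF-transformed clusters is what makes this kind of approach faster than a single kernel SVM over the full dataset, since each subproblem is small and linear.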

Jaypee Institute of Information Technology, India

Teaching Assistant, Data Structures Lab

Spring 2013

Description - Worked as a teaching assistant under Dr. Manish Thakur for his undergraduate lab course "Data Structures".

Indian Institute of Technology, Delhi

Research Intern working with Dr. K. K. Biswas

Summer 2011

Project Description -

Implementing Sequential Minimal Optimization (SMO) using a Hash Table - Designed and implemented the Sequential Minimal Optimization (SMO) algorithm for training SVMs using a hash table, showing improved results over an array-based implementation. The modified SMO-based SVM was evaluated on standard datasets against libraries such as LIBSVM, showing around 30% improvement in classification time. [Report]
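A rough sketch of the hash-table idea, assuming a simplified (Platt-style) SMO in which the alpha coefficients live in a Python dict acting as the hash table, so only non-zero entries, i.e. the support vectors, are stored and scanned during updates and prediction. The function names, constants, and toy data are illustrative assumptions, not the project's actual code or setup.

```python
# Illustrative sketch (not the original code): simplified SMO with the alpha
# coefficients kept in a dict (hash table) so only non-zero entries are stored.
import random
import numpy as np

def smo_train(X, y, C=1.0, tol=1e-3, max_passes=5, seed=0):
    rng = random.Random(seed)
    n = len(y)
    alphas = {}   # hash table: sample index -> alpha (non-zero entries only)
    b = 0.0

    def f(i):
        # Decision value; the sum runs only over stored (non-zero) alphas.
        return sum(a * y[j] * float(X[j] @ X[i]) for j, a in alphas.items()) + b

    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            a_i = alphas.get(i, 0.0)
            E_i = f(i) - y[i]
            if (y[i] * E_i < -tol and a_i < C) or (y[i] * E_i > tol and a_i > 0):
                j = rng.randrange(n - 1)
                if j >= i:
                    j += 1
                a_j = alphas.get(j, 0.0)
                E_j = f(j) - y[j]
                old_i, old_j = a_i, a_j
                if y[i] != y[j]:
                    L, H = max(0.0, a_j - a_i), min(C, C + a_j - a_i)
                else:
                    L, H = max(0.0, a_i + a_j - C), min(C, a_i + a_j)
                if L == H:
                    continue
                k_ii = float(X[i] @ X[i])
                k_jj = float(X[j] @ X[j])
                k_ij = float(X[i] @ X[j])
                eta = 2.0 * k_ij - k_ii - k_jj
                if eta >= 0:
                    continue
                a_j = min(H, max(L, a_j - y[j] * (E_i - E_j) / eta))
                if abs(a_j - old_j) < 1e-5:
                    continue
                a_i = a_i + y[i] * y[j] * (old_j - a_j)
                b1 = b - E_i - y[i] * (a_i - old_i) * k_ii - y[j] * (a_j - old_j) * k_ij
                b2 = b - E_j - y[i] * (a_i - old_i) * k_ij - y[j] * (a_j - old_j) * k_jj
                b = b1 if 0 < a_i < C else b2 if 0 < a_j < C else (b1 + b2) / 2.0
                # Write back, dropping zero entries to keep the table sparse.
                for idx, val in ((i, a_i), (j, a_j)):
                    if val > 1e-12:
                        alphas[idx] = val
                    else:
                        alphas.pop(idx, None)
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alphas, b

def smo_predict(X_train, y_train, alphas, b, x):
    s = sum(a * y_train[j] * float(X_train[j] @ x) for j, a in alphas.items()) + b
    return 1 if s >= 0 else -1

# Tiny usage example on toy data with labels in {-1, +1}.
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
alphas, b = smo_train(X, y)
print([smo_predict(X, y, alphas, b, x) for x in X])
```

Since the alpha vector of a trained SVM is typically sparse, storing only the non-zero coefficients in a hash table shrinks the inner sums in both training and prediction, which is the kind of saving the timing comparison above measures.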