Hello, World! I am Vinaykumar, a graduate student in the Electrical Engineering Department at the University of Southern California, Los Angeles.
My graduate study is concentrated on Robotics and Machine Learning. I am interested in using computer vision and SLAM for mobile robot navigation. I am currently exploring techniques that combine low-cost IR and ultrasonic sensors with commercial RGB-D sensors to make more sense of a robot's environment.
I like to spend my free time in the mountains; I enjoy hiking and cycling.
My Latest Resume
Courses at USC
- CSCI545: Robotics
- CSCI574: Computer Vision
- CSCI599: Coordinated Mobile Robotics (Audit)
- CSCI590: Directed Research on Mobile Robot Navigation
- EE660: Machine Learning from Signals: Foundations and Methods
- EE559: Mathematical Pattern Recognition
- EE585: Linear Systems Theory
- EE503: Probability for Electrical and Computer Engineers
- EE441: Applied Linear Algebra for Engineers
- EE590: Directed Research on Control of Nano Robots

Online (MOOC)
- Coursera: Machine Learning
- Coursera: Design and Analysis of Algorithms 1
- EdX: Scalable Machine Learning
- EdX: Autonomous Mobile Robots (Current)
- 1) Directed Research on Mobile Robot Navigation
I am currently working under Prof. Laurent Itti at iLab as a Graduate Directed Researcher. My goal is to use computer vision and SLAM techniques for indoor mobile robot navigation.
- 2) Directed Research on Control of Nano Robots (Research Advisor: Prof. Edmond Jonckheere)
Complete report of my directed research:
- CIFAR-10: Image Classification [code]
Compared image classification techniques on the CIFAR-10 dataset (SVM, AdaBoost, and neural networks). Achieved 78% accuracy using a CNN with Keras and Theano.
Keywords: Keras, Theano, IPython Parallel, StarCluster, AWS, HOG, BOF
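The classical baselines in the comparison above can be sketched with scikit-learn. A minimal, hypothetical example on synthetic feature vectors standing in for HOG/BOF descriptors (loading CIFAR-10 itself and the Keras CNN are omitted here):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for extracted image features (e.g. HOG/BOF vectors).
X, y = make_classification(n_samples=600, n_features=64, n_informative=20,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Two of the baselines compared in the project: an RBF-kernel SVM and AdaBoost.
models = {
    "SVM": SVC(kernel="rbf", gamma="scale"),
    "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=0),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

The same train/test split is reused for both models so the accuracies are directly comparable.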
- Heterogeneous Coordinated Robots For Navigation
Navigated a ground robot (TurtleBot) using a live image feed from an AR.Drone quadcopter. Implemented the idea in ROS and Gazebo.
- Human Activity Recognition From Inertial Sensor Data
Used pattern recognition and machine learning techniques to predict human activities such as walking, standing, sitting, and lying down, based on the smartphone inertial sensor dataset from the UCI repository.
Tools used: Python, scikit-learn, NumPy
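A minimal sketch of this kind of activity classifier with scikit-learn, assuming synthetic windowed accelerometer features in place of the real UCI dataset:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
ACTIVITIES = ["walking", "standing", "sitting", "lying"]

def make_windows(n_per_class=50, n_features=6):
    """Synthetic stand-in for per-window inertial features: each activity
    gets a distinct mean feature vector plus Gaussian noise."""
    X, y = [], []
    for label in range(len(ACTIVITIES)):
        centre = np.full(n_features, 2.0 * label)
        X.append(centre + 0.3 * rng.standard_normal((n_per_class, n_features)))
        y += [label] * n_per_class
    return np.vstack(X), np.array(y)

X, y = make_windows()

# Standardise the features, then classify each window by its neighbours.
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
clf.fit(X, y)
pred = ACTIVITIES[clf.predict(X[:1])[0]]
print(pred)
```

With real data the features would be statistics (mean, variance, FFT energy, etc.) computed over short sliding windows of the accelerometer and gyroscope signals.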
- Balancing Nao Robot On One Leg [video]
Used inverse kinematics, minimum-jerk/cubic-spline trajectory control, and COG Jacobian techniques to balance the Nao robot on a single leg. Implemented the idea in the SL simulator.
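The minimum-jerk trajectory used for moves like this has a standard closed form; a generic sketch (not the SL-specific implementation):

```python
def min_jerk(x0, xf, T, t):
    """Minimum-jerk position at time t for a move from x0 to xf over T seconds.

    Uses the standard fifth-order polynomial 10*s^3 - 15*s^4 + 6*s^5, which
    gives zero velocity and acceleration at both endpoints.
    """
    s = min(max(t / T, 0.0), 1.0)            # normalised time in [0, 1]
    blend = 10 * s**3 - 15 * s**4 + 6 * s**5
    return x0 + (xf - x0) * blend

# Example: sweep a joint angle from 0 to 0.4 rad over 2 s in 10 ms steps.
trajectory = [min_jerk(0.0, 0.4, 2.0, 0.01 * k) for k in range(201)]
```

Each joint target along the trajectory would then be converted to joint commands via inverse kinematics while the COG Jacobian keeps the centre of gravity over the support foot.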
- Localization experiments with iRobot Create
Worked on ROS and localization experiments with the iRobot Create 1 mobile robot platform. Added support for the Razor 9-DOF IMU and tested localization in an indoor environment.
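One common way to fold a 9-DOF IMU into experiments like these is a complementary filter that fuses gyro and accelerometer readings. A generic single-axis sketch (an illustrative assumption, not the specific ROS driver used in the project):

```python
import math

def complementary_pitch(pitch, gyro_rate, ax, az, dt, alpha=0.98):
    """One complementary-filter update for pitch (radians).

    Integrates the gyro rate for short-term accuracy and blends in the
    accelerometer's gravity-based pitch estimate to cancel gyro drift.
    ax/az are accelerometer readings along the body x and z axes.
    """
    accel_pitch = math.atan2(ax, az)         # pitch implied by the gravity vector
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch
```

With alpha near 1 the gyro dominates over short horizons, while the small accelerometer term slowly pulls the estimate back toward the true gravity direction.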