Chris King
Research Interests
​
Computer Vision, Robotics, Artificial Intelligence, Neural Networks, Genetic Algorithms
​
​
Experience
Research Assistant / University of Nevada – Reno, NV
Led projects developing robots that autonomously navigate through their environment and interact with humans in real time:
- Designed multiple systems from the ground up, including: a real-time visual tracking system; an object identification module; control algorithms for obstacle avoidance; an algorithm that allowed robots to autonomously pursue humans; a GUI for issuing requests to robots; and software to display results using charts, graphs, and videos.
- Constructed a neural network, evolved with a genetic algorithm, that allowed simulated robots to autonomously learn basic control and obstacle avoidance, as well as search, pursuit, and evasion behaviors.
- Contributed to and authored related publications.
Group Home Facilitator / Manager / Handyman / The Tungland Corporation – Flagstaff, AZ
Provided care, transportation, and support to developmentally disabled adults in a group home setting:
- Managed staff, coordinated appointments, maintained daily records, and submitted reports.
- Assisted in home and vehicle maintenance and repair.
- Ensured residents were intellectually stimulated, physically active, and well cared for.
Intern / Intel – Phoenix, AZ
Developed a storage-device controller emulator used for testing and debugging storage-device drivers:
- Authored project documents including: Requirements, High-Level Design, Low-Level Design, and Test Plans.
- Used C++ to code a Windows-based prototype of the emulator.
- Helped port the emulation code to a Linux-based development board.
- Scripted test cases using Perl and helped debug the emulator.
Undergraduate Teaching Assistant / Arizona State University – Tempe, AZ
Assisted in a course on "Microprocessor System Design":
- Graded weekly homework assignments for 40–65 students.
- Assisted students with assembly programming, hardware design, and general course concepts.
- Tested and helped debug weekly student programming exercises.
Research Assistant / Florida Atlantic University – Boca Raton, FL
Used EEG and fMRI to conduct experiments investigating the neural pathways involved in processing visual information:
- Installed, modified, and maintained EEG laboratory equipment.
- Coded stimulus presentation software.
- Wrote programs in C and MATLAB to filter, edit, process, and display EEG data.
Research Assistant / Northern Arizona University – Flagstaff, AZ
Coded stimulus presentation programs for a visual perception laboratory.
​
Education
Doctor of Philosophy, Computer Science and Engineering
University of Nevada – Reno, NV
Advisors: Mircea Nicolescu & Monica Nicolescu
Dissertation: Efficient Object Detection and Tracking Using a Novel MSER-Based Approach
GPA: 3.85/4
Master of Science, Computer Science and Engineering
University of Nevada – Reno, NV
Advisors: Mircea Nicolescu & Monica Nicolescu
Thesis: Vision and Laser-Based Perception for Real-Time Autonomous Robotic Applications
GPA: 3.82/4
​
Bachelor of Science, Computer Science and Engineering
Arizona State University – Phoenix, AZ
Advisor: David Pheanis
GPA: 3.51/4
​
Bachelor of Science, Psychology with an emphasis in Biology
Northern Arizona University – Flagstaff, AZ
Advisor: Kathy Eastwood
GPA: 3.34/4
​
​
Funding Sources
National Science Foundation Award IIS-0546876
Office of Naval Research Award N00014-06-1-0611
Consortium for Embedded and Inter-Networking Technologies Scholarship
​
​
Website
http://number3114.wix.com/projects
​
​
Publications
Espina, M., Grech, R., Jager, D., Remagnino, P., Iocchi, L., Marchetti, L., Nardi, D., Monekosso, D., Nicolescu, M., & King, C. (2011). Multi-Robot Teams for Environmental Monitoring. Innovations in Defence Support Systems – Intelligent Paradigms in Security, Springer-Verlag, 183-209.
Kelley, R., King, C., Ambardekar, A., Nicolescu, M., Nicolescu, M., & Tavakkoli, A. (2010). Integrating Context into Intent Recognition Systems. Proceedings of the International Conference on Informatics in Control, Automation and Robotics, (pp. 315-320). Funchal, Madeira, Portugal.
Kelley, R., King, C., Tavakkoli, A., Nicolescu, M., Nicolescu, M., & Bebis, G. (2008). An Architecture for Understanding Intent Using a Novel Hidden Markov Formulation. International Journal of Humanoid Robotics - Special Issue on Cognitive Humanoid Robots, 5(2), 203-224.
Kelley, R., Tavakkoli, A., King, C., Ambardekar, A., Nicolescu, M., & Nicolescu, M. (2012). Context-Based Bayesian Intent Recognition. IEEE Transactions on Autonomous Mental Development - Special Issue on Biologically-Inspired Human-Robot Interactions, 4(3), 215-225.
Kelley, R., Tavakkoli, A., King, C., Ambardekar, A., Wigand, L., Nicolescu, M., et al. (2013). Intent Recognition for Human-Robot Interaction. Plan, Activity, and Intent Recognition, Elsevier.
Kelley, R., Tavakkoli, A., King, C., Nicolescu, M., & Nicolescu, M. (2010). Understanding Activities and Intentions for Human-Robot Interaction. Human-Robot Interaction, 288-305.
Kelley, R., Tavakkoli, A., King, C., Nicolescu, M., Nicolescu, M., & Bebis, G. (2008). Understanding Human Intentions via Hidden Markov Models in Autonomous Mobile Robots. Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, (pp. 367-374). Amsterdam, Netherlands.
King, C., Palathingal, X., Nicolescu, M., & Nicolescu, M. (2007). A Vision-Based Architecture for Long-Term Human-Robot Interaction. Proceedings of the International Conference on Human-Computer Interaction, (pp. 1-6). Chamonix, France.
King, C., Palathingal, X., Nicolescu, M., & Nicolescu, M. (2007). A Control Architecture for Long-Term Autonomy of Robotic Assistants. Proceedings of the International Symposium on Visual Computing, (pp. 375-384). Lake Tahoe, Nevada.
King, C., Palathingal, X., Nicolescu, M., & Nicolescu, M. (2009). A Flexible Control Architecture for Extended Autonomy of Robotic Assistants. Journal of Physical Agents, 3(2), 59-69.
King, C., Valera, M., Grech, R., Mullen, R., Remagnino, P., Iocchi, L., et al. (2012). Multi-Robot and Multi-Camera Patrolling. Handbook on Soft Computing for Video Surveillance, 255–286.
Tavakkoli, A., Kelley, R., King, C., Nicolescu, M., Nicolescu, M., & Bebis, G. (2007). A Vision-Based Architecture for Intent Recognition. Proceedings of the International Symposium on Visual Computing, (pp. 173-182). Lake Tahoe, Nevada.
Tavakkoli, A., Kelley, R., King, C., Nicolescu, M., Nicolescu, M., & Bebis, G. (2008). A Visual Tracking Framework for Intent Recognition in Videos. Proceedings of the International Symposium on Visual Computing, (pp. 450-459). Las Vegas, Nevada.