Robotic Neck with Kinect

This was part of a larger project at the UTA Research Institute to develop a humanoid robot for use as a conduit for interaction between therapists and autistic children. The application requires movement that is as close to human motion as possible.

As a visiting summer scholar I worked with Nahum Torres to program servo motors to respond to, and mirror, the live movement of a human neck. The motors were mounted on a testing platform to replicate the degrees of freedom of a human neck, positioned for pitch, roll, and yaw respectively. The motors were driven by the user, whose head movements were captured in real time by the Kinect.
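The core of the mapping above is converting each tracked head angle (pitch, roll, yaw) into a servo goal position. The original system was built in LabVIEW; the following Python sketch is only an illustration of that conversion, assuming Dynamixel AX-12-style servos (1024 position units spanning 300 degrees, centered at 512) and hypothetical joint limits.

```python
# Hypothetical sketch: map head orientation angles to Dynamixel
# AX-12-style goal positions. Constants and joint limits here are
# illustrative assumptions, not the project's actual configuration.

AX12_CENTER = 512                   # servo units at 0 degrees
AX12_UNITS_PER_DEG = 1023 / 300.0   # 1024 steps over a 300-degree range


def angle_to_goal_position(angle_deg, min_deg=-60.0, max_deg=60.0):
    """Clamp a joint angle to a safe range and convert to servo units."""
    clamped = max(min_deg, min(max_deg, angle_deg))
    return int(round(AX12_CENTER + clamped * AX12_UNITS_PER_DEG))


def head_pose_to_goals(pitch, roll, yaw):
    """Return one goal position per neck servo (pitch, roll, yaw)."""
    return {
        "pitch": angle_to_goal_position(pitch),
        "roll": angle_to_goal_position(roll),
        # Yaw typically has a wider range of motion than pitch or roll.
        "yaw": angle_to_goal_position(yaw, min_deg=-90.0, max_deg=90.0),
    }


if __name__ == "__main__":
    print(head_pose_to_goals(0.0, 0.0, 0.0))  # all three servos centered at 512
```

In the real pipeline, each frame of Kinect skeleton data would produce a fresh set of angles, and the computed goal positions would be streamed to the servos to keep the platform mirroring the user's motion.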

This project was implemented using LabVIEW, a Kinect sensor, and Dynamixel servos.