Verbal Chess using Computer Vision with the Baxter Research Robot
Thesis posted on 28.04.2018 by Zephaniah Connell, Connor Desmond, Ryan Cook
The Baxter robotic system is a highly sophisticated piece of machinery, equipped with a myriad of sensors and features. To date, little research utilizing Baxter has been carried out by students or faculty in the Department of Electrical and Computer Engineering at the University of Wyoming. This project provides a base that will enable future use of Baxter for more intricate and advanced research topics. It was designed to showcase a large portion of Baxter's functionality in an easily digestible and potentially expandable format: a convenient form of user interaction (voice commands), the use of computer vision (detecting the chess board and pieces), and safe, precise physical interaction with or near humans (moving chess pieces).

The goal of this project is to enable the Baxter robotic system to move chess pieces on a chess board based on user input in the form of voice commands. The work breaks down into four main parts: physical movement of Baxter's appendages, computer vision to locate the board and chess pieces, voice recognition for the necessary set of commands, and internal chess board state tracking and chess logic.
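The fourth part, internal board state tracking, could be sketched along the following lines. This is a hypothetical illustration only; the function names, the two-character piece encoding, and the (row, column) layout are assumptions, not the authors' actual implementation.

```python
# Illustrative sketch of an internal chess board state component.
# All names and conventions here are hypothetical, not taken from the thesis.

def initial_board():
    """Return an 8x8 board with the standard chess starting placement.

    Each square holds a two-character code ('wP' = white pawn,
    'bK' = black king, ...) or None for an empty square.
    Row 0 is rank 1 (white's back rank); column 0 is file 'a'.
    """
    back = ["R", "N", "B", "Q", "K", "B", "N", "R"]
    board = [[None] * 8 for _ in range(8)]
    board[0] = ["w" + p for p in back]   # white back rank
    board[1] = ["wP"] * 8                # white pawns
    board[6] = ["bP"] * 8                # black pawns
    board[7] = ["b" + p for p in back]   # black back rank
    return board

def square_to_index(square):
    """Convert algebraic notation like 'e2' to (row, col) indices."""
    file_char, rank_char = square[0], square[1]
    return int(rank_char) - 1, ord(file_char) - ord("a")

def apply_move(board, src, dst):
    """Move the piece on src to dst, returning any captured piece code."""
    r1, c1 = square_to_index(src)
    r2, c2 = square_to_index(dst)
    captured = board[r2][c2]
    board[r2][c2] = board[r1][c1]
    board[r1][c1] = None
    return captured
```

In this sketch, a recognized voice command such as "pawn e2 to e4" would be resolved to source and destination squares, applied via `apply_move`, and the resulting coordinates handed to the arm-motion and vision components.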