Machine Learning Datasets and Algorithms through "Duckietown" Vision Systems
Session Number
Project ID: CMPS 6
Advisor(s)
Dr. Matthew Walter; Toyota Technological Institute at Chicago, Robot Intelligence through Perception Lab
Discipline
Computer Science
Start Date
22-4-2020 10:05 AM
End Date
22-4-2020 10:20 AM
Abstract
To navigate their environments safely, robots must use a variety of sensory inputs to determine the proper actions to take. Advanced perception algorithms allow robots to analyze multi-modal observations of their environment. The Duckietown project simulates this artificial intelligence process on a simpler, more accessible platform. The platform uses a tangible robot, the Duckiebot, consisting of a Raspberry Pi and PiCamera mounted on a chassis. The other primary component of the project is the Duckietowns themselves: environments in which multiple Duckiebots attempt to traverse a cityscape. I work to better understand the software platform created for Duckietown, initializing and controlling the robot with Python on Ubuntu Linux. We rely heavily on ROS (Robot Operating System) and Docker (a container platform). The Duckietown project was conceived as an MIT graduate class in 2016 and has since grown into a worldwide multi-university program. I have been using this platform to create an image dataset. Using other datasets from Kaggle, a machine learning and data science community, I have been developing image processing algorithms to recognize patterns. This project is building toward incorporating the image recognition system into a Duckiebot and running it live.
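The sketch below illustrates the kind of dataset collection this workflow enables: a small ROS node, written in Python, that subscribes to the Duckiebot's PiCamera stream and saves frames to disk. The topic name follows the usual Duckietown convention (/<vehicle>/camera_node/image/compressed); the vehicle name "duckiebot" and the output folder are assumptions for illustration, not the project's exact configuration.

```python
#!/usr/bin/env python
# Minimal sketch: save camera frames from a Duckiebot into an image dataset.
# The vehicle name "duckiebot" and the output folder are placeholders.
import os
import rospy
import numpy as np
import cv2
from sensor_msgs.msg import CompressedImage

OUT_DIR = "dataset"  # hypothetical output folder for collected frames
TOPIC = "/duckiebot/camera_node/image/compressed"  # assumed vehicle name

class FrameSaver(object):
    def __init__(self):
        self.count = 0
        if not os.path.isdir(OUT_DIR):
            os.makedirs(OUT_DIR)
        rospy.Subscriber(TOPIC, CompressedImage, self.on_image, queue_size=1)

    def on_image(self, msg):
        # Decode the JPEG payload published by the camera node into a BGR array.
        frame = cv2.imdecode(np.frombuffer(msg.data, dtype=np.uint8), cv2.IMREAD_COLOR)
        if frame is None:
            return
        cv2.imwrite(os.path.join(OUT_DIR, "frame_%06d.jpg" % self.count), frame)
        self.count += 1

if __name__ == "__main__":
    rospy.init_node("dataset_collector")
    FrameSaver()
    rospy.spin()  # keep processing incoming images until the node shuts down
```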
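For the pattern-recognition side, a small convolutional network trained on images organized into class subfolders (as Kaggle datasets commonly are) is one simple baseline. The paths, image size, and number of classes below are placeholders; this is an illustrative sketch, not the project's actual model.

```python
# Minimal sketch: train a small CNN on an image dataset laid out in
# class subfolders, e.g. data/train/<class_name>/*.jpg from Kaggle.
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.preprocessing.image import ImageDataGenerator

IMG_SIZE = (64, 64)
NUM_CLASSES = 2  # assumed binary task for illustration

# Rescale pixel values and stream labeled batches from the folder layout.
train_gen = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
    "data/train", target_size=IMG_SIZE, batch_size=32, class_mode="categorical")

model = keras.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=IMG_SIZE + (3,)),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(train_gen, epochs=5)  # short training run; tune epochs and architecture as needed
```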