Event Title

Machine Learning in Autonomous Driving Simulated through Duckietown Platform

Advisor(s)

Dr. Matthew Walter, Robot Intelligence through Perception Lab, Toyota Technological Institute at Chicago

Location

Room B108-2

Start Date

26-4-2019 10:25 AM

End Date

26-4-2019 10:40 AM

Abstract

In order for robots to safely navigate their environments, they must be able to use a variety of sensory inputs to determine which actions to take. Advanced perception algorithms let robots analyze multi-modal observations of their surroundings. The Duckietown project simulates this artificial-intelligence process on a simpler, more accessible platform. The platform uses a tangible robot, the Duckiebot, consisting of a Raspberry Pi and PiCamera mounted on a chassis. The other primary aspect of the project is the Duckietowns themselves: model cityscapes that many Duckiebots must traverse together. I work to better understand the software platform built for Duckietown, initializing and controlling the robot with Python on Ubuntu Linux. We rely heavily on ROS (the Robot Operating System) and Docker (a container platform). The Duckietown Project was conceived as an MIT graduate class in 2016 and has since grown into a worldwide, multi-university program. I am also working to make the project accessible to younger children by compartmentalizing the software into a block-based system akin to programming languages like Scratch and Blockly. This project is a large undertaking that is still in progress, encountering and solving problems such as providing proper calibration for a simpler system.
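The calibration problem mentioned above comes down to differential-drive kinematics: mapping a desired body velocity to left and right wheel rates, with per-robot correction factors. The sketch below illustrates that idea in Python. The parameter names (gain, trim, baseline, radius) follow Duckietown's conventions, but the exact scaling is a simplified assumption for illustration, not the project's actual wheel-command code.

```python
def wheel_commands(v, omega, gain=1.0, trim=0.0,
                   baseline=0.1, radius=0.0318):
    """Map a body velocity (v in m/s, omega in rad/s) to wheel rates.

    gain scales both motors; trim compensates for a left/right motor
    mismatch -- the kind of per-robot calibration the abstract refers to.
    """
    # Angular rate each wheel must spin at to achieve (v, omega).
    omega_r = (v + 0.5 * omega * baseline) / radius
    omega_l = (v - 0.5 * omega * baseline) / radius

    # Apply calibration: gain scales both wheels, trim biases one side.
    u_l = (gain - trim) * omega_l
    u_r = (gain + trim) * omega_r
    return u_l, u_r

# Driving straight with a well-calibrated robot: equal wheel rates.
left, right = wheel_commands(v=0.2, omega=0.0)
```

A robot that drifts to one side when commanded to drive straight is corrected by adjusting trim, which is exactly the kind of tuning a simplified, child-friendly version of the platform would need to hide from its users.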
