Designed as an engaging way to get started with reinforcement learning (RL), the AWS DeepRacer is offered as the fastest way to get started with machine learning (ML). Users can train RL models with the robot vehicle in a cloud-based virtual simulator and compete for prizes in the global AWS DeepRacer League.
Now, says the company, it's expanding AWS DeepRacer's ability to provide fun, hands-on learning by open-sourcing the AWS DeepRacer device software.
"Because the AWS DeepRacer is an Ubuntu-based computer on wheels powered by the Robot Operating System (ROS), we are able to open source the code, making it straightforward for a developer with basic Linux coding skills to prototype new and interesting uses for their car," says David Smith, Sr. Solutions Architect for AWS DeepRacer. "Now that the AWS DeepRacer device software is openly available, anyone with the car and an idea can make new uses for their device a reality."
The company says it has compiled six sample projects from the AWS DeepRacer team and members of the global AWS DeepRacer community to help users get started exploring the possibilities that open source provides:
- Follow the Leader - Learn to deploy an object detection model that enables the AWS DeepRacer device to identify and follow an object.
- RoboCat - Community project that leverages OpenCV image-processing capabilities to detect a mouse in the image from an infrared camera mounted on the AWS DeepRacer.
- Mapping - Use AWS DeepRacer to draw a map with SLAM (Simultaneous Localization and Mapping), a technique for creating a map of an environment by estimating a device’s current location as it moves through a space.
- Off road - Use a series of QR codes as waypoints to navigate the AWS DeepRacer around a custom path.
- DeepDriver - Mimic a real-world car that starts and stops at traffic lights and stop signs.
- DeepBlaster - Use object detection models to build a toy that
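To give a flavor of the kind of prototyping these projects involve, here is a minimal sketch in the spirit of the "Off road" project: treating decoded QR-code labels as waypoints on a route and computing the steering correction toward the next one. The waypoint names, coordinates, and function names below are illustrative assumptions, not taken from the actual AWS DeepRacer device software.

```python
import math

# Hypothetical waypoint map: each QR code on the custom path decodes to a
# label, which we associate with a position (in meters) on the course.
WAYPOINTS = {
    "wp1": (1.0, 0.0),
    "wp2": (1.0, 1.0),
    "wp3": (0.0, 1.0),
}

ROUTE = ("wp1", "wp2", "wp3")  # order in which the car should visit waypoints


def next_target(decoded_label, route=ROUTE):
    """Given the label decoded from the most recent QR code, return the
    coordinates of the following waypoint on the route, or None at the end."""
    idx = route.index(decoded_label)
    if idx + 1 < len(route):
        return WAYPOINTS[route[idx + 1]]
    return None


def steering_angle(pose, heading, target):
    """Angle in radians the car must turn from its current heading to face
    the target waypoint; positive means turn left."""
    dx = target[0] - pose[0]
    dy = target[1] - pose[1]
    desired = math.atan2(dy, dx)
    # Normalize the difference into [-pi, pi] so the car takes the short way round.
    return (desired - heading + math.pi) % (2 * math.pi) - math.pi
```

In a real deployment this logic would sit inside a ROS node, with the QR decoding handled by the camera pipeline (for example, OpenCV's `cv2.QRCodeDetector`) and the steering value published to the vehicle's drive topic.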