aspen² slam robot
I built a SLAM (Simultaneous Localization and Mapping) robot that navigates to whatever room you tell it to (with your voice).
The idea behind this project was to demonstrate a distributed-intelligence architecture: a small keyword-spotting neural network runs on a Cortex-M7 and reports its inferences to the main mission computer, offloading that compute from the main computer.
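To make the split concrete, here is a minimal sketch of the edge-device side: capture an audio window, run the keyword model, and report confident detections to the Pi as a one-line serial message. The `fillAudioWindow` / `classifyKeyword` helpers and the `KW:<room>` message format are stand-ins for illustration, not the actual firmware.

```cpp
#include <Arduino.h>

const char * LABELS[] = {"kitchen", "bedroom", "office"};
const int NUM_ROOMS = 3;

// Stubs standing in for the real mic capture and KWS inference (assumptions).
void fillAudioWindow(int16_t * buf, size_t len) { /* capture ~1 s from the mic */ }
int  classifyKeyword(const int16_t * buf, size_t len, float * score) {
  *score = 0.0f;
  return -1;  // -1 = nothing recognized
}

void setup() {
  Serial.begin(115200);           // USB serial link to the Raspberry Pi
}

void loop() {
  static int16_t window[16000];   // ~1 s of 16 kHz audio
  fillAudioWindow(window, 16000);

  float score = 0.0f;
  int label = classifyKeyword(window, 16000, &score);

  // Only forward confident room detections; the Pi maps them to navigation goals.
  if (label >= 0 && label < NUM_ROOMS && score > 0.8f) {
    Serial.print("KW:");
    Serial.println(LABELS[label]);   // e.g. "KW:kitchen"
  }
  delay(250);
}
```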
BOM:
- Raspberry Pi 4 8GB (main mission computer) running the ROS 2 SLAM/navigation stack
- RPLIDAR
- ODrive v3.6 24V motor controller
- 2 x Brushless "Robowheel" hub motors
- 18V Makita battery
- Arduino Mega
  - Handles RC manual control, bridging the RC receiver to the ODrive over UART (see the sketch after this list)
- Arduino Portenta H7 + Vision Shield
  - This was the TinyML edge device: it ran a word-recognition model so the robot could be commanded, by voice, to navigate to different rooms around the house (see the ROS 2 sketch after this list)
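For the Mega-to-ODrive link, a rough sketch of what that manual-control bridge can look like: read two RC channels, mix them into left/right wheel velocities, and stream ODrive ASCII `v <axis> <velocity>` commands over the Mega's Serial1. The pin numbers, scaling, velocity limit, and baud rate are assumptions, not the exact firmware on the robot.

```cpp
#include <Arduino.h>

const uint8_t THROTTLE_PIN = 2;      // RC receiver channel 1 (PWM)
const uint8_t STEERING_PIN = 3;      // RC receiver channel 2 (PWM)
const float   MAX_VEL      = 20.0f;  // assumed wheel velocity limit [turns/s]

void setup() {
  pinMode(THROTTLE_PIN, INPUT);
  pinMode(STEERING_PIN, INPUT);
  Serial1.begin(115200);             // ODrive v3.6 UART (ASCII protocol)
}

// Map a 1000-2000 us RC pulse to -1..+1, treating ~1500 us as neutral.
float rcToUnit(unsigned long pulseUs) {
  if (pulseUs == 0) return 0.0f;     // no pulse seen: fail safe to stop
  float v = (float(pulseUs) - 1500.0f) / 500.0f;
  return constrain(v, -1.0f, 1.0f);
}

void loop() {
  float throttle = rcToUnit(pulseIn(THROTTLE_PIN, HIGH, 25000));
  float steering = rcToUnit(pulseIn(STEERING_PIN, HIGH, 25000));

  // Differential-drive mix, then ODrive ASCII velocity commands: "v <axis> <vel>".
  // Depending on motor mounting, one side may need its sign flipped.
  float left  = constrain(throttle + steering, -1.0f, 1.0f) * MAX_VEL;
  float right = constrain(throttle - steering, -1.0f, 1.0f) * MAX_VEL;

  Serial1.print("v 0 "); Serial1.println(left, 2);
  Serial1.print("v 1 "); Serial1.println(right, 2);
}
```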
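On the Pi side, a hedged sketch of how the Portenta's keyword reports can turn into navigation: a small ROS 2 node that listens on a `room_keyword` topic (assuming some bridge node republishes the Portenta's serial messages there) and sends a Nav2 `NavigateToPose` goal at a hard-coded map pose for that room. The topic name, room list, and coordinates are placeholders.

```cpp
#include <chrono>
#include <map>
#include <string>
#include <utility>

#include <rclcpp/rclcpp.hpp>
#include <rclcpp_action/rclcpp_action.hpp>
#include <std_msgs/msg/string.hpp>
#include <nav2_msgs/action/navigate_to_pose.hpp>

using NavigateToPose = nav2_msgs::action::NavigateToPose;

class RoomCommander : public rclcpp::Node {
public:
  RoomCommander() : Node("room_commander") {
    client_ = rclcpp_action::create_client<NavigateToPose>(this, "navigate_to_pose");
    sub_ = create_subscription<std_msgs::msg::String>(
        "room_keyword", 10,
        [this](std_msgs::msg::String::SharedPtr msg) { onKeyword(msg->data); });
    // Placeholder map-frame (x, y) goal positions for each room keyword.
    rooms_["kitchen"] = {3.2, -1.0};
    rooms_["bedroom"] = {-2.5, 4.1};
    rooms_["office"]  = {1.0, 6.3};
  }

private:
  void onKeyword(const std::string & room) {
    auto it = rooms_.find(room);
    if (it == rooms_.end() ||
        !client_->wait_for_action_server(std::chrono::seconds(2))) {
      RCLCPP_WARN(get_logger(), "Unknown room or Nav2 unavailable: '%s'", room.c_str());
      return;
    }
    NavigateToPose::Goal goal;
    goal.pose.header.frame_id = "map";
    goal.pose.header.stamp = now();
    goal.pose.pose.position.x = it->second.first;
    goal.pose.pose.position.y = it->second.second;
    goal.pose.pose.orientation.w = 1.0;  // arbitrary heading; "get to the room" is enough
    client_->async_send_goal(goal);
    RCLCPP_INFO(get_logger(), "Sent Nav2 goal for room '%s'", room.c_str());
  }

  rclcpp_action::Client<NavigateToPose>::SharedPtr client_;
  rclcpp::Subscription<std_msgs::msg::String>::SharedPtr sub_;
  std::map<std::string, std::pair<double, double>> rooms_;
};

int main(int argc, char ** argv) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<RoomCommander>());
  rclcpp::shutdown();
  return 0;
}
```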