3D Mapping with Raspberry Pi


Simultaneous localization and mapping (SLAM) allows a robot to map its environment while also tracking its own location within that map. ORB-SLAM2 is a SLAM method that uses only a video stream as input, but it has no built-in support for the common low-cost Raspberry Pi Camera. This project involved creating a version of the ORB-SLAM2 ROS module that can use the Raspberry Pi Camera as its main input source, as well as integrating AprilTag fiducial detection into the module to allow for future improvements to the mapping and positioning process.

Method / Materials

Robot Operating System (ROS) is a framework that allows a server to process robot sensor data and send commands to the robot.
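ROS communication is organized around named topics: publisher nodes broadcast messages on a topic, and any subscriber nodes on that topic receive them. The following is a toy pure-Python sketch of that pattern for illustration only (real ROS nodes would use rospy or roscpp; the topic name shown is just an example):

```python
from collections import defaultdict

class TopicBus:
    """Toy illustration of ROS-style publish/subscribe over named topics."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to run for every message on `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of `topic`."""
        for callback in self._subscribers[topic]:
            callback(message)

# Example: a camera node publishes frames; a SLAM node consumes them.
bus = TopicBus()
frames = []
bus.subscribe("/camera/image_raw", frames.append)
bus.publish("/camera/image_raw", "frame-0")
```

In real ROS the bus is distributed across machines, which is what lets a server process sensor data coming from the robot.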

AprilTags are fiducial markers developed at the University of Michigan that let robots detect the tags in an image and calculate each tag's position and orientation relative to the camera. This project uses the apriltags2_ros ROS module, which communicates the tag detection data to the server running the SLAM module.
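A detector of this kind typically reports the tag pose as a translation plus a quaternion orientation. A minimal sketch of how such a pose can be used, assuming that message layout (the function names here are illustrative, not part of apriltags2_ros):

```python
import numpy as np

def quat_to_rot(q):
    """Convert a unit quaternion (x, y, z, w) to a 3x3 rotation matrix."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

def tag_point_in_camera_frame(tag_position, tag_quat, point_in_tag):
    """Transform a point given in the tag's own frame into the camera
    frame, using the tag pose reported by the detector."""
    R = quat_to_rot(tag_quat)
    return np.asarray(tag_position, float) + R @ np.asarray(point_in_tag, float)
```

For example, passing the origin `(0, 0, 0)` as `point_in_tag` recovers the tag center's position in the camera frame.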

ORB-SLAM2 is a SLAM method that uses images from a mobile robot’s video stream to detect static features in the environment, building a 3D map by estimating the features’ positions relative to each other and to the camera from how they move between video frames.
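The core geometric idea, recovering a 3D position from the same feature seen in two frames, can be sketched with linear triangulation. This is a simplified illustration in normalized camera coordinates, not ORB-SLAM2's actual implementation:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from its observations
    x1, x2 in two frames with 3x4 projection matrices P1, P2.

    Each observation contributes two rows of a homogeneous system
    A*X = 0; the solution is the right null vector of A."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # homogeneous 4-vector
    return X[:3] / X[3]        # back to Euclidean coordinates
```

Camera motion between the two frames is what makes the system solvable: with no baseline, the rows of `A` become degenerate and depth cannot be recovered, which is why a monocular SLAM system needs the robot to move.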

The TurtleBot3 Waffle Pi is a small wheeled robot made by Robotis. The TurtleBot uses ROS for all communication and control. The modified ORB-SLAM2 ROS module created for this project can be installed on any robot running ROS on a Raspberry Pi.

Goals

  1. Modify the ORB-SLAM2 ROS module to allow video obtained from a Raspberry Pi Camera to be used in creating a 3D map.
  2. Integrate AprilTag detection data from the apriltags2_ros module into the ORB-SLAM2 ROS module.
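ROS modules of this kind are usually wired together with a launch file. A hypothetical sketch of such wiring is shown below; the package, node, and topic names are illustrative assumptions, not this project's actual configuration:

```xml
<launch>
  <!-- Hypothetical wiring: all names here are illustrative only. -->
  <!-- A node that publishes frames from the Raspberry Pi Camera. -->
  <node pkg="raspicam_node" type="raspicam_node" name="raspicam_node" />

  <!-- Monocular ORB-SLAM2, with its image input remapped
       to the Pi Camera's topic. -->
  <node pkg="orb_slam2_ros" type="orb_slam2_mono" name="orb_slam2">
    <remap from="/camera/image_raw" to="/raspicam_node/image" />
  </node>
</launch>
```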
Figure 1: 3D map of the room produced by the module, and the Raspberry Pi scanning the box enclosure.


Results

Goal 1 was tested by creating a 3D map of a small enclosed area. The map and the environment in which it was made are shown in Figure 1. Despite the inherent difficulty of building a 3D map from a cheap monocular camera, the map closely matches the actual environment.

Goal 2 was tested by changing the color of detected Map Points in the ORB-SLAM2 module whenever a Map Point was found to lie at the center of an AprilTag, according to the data transmitted by the AprilTag module. Figure 3 shows the module successfully identifying the center of the AprilTag.
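This test reduces to a small geometric check: flag a Map Point if it lies within some tolerance of the tag center's estimated position, with both expressed in the same coordinate frame. A simplified sketch, where the function names and the tolerance value are illustrative rather than the module's actual code:

```python
import numpy as np

def is_at_tag_center(map_point, tag_center, tolerance=0.05):
    """True if a map point lies within `tolerance` (same units as the
    map, e.g. meters) of the detected tag's center."""
    diff = np.asarray(map_point, float) - np.asarray(tag_center, float)
    return np.linalg.norm(diff) <= tolerance

def recolor_points(points, tag_center, tolerance=0.05):
    """Label each map point 'red' if it is near the tag center,
    'black' otherwise."""
    return ["red" if is_at_tag_center(p, tag_center, tolerance) else "black"
            for p in points]
```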


Conclusion

Both the addition of the Raspberry Pi Camera and the connection to the AprilTag detection module were successful. Developers and hobbyists seeking a low-cost 3D SLAM setup can use the new configurations of the ORB-SLAM2 module to get started without spending hundreds of dollars. The AprilTag detections could be used in future improvements on this project to refine map point placement, to optimize loop closing during the mapping process, or in other ways that benefit from the additional measurements they provide.

Diagram of the system: the robot scanning the room and detecting map points.