Open Access Publications from the University of California

UC Berkeley Electronic Theses and Dissertations

Towards Cooperative SLAM for Low-Cost Biomimetic Robots


Bio-inspired millirobots exhibit unique locomotion modalities that allow them to traverse complex, unstructured terrain that traditional robots cannot.

This makes them potentially useful in urban search and rescue, structure inspection, environmental monitoring, and surveillance, all of which require some level of situational awareness.

The low-cost, lightweight design and difficult-to-model dynamics of these robots also pose unique challenges for the state-of-the-art Simultaneous Localization and Mapping (SLAM) approaches typically used to gain this awareness.

In this thesis, we develop a collection of estimation and control techniques that addresses these challenges, allowing teams of millirobots to localize within and map complex, unstructured environments.

The analysis covers several facets of the problem, including low-cost millirobot team design, motion modeling, cooperative state estimation, and mapping.

We first show the utility of disposable low-cost robots in hazardous environments by using teams of “picket” and “observer” robots for exploration.

Next, we explore a data-driven motion modeling approach to approximate the non-linear stochastic dynamics of legged millirobots.
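The abstract does not detail the model class used, but the idea of fitting a stochastic motion model directly from logged command/displacement data can be illustrated with a deliberately minimal sketch: a one-dimensional linear-Gaussian model fit by least squares, with the process-noise level estimated from the fit residuals. The function names and the linear form are illustrative assumptions, not the thesis's actual model.

```python
import math
import random

def fit_motion_model(commands, deltas):
    """Fit delta = a*u + b with Gaussian noise of std. dev. sigma.

    A stand-in for the richer data-driven dynamics models described in
    the thesis; here the model is 1-D and linear purely for illustration.
    commands: applied control inputs u; deltas: measured displacements.
    """
    n = len(commands)
    mu_u = sum(commands) / n
    mu_d = sum(deltas) / n
    # Closed-form simple linear regression.
    cov = sum((u - mu_u) * (d - mu_d) for u, d in zip(commands, deltas))
    var = sum((u - mu_u) ** 2 for u in commands)
    a = cov / var
    b = mu_d - a * mu_u
    # Estimate process noise from the residuals of the fit.
    resid = [d - (a * u + b) for u, d in zip(commands, deltas)]
    sigma = math.sqrt(sum(r * r for r in resid) / n)
    return a, b, sigma

def predict(model, u, rng=random):
    """Sample a predicted displacement for command u from the fitted model."""
    a, b, sigma = model
    return a * u + b + rng.gauss(0.0, sigma)
```

A model fit this way can drop straight into the prediction step of a particle or Kalman filter, which is the usual role of a learned motion model in SLAM.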

Then we develop an inter-robot pose estimation technique using monocular vision and active markers that can operate in visually feature-poor environments, and can scale to teams of computationally constrained robots.
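The core geometry behind marker-based monocular pose estimation can be sketched with a toy version: two active LEDs at a known physical separation on the observed robot, seen by a calibrated pinhole camera. Their pixel separation gives range by similar triangles, and their midpoint gives bearing. This is a simplified fronto-parallel sketch with hypothetical parameter names, not the full technique developed in the thesis.

```python
import math

def relative_pose_from_markers(px1, px2, marker_sep_m, focal_px, cx):
    """Range and bearing to a robot carrying two active LED markers.

    Assumes a pinhole camera and markers roughly fronto-parallel to the
    image plane. px1, px2 are the horizontal pixel coordinates of the two
    detected LEDs; marker_sep_m is their known physical separation in
    meters; focal_px and cx are the camera focal length and principal
    point in pixels.
    """
    d = abs(px2 - px1)                        # pixel separation of the LEDs
    range_m = focal_px * marker_sep_m / d     # similar triangles
    mid = 0.5 * (px1 + px2)                   # pixel midpoint of the pair
    bearing = math.atan2(mid - cx, focal_px)  # bearing of the midpoint
    return range_m, bearing
```

Because detection reduces to finding bright blobs rather than matching natural features, a scheme like this keeps working in feature-poor scenes and costs little computation per observed robot, which is what lets it scale to constrained teams.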

We demonstrate this technique first on autonomous underwater vehicles, and then extend it to a team of ground robots cooperatively navigating three-dimensional terrain.

Finally, we explore a simple scanning laser technique that can leverage cooperating robots with cameras to map an environment.
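One common way such a laser/camera pair recovers depth is laser-plane triangulation: intersect the camera ray through each illuminated pixel with the known plane swept by the laser. The sketch below assumes a calibrated pinhole camera and a pre-calibrated laser plane expressed in the camera frame; the abstract does not specify that this is the exact formulation used.

```python
def triangulate_laser_point(u, v, fx, fy, cx, cy, plane_n, plane_d):
    """Intersect the camera ray through pixel (u, v) with a laser plane.

    The plane satisfies n . X = d in the camera frame, with plane_n a
    3-tuple normal and plane_d a scalar. fx, fy, cx, cy are pinhole
    intrinsics. Returns the 3-D point hit by the laser, or None if the
    ray is (numerically) parallel to the plane.
    """
    # Back-project the pixel to a ray direction in the camera frame.
    ray = ((u - cx) / fx, (v - cy) / fy, 1.0)
    denom = sum(n * r for n, r in zip(plane_n, ray))
    if abs(denom) < 1e-9:
        return None                      # ray parallel to the laser plane
    t = plane_d / denom                  # ray parameter at the intersection
    return tuple(t * r for r in ray)
```

Repeating this for every laser-lit pixel as the laser scans yields a point cloud; a cooperating robot carrying the camera (or the laser) extends the baseline beyond what a single millirobot could carry.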
