Aerial Robots from Penn Engineering

RAPID: Aerial Robots for Remote Autonomous Exploration and Mapping

Objectives

Photograph of the aerial vehicle

The Kumar Lab of Penn Engineering is interested in exploring the possibility of leveraging an autonomous quadrotor in earthquake-damaged environments through field experiments focused on cooperative mapping with both ground and aerial robots. Aerial robots offer several advantages over ground robots, including the ability to maneuver through complex three-dimensional (3D) environments and to gather data from vantage points inaccessible to ground robots.

They began by examining earthquake-damaged buildings with multiple floors that ground robots could access. However, debris and clutter leave various locations in the environment inaccessible to ground robots. The team's goal is to generate 3D maps that capture the layout of the environment and provide insight into the degree of damage inside the building.

Results

In collaboration with Tohoku University in Japan, the Kumar Lab designed a field experiment that highlighted the need for heterogeneity. Ground robots do not have the same payload limitations as quadrotors; they can therefore carry larger sensor payloads, maintain tethered communication links, and operate for longer periods of time. Quadrotors, however, provide mobility and observational capabilities unavailable to ground robots. To build a rich 3D representation of the environment, the team leveraged the advantages of each platform and, in doing so, mitigated the limitations of both.

Overhead view of a neighborhood with the path of the vehicle plotted

Three research platforms were used during the experiment. The first is a ground robot equipped with an onboard sensing suite that enables the generation of dense 3D maps; it is teleoperated through the multi-floor environment while simultaneously collecting sensor data. Once the operators identify locations that the ground platform cannot reach, a second ground platform equipped with an automated helipad is teleoperated to these locations. It carries a quadrotor with onboard sensing that can remotely open and close the helipad and autonomously take off from and land on it.
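
The deployment described above can be summarized as an ordered workflow. The sketch below is purely illustrative: the phase names are invented, and the actual field procedure was driven by human operators rather than by this code.

```python
# Hypothetical summary of the deployment workflow described above; these
# phase names are invented and do not come from the team's software.
from enum import Enum, auto

class Phase(Enum):
    MAP_WITH_GROUND_ROBOT = auto()        # teleoperate, collect dense 3D data
    IDENTIFY_INACCESSIBLE_AREAS = auto()  # operators flag debris-blocked spots
    DRIVE_HELIPAD_ROBOT_TO_SITE = auto()  # second ground robot carries the MAV
    OPEN_HELIPAD = auto()                 # commanded remotely by the quadrotor
    AERIAL_MAPPING_FLIGHT = auto()        # autonomous takeoff and mapping
    LAND_AND_CLOSE_HELIPAD = auto()       # autonomous landing on the helipad

# The phases run in order; the aerial portion repeats for each flagged location.
for phase in Phase:
    print(phase.name)
```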

On site, the team realized that in complex environments like the earthquake-damaged buildings in Sendai, the appearance of the environment had changed drastically from its original structure. Their earlier aerial navigation approach, which relied on assumptions specific to man-made indoor environments, would have failed, so they designed a new quadrotor platform equipped with an IMU, a Hokuyo laser scanner (provided by AStuff), stereo cameras, a pressure altimeter, a magnetometer, and a GPS receiver. The aim was to combine information from multiple sensors so that even if a subset of the sensors failed, the performance of the overall system would not be seriously compromised.
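
One minimal way to picture such a redundant sensor suite is as a monitored list of measurement sources, where failed sensors are simply dropped from the estimation pipeline. The sketch below is an assumption for illustration only; the update rates and the health test are not the team's implementation.

```python
# Illustrative sketch only: representing the redundant sensor suite and
# dropping failed sensors from the fusion pipeline. Rates are assumed values.
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    rate_hz: float            # nominal update rate (assumed)
    last_stamp: float = 0.0   # timestamp of the most recent measurement (s)

    def is_healthy(self, now: float, missed_cycles: int = 3) -> bool:
        # Declare a sensor failed once several expected messages are missing.
        return (now - self.last_stamp) < missed_cycles / self.rate_hz

suite = [
    Sensor("imu", 100.0),
    Sensor("laser_scanner", 40.0),
    Sensor("stereo_cameras", 20.0),
    Sensor("pressure_altimeter", 25.0),
    Sensor("magnetometer", 50.0),
    Sensor("gps", 5.0),
]

def active_sensors(now: float):
    # Only healthy sensors feed the state estimator; the system keeps flying
    # on whatever subset remains (e.g., GPS dropping out indoors).
    return [s.name for s in suite if s.is_healthy(now)]

# Example: at t = 1.0 s the GPS has not reported since t = 0.2 s.
for s in suite:
    s.last_stamp = 0.98
suite[-1].last_stamp = 0.2
print(active_sensors(now=1.0))   # every sensor except 'gps'
```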

They proposed a novel, modular, and extensible approach to integrating noisy measurements from multiple heterogeneous sensors that yield either absolute or relative observations at different and varying time intervals, providing smooth and globally consistent position estimates in real time for autonomous flight. Through large-scale indoor and outdoor autonomous flight experiments, they demonstrated that fusing measurements from multiple sensors increases the robustness of the system.
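
The published estimator is not reproduced here; the sketch below is only a minimal Kalman-filter loop illustrating the core idea of combining relative observations (such as odometry increments, which accumulate drift) with intermittent absolute observations (such as GPS fixes, which bound it). The two-dimensional state, sensor roles, rates, and noise values are all assumptions.

```python
# Minimal sketch (not the team's actual filter): fusing relative odometry
# increments with intermittent absolute position fixes in a linear Kalman
# filter. State is 2D position; all numbers below are illustrative.
import numpy as np

class SimpleFusion:
    def __init__(self):
        self.x = np.zeros(2)            # estimated position [x, y] in meters
        self.P = np.eye(2) * 1.0        # estimate covariance

    def predict_with_odometry(self, delta, Q):
        """Relative observation: a displacement since the last update."""
        self.x = self.x + delta         # propagate the mean
        self.P = self.P + Q             # displacement noise accumulates -> drift

    def correct_with_gps(self, z, R):
        """Absolute observation: a (noisy) global position fix."""
        S = self.P + R                                  # innovation covariance (H = I)
        K = self.P @ np.linalg.inv(S)                   # Kalman gain
        self.x = self.x + K @ (z - self.x)              # correction bounds the error
        self.P = (np.eye(2) - K) @ self.P

# Relative updates arrive frequently (e.g., laser/visual odometry);
# absolute fixes arrive only occasionally.
f = SimpleFusion()
rng = np.random.default_rng(0)
for step in range(100):
    f.predict_with_odometry(delta=np.array([0.1, 0.0]) + rng.normal(0, 0.01, 2),
                            Q=np.eye(2) * 0.01**2)
    if step % 20 == 0:                                  # intermittent GPS fix
        f.correct_with_gps(z=np.array([0.1 * (step + 1), 0.0]) + rng.normal(0, 0.5, 2),
                           R=np.eye(2) * 0.5**2)
print(f.x, np.diag(f.P))
```

A full flight estimator would also track orientation, velocity, and sensor biases and would handle asynchronous, delayed measurements; the position-only filter above is only meant to show how the two kinds of observations play different roles.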

Outcomes

In the collaborative multi-floor mapping experiment, the team successfully deployed its ground and aerial robot platforms and generated maps from both vehicles. Post-processing was performed to merge the multiple partial maps into a complete representation of the environment.
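
Conceptually, merging amounts to expressing each partial map in a common reference frame and concatenating the results. The sketch below assumes a known rigid transform between two point-cloud maps; the function, transform values, and placeholder point clouds are illustrative, and the team's actual post-processing pipeline is not detailed in this report.

```python
# Illustrative sketch only: merging two partial 3D point-cloud maps given an
# estimated rigid transform between their frames (hypothetical values).
import numpy as np

def merge_maps(map_a, map_b, R_ab, t_ab):
    """Express map_b in map_a's frame and concatenate the point clouds.
    map_a, map_b: (N, 3) point arrays; R_ab: 3x3 rotation; t_ab: 3-vector."""
    map_b_in_a = map_b @ R_ab.T + t_ab
    return np.vstack([map_a, map_b_in_a])

# Example: the aerial map is rotated 90 degrees about z and offset from the
# ground map's origin (made-up numbers).
theta = np.pi / 2
R_ab = np.array([[np.cos(theta), -np.sin(theta), 0],
                 [np.sin(theta),  np.cos(theta), 0],
                 [0, 0, 1]])
t_ab = np.array([2.0, -1.0, 3.5])     # e.g., the aerial map starts one floor up
ground_map = np.random.rand(1000, 3)  # placeholder point clouds
aerial_map = np.random.rand(500, 3)
full_map = merge_maps(ground_map, aerial_map, R_ab, t_ab)
```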

Various images from the vehicle's onboard camera

Images from the onboard camera (Figs. 4(a)-4(d)) and an external camera (Figs. 4(e)-4(h)). Note the wide variety of environments, including open space, trees, complex building structures, and indoor spaces. The position of the MAV is highlighted with a red circle.

The new multi-sensor quadrotor platform was put through challenging tests in an industrial complex. The test site spanned a variety of environments, including open outdoor spaces, dense trees, cluttered building areas, and indoor spaces. The MAV was autonomously controlled using the onboard state estimates. The total flight time was approximately 8 minutes, and the vehicle traveled 445 meters at an average speed of 1.5 m/s. As the map-aligned trajectory shows, frequent sensor failures occurred during the experiment, underscoring the need for multi-sensor fusion. The global x, y, and yaw errors are bounded by the GPS measurements, without which they would have grown without bound; this matches the results of the observability analysis. Notably, the error in the body-frame velocity does not grow regardless of GPS availability.
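
The distinction between bounded and unbounded error can be pictured with a toy one-dimensional simulation: integrating noisy velocity alone produces a random-walk position error, while occasional absolute corrections (standing in for GPS fixes) pull that error back down. All numbers below are arbitrary and unrelated to the experiment's data.

```python
# Toy 1D illustration (not the team's data): dead-reckoned position error
# performs a random walk, while periodic absolute corrections keep it bounded.
import numpy as np

rng = np.random.default_rng(1)
dt, steps = 0.1, 5000
vel_noise = 0.05                             # m/s error on each velocity estimate

err_dead_reckoning, err_with_fixes = 0.0, 0.0
worst_dr, worst_fix = 0.0, 0.0
for k in range(steps):
    e = rng.normal(0.0, vel_noise) * dt      # position error added this step
    err_dead_reckoning += e                  # never corrected -> random walk
    err_with_fixes += e
    if k % 100 == 0:                         # an absolute fix every 10 s
        err_with_fixes *= 0.1                # correction shrinks the error
    worst_dr = max(worst_dr, abs(err_dead_reckoning))
    worst_fix = max(worst_fix, abs(err_with_fixes))

print(f"worst dead-reckoning error: {worst_dr:.2f} m")
print(f"worst error with fixes:     {worst_fix:.2f} m")
```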

Future plans

They plan to use the new multi-sensor quadrotor platform for robust navigation, mapping, and inspection in a variety of indoor and outdoor environments, including tunnel-like environments, densely cluttered industrial buildings, and other critical infrastructure. This opens up applications in a wider domain, such as inspecting a dam for both interior and exterior cracks.