CHRIS QUIRK

DROPSphere

Last year, massive heat blasted global coral reefs, bleaching and killing an unprecedented amount of coral, reported researchers at Coral Reef Watch, an initiative of the National Oceanic and Atmospheric Administration (NOAA). The intensity of the heat damage led Coral Reef Watch to add three higher levels to its coral bleaching alert protocol.


A bellwether species for planetary harm from climate change, coral reefs make up a small part of the marine environment, but have an outsized impact on oceanic wellbeing. Despite covering a minuscule area of the sea floor, coral reefs support almost a quarter of all marine species in one way or another. The reefs also contribute to the global economy, generating a jaw-dropping $9.8 trillion annually in ecological, economic and societal benefits, according to studies cited by Coral Reef Watch.

Given their importance to planetary ecology, the National Centers for Coastal Ocean Science and the Global Coral Reef Monitoring Network place monitoring coral reefs among their highest research goals. Unfortunately, many deep-sea coral reefs are at depths far, far beyond the safe range for human divers. But Matthew Johnson-Roberson, professor and director of the Robotics Institute (RI), and Tianyi Zhang, a Ph.D. student in RI, had a big idea they squeezed into a small yellow package — the Deep Robot Optical Perception Sphere, or DROPSphere — RI’s own yellow submarine.

DROPSphere — an autonomous submersible about the size of a hand truck Johnson-Roberson and Zhang designed and built to explore and record images of remote underwater locations and seabeds — turns out to be the perfect tool for getting down to deep-sea coral. “Underwater robotics is ideal for modeling marine environments, studying benthic [seabed] environments, and investigating cultural heritage such as shipwrecks,” said Johnson-Roberson. “It is also a possibility for social good because it can track environmental impacts on things like coral reefs and help to mitigate the environmental impact and harm that humans are having on the planet.”

Past autonomous underwater vehicles (AUVs) have not been readily available to researchers. “One of the big things I’ve seen in my career is that they are traditionally very, very expensive,” said Johnson-Roberson. With a price tag often in the $2 million range, they are used primarily for oil and gas work. “Science, tragically, is getting short shrift.”

(top) Matthew Johnson-Roberson, Professor and Director of the Robotics Institute and (bottom) Tianyi Zhang, Ph.D. student in RI


DROPSphere, the hand-truck-sized autonomous underwater vehicle built by Matthew Johnson-Roberson and Tianyi Zhang.

Johnson-Roberson and Zhang decided to solve the money problem by creating an AUV of their own, made with off-the-shelf parts, that could fill a gaping research need. NOAA’s Ocean Exploration Division provided funding for the project through the office’s competitive grant program. “Exploring and surveying deep-sea habitats is expensive and requires a lot of resources,” said Ashley Marranzino of the NOAA Ocean Exploration, Science & Technology Division. “The project offers an opportunity to make this science more accessible to a larger community by providing a less expensive alternative to the large AUVs and remotely operated vehicles that most of the community relies on currently.” Another important aspect of the project will be the development of software to help speed up the analysis of the large video and imagery datasets generated by the surveys. “If successful, this project could open the door to a new group of deep-sea scientists,” said Marranzino, “allowing them to conduct deep-water surveys from smaller vessels at a fraction of the cost.”

To build DROPSphere, Johnson-Roberson and Zhang started with a $500 edge computer capable of running deep learning networks, then added deep-sea foam (which is buoyant but can also withstand high water pressure), a camera, inertial measurement units to track navigation, and a glass sphere to house the camera and the more delicate electronics. Finally, they installed drone propellers to maneuver and steer DROPSphere. Total cost: around $30,000.

With a target depth of 4,000 meters, DROPSphere puts some of the deepest coral reefs off the coast of Hawaii within range. “The deep coral reefs are a super important part of the ecosystem in a variety of marine environments. And they’re less studied than the shallow coral reefs for obvious reasons, because they’re difficult to get to,” said Johnson-Roberson.

Zhang has been managing the field tests on some shallower coral reefs in the Florida Keys, executing the first dives in the summer of 2023. “Deploying an underwater robot is special,” said Zhang. “On campus, I can do 20 experiments on a robot in a day. Working with an AUV is much more difficult, and we have a five-day window each year to show that the unit works in the field. It’s high pressure.” After daytime deployments, Zhang spent most evenings at an Airbnb, which he converted into a workshop. “We would do electronics repair, validate the hardware, check the software at every level, the whole pipeline.” Johnson-Roberson likened the operation to a wedding. “You charter a boat, you do a lot of planning, you have all the other scientists there, and a lot has to go right to have a successful field season, which I think we did.”

We want researchers to feel like they can walk around the reef and see it clearly from multiple angles.
— Matthew Johnson-Roberson

Given the proposed deployment depth of DROPSphere, the unit’s autonomy is critical. Johnson-Roberson and Zhang programmed the vehicle to run a back-and-forth, lawnmower pattern over reef areas. Data from the cameras and inertial measurement units refine DROPSphere’s trajectory estimates for navigation as it takes detailed pictures of the reef from diverse angles. The duo wrote code that stitches the source images into virtual models of the reef, providing rich information on texture and color. The models contain enough detail for researchers to rotate the view of the reef a full 360 degrees. “We want researchers to feel like they can walk around the reef and see it clearly from multiple angles,” explained Johnson-Roberson.
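The lawnmower pattern mentioned above is a standard survey strategy: the vehicle sweeps parallel tracks across a rectangle, reversing direction at each edge so the camera footprints overlap. A minimal sketch in Python (not the team's actual code; the area dimensions and track spacing are illustrative) might generate the waypoints like this:

```python
# Minimal sketch of a "lawnmower" (boustrophedon) survey pattern:
# back-and-forth waypoints covering a rectangular reef area.

def lawnmower_waypoints(width, height, spacing):
    """Return (x, y) waypoints sweeping a width x height area
    with parallel tracks `spacing` meters apart."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height:
        if left_to_right:
            waypoints.append((0.0, y))
            waypoints.append((width, y))
        else:
            waypoints.append((width, y))
            waypoints.append((0.0, y))
        left_to_right = not left_to_right  # reverse direction each track
        y += spacing
    return waypoints

# A 20 m x 10 m survey area with tracks 5 m apart.
wps = lawnmower_waypoints(20.0, 10.0, 5.0)
```

In practice the spacing is chosen from the camera's field of view and altitude so adjacent image strips overlap enough for the stitching software to match features between them.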

The color information DROPSphere gathers is particularly prized by oceanographers to better assess coral reef health. However, translating the color data from the source images has posed challenges. “There’s an inherent color distortion underwater since the water absorbs the red areas of the light, and there is also a lot of scattering of light,” Zhang said. To fix this, they created color adjustment tools that “drain the ocean from the reef,” as Johnson-Roberson explained, and present the coral color in the images as it would appear on dry land.

Sierra Landreth, an oceanographer and graduate research assistant at Florida State University, participates in the DROPSphere project. “Current research is focused on characterizing benthic community compositions along seamounts in the Pacific Ocean, and it is often difficult for scientists to gather data in the deep sea,” she said. “Robots are an essential tool for us to get imagery of the seafloor at great depths, and the DROPSphere could assist us greatly in the characterization of underrepresented and under-sampled areas of the ocean floor.”

Despite their remote location, deep-sea coral reefs play an important role in the health of ecosystems all the way up and down the water column, said Johnson-Roberson. “Scientists are really interested in learning more about how those environments grow, but also how we could build artificial reefs to help regenerate some of the harm and damage that humans are doing to our natural world.”   ■