An exceptionally nimble submersible robot uses a flexible sensor skin to detect lost and unexploded munitions in places that used to be reachable only by the most experienced divers
Recovered by Robo Ray
Vast arsenals of unexploded ordnance are sitting on the ocean floor, lost in battle or dumped as waste. The risky job of detecting these underwater hazards currently falls to submarines specially fitted for the purpose. But even they cannot reach some of the tighter or more remote spots, forcing expert divers to go down and take over the often life-threatening work. A German research consortium including Fraunhofer IZM is now working with a submarine robot that is as nimble and mobile as a manta ray and equipped with innovative connected sensors on its fins to gather more information about its surroundings. One particularly nifty trick: It can measure water pressure so precisely that metal objects can be detected on the ocean floor, even if they are covered by sediment.
Unmanned underwater vehicles, or UUVs, have been in use for several years, but pioneers of reliable underwater communication and innovative bionics like EvoLogics GmbH have taken their inspiration from marine life such as manta rays, adapting the animals' look and anatomy to the submarine world.
With the enormous “wingspan” of their fins, manta rays are known to cover vast distances, while their extremely flexible vertebrae mean that they can make surprisingly sharp turns on their seemingly weightless journey through the sea. Their robotic cousins can be just as agile, but until now they were not smart enough to replace the professional divers who have to scour the sea floor for hours, looking for lost ordnance from the First or Second World War or other hazardous metal waste, before offshore wind farms can be built or intercontinental cables laid. The new robo ray will make it possible to detect such submarine hazards with a whole battery of sensors.
The “Bionic RoboSkin” project, supported by Germany’s Federal Ministry of Education and Research, is working to give the manta-shaped UUVs a flexible bionic sensor skin to help them navigate their underwater world. The skin is made from a compound fabric fitted with sensor elements and water-resistant connectors that supply the sensors with power and transmit their data. Researchers from Fraunhofer IZM have taken on the challenge of developing these integrated sensor modules, with which the UUVs can detect touch or the proximity of objects and virtually see and analyze their surroundings. The project consortium is headed by EvoLogics GmbH and includes other experts in the field from TITV Greiz, Sensorik Bayern GmbH, the diving specialists of BALTIC Taucherei- und Bergungsbetrieb Rostock GmbH, and GEO-DV GmbH, all with one mission: to create a new generation of robots that can support their human partners with a range of semi- or fully automated services and functions. Their capabilities will not be limited to the sea: The researchers are looking at a second use case for a land-based robot sensor platform, fittingly called “Badger” or “Dachs” in German. It will navigate by GPS and be fitted with ground-penetrating radar to detect metal objects below ground or conduct other ground survey work in harder-to-reach places, including tunneling work.
Under the robotic manta ray’s deceptively lifelike shell lies intricate technology: A permeable and therefore pressure-neutral fabric skin is fitted with integrated microelectronics for touch, flow, motion, and position sensors. This textile skin is then pulled tight over the robotic fins, creating a soft-robotics machine that can sense its surroundings. The team at Fraunhofer IZM is responsible for the electronics that make this possible: They developed sensor nodes suitable for submersible use that collect and pre-process the sensor data. These nodes not only have to be fit for purpose; they also need to be extremely miniaturized to fit underneath the thin fabric skin and integrate the necessary connectors. In active operation below the waterline, these sensors can track parameters like acceleration, pressure, or absorbency. The researchers also included LEDs in the circuit board design that let the robotic manta rays communicate with human divers, for instance to signal a turn.
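To make the idea of "collect, pre-process, and signal" more concrete, here is a minimal sketch in C of what a main loop on such a sensor node could look like. It is an illustration only: all function names, thresholds, and the packet layout are assumptions for this example and are not taken from the actual Bionic RoboSkin firmware.

```c
/* Illustrative sketch only: a hypothetical main loop for a skin sensor node.
 * All hardware-abstraction calls, thresholds, and packet formats are invented
 * for this example and do not reflect the project's real firmware. */
#include <stdint.h>
#include <stdbool.h>

#define SAMPLE_WINDOW        8   /* pressure readings averaged per cycle        */
#define CONTACT_DELTA       15   /* raw-count rise above baseline = "touch"     */

/* Hypothetical hardware-abstraction functions (assumed, provided elsewhere). */
extern uint16_t read_pressure_raw(void);
extern int16_t  read_accel_axis(uint8_t axis);
extern void     set_signal_led(bool on);
extern void     send_to_host(const uint8_t *buf, uint8_t len);

int main(void)
{
    uint16_t baseline = read_pressure_raw();   /* calibrate at start-up */

    for (;;) {
        /* Simple pre-processing: average a short window of pressure samples. */
        uint32_t sum = 0;
        for (uint8_t i = 0; i < SAMPLE_WINDOW; i++)
            sum += read_pressure_raw();
        uint16_t pressure = (uint16_t)(sum / SAMPLE_WINDOW);

        int16_t ax = read_accel_axis(0);
        int16_t ay = read_accel_axis(1);
        int16_t az = read_accel_axis(2);

        /* Use the on-board LED as a visible cue, e.g. when the fin touches something. */
        set_signal_led(pressure > baseline + CONTACT_DELTA);

        /* Forward a compact, pre-processed record to the central controller. */
        uint8_t packet[8] = {
            (uint8_t)(pressure >> 8), (uint8_t)pressure,
            (uint8_t)(ax >> 8), (uint8_t)ax,
            (uint8_t)(ay >> 8), (uint8_t)ay,
            (uint8_t)(az >> 8), (uint8_t)az
        };
        send_to_host(packet, sizeof packet);
    }
    return 0;
}
```

The point of the sketch is the division of labor it implies: raw samples are averaged and packed locally on the node, so that only compact, pre-processed records travel across the thin connectors in the skin.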
All of these components and sensor packages are integrated by means of a highly miniaturized embedding method and protected from the cold, wet environment by a robust case. Despite this, the footprint of the embedded modules is remarkably small at 23 x 10.5 x 1.6 mm³, fitting a complete sensor package and microcontroller into something the size of a common door key. The case itself doubles as a conductor, creating the mechanical and electrical contact with the sensor skin. The researchers refined their original vision of the product into a modular two-part design: The embedding module combines the individual electronic components on a millimeter scale for exceptional integration, while the module case acts as the mechanical interface with the skin and makes the system as robust as its intended use demands. The coupling between module and case relies on a seemingly simple clipping action: Small pins on the connector surface of the skin and tiny hooks on the sensor module snap together to form an interface that can easily be attached and detached again. The resulting system is modular by design, allowing easy reconfiguration.
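One way to picture what this modularity buys on the control side is a small C sketch of how a host controller might keep track of whichever module is currently clipped into a given skin connector. The types, IDs, and fields below are hypothetical and serve only to illustrate the design idea that swapping a module is a local change rather than a system-wide one.

```c
/* Illustrative sketch only: a hypothetical descriptor for clip-on skin modules.
 * Module types, connector IDs, and fields are invented for this example. */
#include <stdint.h>
#include <stdio.h>

typedef enum {
    MODULE_NONE = 0,
    MODULE_PRESSURE,
    MODULE_ACCEL,
    MODULE_FLOW
} module_type_t;

typedef struct {
    uint8_t       connector_id;    /* which skin connector the module is clipped into */
    module_type_t type;            /* what the clipped-in module measures              */
    uint16_t      sample_rate_hz;  /* configured reporting rate                        */
} skin_module_t;

/* Swapping a module touches only the affected slot's description;
 * the rest of the configuration stays untouched. */
static void reconfigure(skin_module_t *slot, module_type_t new_type)
{
    slot->type = new_type;
}

int main(void)
{
    skin_module_t fin_edge = { .connector_id = 3,
                               .type = MODULE_PRESSURE,
                               .sample_rate_hz = 50 };
    printf("connector %u: type %d at %u Hz\n",
           fin_edge.connector_id, fin_edge.type, fin_edge.sample_rate_hz);

    /* After unclipping the pressure module and clipping in a flow sensor: */
    reconfigure(&fin_edge, MODULE_FLOW);
    printf("connector %u: type %d at %u Hz\n",
           fin_edge.connector_id, fin_edge.type, fin_edge.sample_rate_hz);
    return 0;
}
```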
The researchers at Fraunhofer IZM will now subject their robotic manta ray to a series of tests with their project partners. The results and findings from the “Bionic RoboSkin” project will likely be of use for many other projects and contribute to pressure-neutral and more reliable packaging solutions for flexible, mobile, and smarter service robots.
The “Bionic RoboSkin” project is supported through VDI/VDE-IT by the Federal Ministry of Education and Research (funding code 16ES0914) as part of the federal government’s 2016 to 2020 research and innovation campaign “Microelectronics from Germany – Driver of Innovation for the Digital Economy”.