The praying mantis is a renowned predator, known for its calm, motionless stance while waiting for prey to draw near, followed by a swift and powerful strike. Mantises track moving prey with rapid head turns that keep the target in the foveal region of the eyes, where the highest-resolution images are formed. Although much is known about insect motion and its control, exactly how mantises use visual information to produce these sharp head turns remains unclear. What makes this insect especially interesting is that numerous studies have found that once a turn is initiated, it cannot be stopped, meaning that the mantis must predict the future position of its prey. So, what information does the brain send to the nerve clusters controlling locomotion to produce these precise movements?

This is the question that an interdisciplinary team of researchers set out to answer with their mantis-inspired robot, MantisBot. Drawing on previous studies of praying mantis head movements and insect locomotion, the team – led by Nicholas Szczecinski from Case Western Reserve University in Ohio, USA – hypothesised that the commands sent to the nerve centres that control locomotion are relatively simple. To test this, they designed a simple neural network with commands inspired by those biological results and incorporated it into MantisBot, a 13.3:1 scale robot model of a praying mantis. Based on information provided by the ‘eyes’ – five solar cells arranged around the front of the head – a bioinspired brain-like controller gave simple instructions to the unit controlling the robot's movements: whether to move or stand still, and in which direction to orient the body.
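To make the idea concrete, here is a minimal sketch in Python of how a prey bearing might be read out of the five solar cells. The cell mounting angles and function names are illustrative assumptions, not details from the study; MantisBot's actual controller is a network of simulated neurons rather than a script like this.

```python
# Minimal sketch: estimate prey direction from five photocell voltages.
# CELL_ANGLES_DEG and all names here are illustrative assumptions.

# Hypothetical mounting angles (deg) of the five cells around the head,
# measured from the centre of vision (0 = straight ahead).
CELL_ANGLES_DEG = [-60.0, -30.0, 0.0, 30.0, 60.0]

def estimate_prey_bearing(voltages):
    """Voltage-weighted average of the cell angles.

    Brighter cells pull the estimate toward their mounting angle, so the
    difference in voltage across the array encodes the prey's bearing.
    Returns None when no cell detects light (no prey signal).
    """
    total = sum(voltages)
    if total == 0:
        return None
    return sum(a * v for a, v in zip(CELL_ANGLES_DEG, voltages)) / total

# Example: light falling mostly on the right-hand cells yields a
# positive (rightward) bearing estimate.
print(estimate_prey_bearing([0.1, 0.2, 0.5, 0.9, 0.3]))
```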

To test how well the proposed network directed the robot, the team shone a 1600 lumen LED at it to simulate prey; the difference in voltage across the solar cells told the robot which direction the prey was in. If the prey moved outside a 20 deg cone from the centre of the robot's vision, the controller told it to turn its head, and if the prey moved further still, beyond 30 deg, the robot recruited its body and legs as well, just as real praying mantises do. But could it track prey accurately?
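The two-part command itself could hardly be simpler; a sketch of the threshold logic described above might look like the following. The 20 deg and 30 deg thresholds come from the text, while the command names are hypothetical stand-ins for the robot's descending signals.

```python
# Sketch of the simple descending command: inside a 20 deg cone do
# nothing, between 20 and 30 deg turn only the head, beyond 30 deg
# recruit the legs and body as well. Command names are illustrative.

def descending_command(bearing_deg):
    """Reduce a prey bearing (deg from centre of vision) to the two-part
    command: whether to move, and in which direction to orient."""
    direction = "left" if bearing_deg < 0 else "right"
    error = abs(bearing_deg)
    if error <= 20.0:
        return ("stand_still", None)               # prey already centred
    elif error <= 30.0:
        return ("turn_head", direction)            # head movement alone
    else:
        return ("turn_body_and_head", direction)   # legs and body join in

print(descending_command(-35.0))  # ('turn_body_and_head', 'left')
```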

MantisBot's control network was a success, allowing the robot to track prey to within 30 deg of the centre of its vision, even while moving its legs and body. Switching direction mid-step was no problem either, and the robot successfully used its legs to help rotate its body while standing still. The robot's high tracking accuracy demonstrates that simple commands – which direction to turn, and whether or not to move – can produce tracking behaviour comparable to that of the praying mantis, providing plausible evidence that the information sent to the animal's own locomotion control centres could be just as simple.

Bioinspired robotic models are not just precursors to more advanced robots; studies like this demonstrate that they can also supply meaningful feedback, helping us to understand the underlying biology and to answer questions that current biological experiments cannot address. Continuing such interdisciplinary work is therefore essential for both fields to progress, and could provide answers to many biological conundrums.

Szczecinski, N. S., Getsy, A. P., Martin, J. P., Ritzmann, R. E. and Quinn, R. D. (2017). MantisBot is a robotic model of visually guided motion in the praying mantis. Arthropod Struct. Dev.