Visual perception biomimicry and the future of robotics
(This article was edited on 20 August 2020)
Our future robotic overlords will need better visual systems and navigational skills than today's robots if they are to achieve autonomy. Their foreseen rise to power demands it!
But seriously, if we want to improve robot mobility, we need to study the visual systems of insects.1 Despite their small size, insects process visual information to perform extraordinary feats of navigation.1 They have a wide field of view with low image resolution, yet hyperacuity (the ability to resolve detail finer than the spacing of the eye's photoreceptors would suggest) allows for remarkably precise movement.2 The sky itself serves as a compass: insects navigate by sunlight, patterns of polarised light, the moon, and even starlight!1,3 Indeed, insects provide biological inspiration for the engineering of autonomous robots.1,4 And implementing visual perception is one of the key challenges in robotics.4
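To make the polarised-light compass idea concrete, here is a minimal, illustrative Python sketch (my own, not taken from any of the cited studies) of how three photodetectors behind polarising filters at 0°, 60° and 120° could recover the sky's e-vector angle via Malus's law. The filter angles, intensities, and degree of polarisation are all assumed values chosen for the example.

```python
import math

def evector_angle(i0, i60, i120):
    """Recover the sky e-vector angle (degrees, 0-180) from three
    intensity readings taken through polarising filters oriented at
    0, 60 and 120 degrees, using a Malus's-law intensity model."""
    angles = [0.0, 60.0, 120.0]
    readings = [i0, i60, i120]
    # Stokes-like components: each reading is A + B*cos(2*(phi - theta_k)),
    # so projecting onto cos(2*theta) and sin(2*theta) isolates 2*phi.
    q = sum(i * math.cos(2 * math.radians(a)) for i, a in zip(readings, angles))
    u = sum(i * math.sin(2 * math.radians(a)) for i, a in zip(readings, angles))
    # Like the insect e-vector compass, this has a 180-degree ambiguity.
    return (0.5 * math.degrees(math.atan2(u, q))) % 180.0

# Simulate partially polarised skylight with its e-vector at 30 degrees.
true_phi = 30.0
readings = [1.0 + 0.6 * math.cos(2 * math.radians(true_phi - a))
            for a in (0.0, 60.0, 120.0)]
print(round(evector_angle(*readings), 1))  # recovers 30.0
```

Real polarisation compasses for robots use the same trick with arrays of polarisation-sensitive photodiodes; the 180° ambiguity is typically resolved with a second cue, such as the sun's position.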
Recently, Anna Honkanen and Andrea Adden, of the Lund Vision Research group at Lund University, presented a virtual seminar for Macquarie University. Their research on the visual and navigational systems of the American cockroach and the Australian Bogong moth, respectively, provides further insight into dim-light navigation and informs biomimicry.3,5
What is biomimicry?
“the imitation of natural biological designs or processes
in engineering or invention” 6
Arguably the best-known application of biomimicry is Velcro.7 George de Mestral, a Swiss engineer, observed the way burrs stuck to his dog’s fur and in 1955 patented the hook-and-loop fastener based on this natural process of seed dispersal (Fig A).7
A more recent example is my personal favourite. Geckos can adhere to and climb seemingly smooth surfaces thanks to atomic-level forces (known as Van der Waals forces) acting between the gecko’s toe hairs (setae) and minute imperfections in the surface (Fig B).8 This biological observation has inspired the invention of self-cleaning surfaces that mimic the same natural processes!8
One important challenge for engineers creating small, autonomous robots is miniaturising the visual systems needed for superior navigation.4 I include some examples at the end of this article that may spark your imagination of future possibilities.
Insect visual systems research can help with this engineering challenge. Scientists look to nature for solutions!
Night Vision and the American Cockroach
Anna Honkanen researches the visual system of the American cockroach (Fig C).5 She secured cockroaches in a virtual-reality experimental setup (Fig D) and measured how they orient themselves to whole-field motion (the optomotor response).5 Intracellular recordings from the eye were then compared with these behavioural thresholds at different light levels.5 Basically, the cockroaches had tiny electrodes stuck in them and their electrical impulses analysed. Sure, we all know cockroaches run faster in brighter light, but this study found that starlight-level illumination was sufficient for navigation.5 The cockroach has extraordinary night-vision!
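A toy simulation (my own illustration, not the authors' model) can show why vision at starlight levels is plausible at all: a single photoreceptor catches photons only sporadically, but summing Poisson photon counts across a pool of receptors separates a dim stimulus from darkness far more reliably. All rates and pool sizes below are assumed values for the demonstration.

```python
import math
import random

def poisson(lam, rng):
    """Draw a Poisson-distributed photon count (Knuth's algorithm)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def detection_rate(rate_dark, rate_stim, n_receptors, trials=500, seed=1):
    """Fraction of trials in which the photon count summed over a pool of
    receptors is higher under a dim stimulus than in darkness."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        dark = sum(poisson(rate_dark, rng) for _ in range(n_receptors))
        stim = sum(poisson(rate_stim, rng) for _ in range(n_receptors))
        if stim > dark:
            wins += 1
    return wins / trials

# One receptor catching ~0.1 photons per integration time barely tells a
# dim stimulus from darkness; pooling 1000 receptors makes the same dim
# signal almost perfectly detectable.
print(detection_rate(0.05, 0.1, n_receptors=1))
print(detection_rate(0.05, 0.1, n_receptors=1000))
```

The signal-to-noise ratio of the pooled count grows roughly with the square root of the pool size, which is the same logic behind spatial and temporal summation in nocturnal insect eyes.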
Navigational System of the Bogong Moth
The visual system is more than just the eyes: it includes neural circuitry, with a massive share of the brain’s resources allocated to perceiving the environment.4 Andrea Adden took intracellular recordings in the Bogong moth’s brain (Fig E) to research its navigational systems.3 This moth also has excellent night-vision, which it relies on during its annual nocturnal migration.3 The study examined the cues of moonlight, starlight, landmarks, and the geomagnetic field.3 The influence of the moon remains unknown, and the moths responded only weakly to the geomagnetic cue.3 What the study did show is that the Milky Way serves as the Bogong moth’s main compass, calibrated against local landmarks.3
TOP 5 Take-Away Messages
- Night-vision is exceptional in American cockroaches and Bogong moths
- Starlight is sufficient for navigation in these study species
- The visual system is more than the eyes—it includes neural circuitry
- Environment perception systems are a challenge for robotics
- Visual and navigational systems research assists biomimicry applications
Future of Robotics
Why do we need robots with better visual perception and navigational skills?
Some reasons include:
- “Precision Farming”
As the world’s population increases, there will be a greater need for more efficient agricultural practices that increase food resources without sacrificing sustainability.9,10 Small autonomous flying robots are ideally suited to this work but require superior environment-perception capabilities.9
- Search and Rescue
Miniature robots with good visual and navigational capabilities could become first responders, accessing dangerous or hard-to-reach places for search-and-rescue assessment.11
- Exploration of other planets
Improved robot manoeuvrability because of biomimicry research would assist in space exploration and deployment of scientific instruments on other planets.12
Biomimicry may not deliver on sustainability if research remains focused solely on imitating a handful of biological traits in a few organisms.7 We need complete visual-system research to understand perception capabilities and realise the potential for robotics.4 Through interdisciplinary research linking environmental and business interests, we can transform the technologies of the 21st century.7
1. Srinivasan MV. Visual control of navigation in insects and its relevance for robotics. Current Opinion in Neurobiology. 2011;21(4):535–43. doi:10.1016/j.conb.2011.05.020
2. Bogue R (a). Developments in biomimetic vision. Sensor Review. 2013;33(1):14–8. doi:10.1108/02602281311294306
3. Honkanen A, Adden A, da Silva Freitas J & Heinze S. The insect central complex and the neural basis of navigational strategies. The Journal of Experimental Biology. 2019;222:jeb188854. doi:10.1242/jeb.188854
4. Dudek P. Vision. In: Living Machines. Oxford University Press; 2018. 14pp. doi:10.1093/oso/9780199674923.003.0014
5. Honkanen A, Takalo J, Heimonen K, Vähäsöyrinki M & Weckström M. Cockroach optomotor responses below single photon level. The Journal of Experimental Biology. 2014;217(23):4262–8. doi:10.1242/jeb.112425
6. Merriam-Webster. (n.d.). Biomimicry. In Merriam-Webster.com dictionary. Retrieved April 28, 2020, from https://www.merriam-webster.com/dictionary/biomimicry
7. Ivanic K-Z, Tadic Z & Omazic M. Biomimicry – an overview. The Holistic Approach to Environment. 2015;5(1):19–36.
8. Xu Q, Zhang W, Dong C, Sreeprasad TS & Xia Z. Biomimetic self-cleaning surfaces: synthesis, mechanism and applications. Journal of the Royal Society, Interface. 2016;13(122). doi:10.1098/rsif.2016.0300
9. Kumar V. The future of flying robots. Video recording, TED, 2015, viewed 28 April 2020. https://www.youtube.com/watch?v=ge3-1hOm1s
10. Bogue R (b). Can robots help to feed the world? Industrial Robot: An International Journal. 2013;40(1):4–9.
11. McQuate S. ‘The first wireless flying robotic insect takes off.’ University of Washington News, Washington, 15 May 2018. Available at http://www.washington.edu/news/2018/05/15/robofly/
12. Richter L, Schilling K, Bernasconi MC & Garcia-Marirrodriga C. Mobile micro-robots for scientific instrument deployment on planets. Robotics and Autonomous Systems. 1998;23:107–15.