{"id":262,"date":"2024-08-20T08:18:34","date_gmt":"2024-08-20T08:18:34","guid":{"rendered":"https:\/\/neuromorphicrobotics.com\/?p=262"},"modified":"2025-06-30T09:16:46","modified_gmt":"2025-06-30T09:16:46","slug":"how-to-implement-fully-neuromorphic-vision-and-control-for-autonomous-drone-flight","status":"publish","type":"post","link":"https:\/\/braininspiredrobotics.com\/?p=262","title":{"rendered":"How to implement fully neuromorphic vision and control for autonomous drone flight?"},"content":{"rendered":"<p style=\"text-align: justify;\">F. Paredes-Vall\u00e9s, J. J. Hagenaars, J. Dupeyroux, S. Stroobants, Y. Xu, G. C. H. E. de Croon. <a href=\"https:\/\/www.science.org\/doi\/10.1126\/scirobotics.adi0591\"><strong>Fully neuromorphic vision and control for autonomous drone flight<\/strong><\/a>. Science Robotics, 9, eadi0591 (2024). DOI:10.1126\/scirobotics.adi0591<\/p>\n<p style=\"text-align: justify;\">Editor\u2019s summary<br \/>\n&#8220;Despite the ability of visual processing enabled by artificial neural networks, <strong><span style=\"color: #ff0000;\">the associated hardware and large power consumption limit deployment on small flying drones<\/span><\/strong>. Neuromorphic hardware offers a promising alternative, but the accompanying spiking neural networks are difficult to train, and the current hardware only supports a limited number of neurons. Paredes-Vall\u00e9s et al. now <strong><span style=\"color: #ff0000;\">present a neuromorphic pipeline to control drone flight<\/span><\/strong>. They trained <strong><span style=\"color: #ff0000;\">a five-layer spiking neural network to process the raw inputs from an event camera<\/span><\/strong>. The network first <strong><span style=\"color: #ff0000;\">estimated ego-motion and subsequently determined low-level control commands<\/span><\/strong>. 
Real-world experiments demonstrated that the drone could control its ego-motion to land, hover, and maneuver sideways, with <strong><span style=\"color: #ff0000;\">minimal power consumption<\/span><\/strong>.&#8221;\u2014Melisa Yashinski<\/p>\n<p style=\"text-align: justify;\">Abstract<br \/>\n&#8220;<strong><span style=\"color: #ff0000;\">Biological sensing and processing is asynchronous and sparse, leading to low-latency and energy-efficient perception and action<\/span><\/strong>. In robotics, neuromorphic hardware for event-based vision and spiking neural networks promises to exhibit similar characteristics. However, <strong><span style=\"color: #ff0000;\">robotic implementations have been limited to basic tasks with low-dimensional sensory inputs and motor actions because of the restricted network size in current embedded neuromorphic processors and the difficulties of training spiking neural networks<\/span><\/strong>. Here, <strong><span style=\"color: #ff0000;\">we present a fully neuromorphic vision-to-control pipeline for controlling a flying drone<\/span><\/strong>. Specifically, we trained a <strong><span style=\"color: #ff0000;\">spiking neural network<\/span><\/strong> that accepts raw <strong><span style=\"color: #ff0000;\">event-based camera<\/span><\/strong> data and outputs <strong><span style=\"color: #ff0000;\">low-level control actions<\/span><\/strong> for performing autonomous vision-based flight. <strong><span style=\"color: #ff0000;\">The vision part of the network<\/span><\/strong>, consisting of five layers and 28,800 neurons, maps incoming raw events to ego-motion estimates and was trained with self-supervised learning on real event data. <strong><span style=\"color: #ff0000;\">The control part<\/span><\/strong> consists of a single decoding layer and was learned with an evolutionary algorithm in a drone simulator. Robotic experiments show a successful sim-to-real transfer of the fully learned neuromorphic pipeline. 
The drone could accurately control its ego-motion, allowing for hovering, landing, and maneuvering sideways\u2014even while yawing at the same time. <strong><span style=\"color: #ff0000;\">The neuromorphic pipeline runs on board on Intel\u2019s Loihi neuromorphic processor with an execution frequency of 200 hertz, consuming 0.94 watt of idle power and a mere additional 7 to 12 milliwatts when running the network.<\/span><\/strong> These results illustrate the <strong><span style=\"color: #ff0000;\">potential of neuromorphic sensing and processing for enabling insect-sized intelligent robots<\/span><\/strong>.&#8221;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>F. Paredes-Vall\u00e9s, J. J. Hagenaars, J. Dupeyroux, S. Stroobants, Y. Xu, G. C. H. E. de Croon. Fully neuromorphic vision and control for autonomous drone flight. Science Robotics, 9, eadi0591 (2024). 
DOI:10.1126\/scirobotics.adi0591 Editor\u2019s summary &#8220;Despite the ability of visual processing enabled by artificial neural networks, the associated hardware and large power consumption limit deployment on small [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[74,8,7],"tags":[84,60,68,61,11,30],"class_list":["post-262","post","type-post","status-publish","format-standard","hentry","category-brain-inspired-robotics","category-neuromorphic-application","category-neuromorphic-robotics","tag-brain-inspired-robotics","tag-drone","tag-intelligent-robots","tag-neuromorphic-control","tag-neuromorphic-robotics","tag-neuromorphic-vision"],"_links":{"self":[{"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=\/wp\/v2\/posts\/262","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=262"}],"version-history":[{"count":4,"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=\/wp\/v2\/posts\/262\/revisions"}],"predecessor-version":[{"id":383,"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=\/wp\/v2\/posts\/262\/revisions\/383"}],"wp:attachment":[{"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=262"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=262"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/braininspiredrobotics.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=262"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}