Inspired by Insects: New Anti-collision Technology Could Help Create Safer Driverless Cars
Swarming insects are the inspiration behind a global research project that aims to create a pioneering new collision avoidance system to enhance the safety of driverless cars.
Funded by a €1.8 million grant from the European Union’s Horizon 2020 research and innovation programme, the project will develop a miniature, trustworthy collision detection sensor system that could drastically improve the safety of autonomous vehicles.
Although extensive testing of these vehicles has already begun both on and off road, their safety around other vehicles and unexpected hazards remains a key stumbling block in their development. The ULTRACEPT project – which stands for Ultra-layered perception with brain-inspired information processing for vehicle collision avoidance and is led by the University of Lincoln, UK – will develop a new microchip for driverless vehicles, aiming to make them safe enough to serve human society.
Developers have found that the current approaches for vehicle collision detection are largely ineffective in terms of reliability, cost, energy consumption and size: radar is too sensitive to metallic material, GPS-based methods face difficulties in cities with high buildings, vehicle-to-vehicle communication cannot detect pedestrians or any unconnected objects, and normal vision sensors cannot cope with fog, rain or dim light conditions at night. The ULTRACEPT researchers hope to develop a system that will overcome all of these issues.
The new ULTRACEPT sensor will be inspired by the rapid reactions of insects, incorporating near-range collision detection technology, long-range hazard perception, and thermal-based collision detection tools. This will ensure that it works day and night, and can quickly adapt to unexpected hazards and different conditions – for example sudden weather changes or driving in and out of tunnels.
This means that the robust, low-cost, and energy-efficient collision detection and avoidance system will offer a capability which is currently beyond the autonomous vehicles in development. The project brings together experts from universities in the UK, Germany, China, Japan, Malaysia and South America.
Professor Shigang Yue, Professor of Computer Science at the University of Lincoln, is leading the ULTRACEPT project. He said: “Autonomous vehicles, although still in the early stages of development, have demonstrated huge potential for shaping our future lifestyles – from taking children to school and driving commuters to work, to delivering packages to households and distributing goods to warehouses, shops or remote areas. But to be functional on a daily basis there is one critical issue to solve: trustworthy collision detection.
“Biology provides a rich source of inspiration for artificial visual systems for collision detection and avoidance. For example, locusts, with a compact visual brain, can fly for hundreds of miles in dense swarms free of collision; the praying mantis can monitor tiny moving prey with the help of specialised visual neurons; and nocturnal insects successfully forage in the forest at night without collision.
“These naturally evolved vision systems provide ideal models to develop an artificial system for collision detection and avoidance, and we hope that in the future, each vehicle, with or without a driver, will be well equipped with an innovative sensor to navigate as effectively as animals do.”
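The locust-style looming detection mentioned above can be illustrated with a toy model. The sketch below is not the ULTRACEPT design (which is unpublished here); it is a minimal, hypothetical LGMD-like detector in which luminance change excites the output while a local average of that change inhibits it, so an expanding edge field produces a stronger response than a static scene. All function names and parameter values are illustrative.

```python
import numpy as np

def lgmd_response(prev_frame, curr_frame, inhibition_weight=0.4):
    """One step of a simplified, LGMD-inspired looming detector.

    Excitation is the luminance change between consecutive frames;
    lateral inhibition is a local (3x3) average of that excitation.
    Parameters are illustrative, not taken from the ULTRACEPT project.
    """
    excitation = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    # Crude 3x3 box blur as lateral inhibition (no external dependencies).
    padded = np.pad(excitation, 1, mode="edge")
    inhibition = sum(
        padded[dy:dy + excitation.shape[0], dx:dx + excitation.shape[1]]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    summed = np.maximum(excitation - inhibition_weight * inhibition, 0).sum()
    # Squash to (0, 1): higher values suggest an approaching object.
    return 1.0 / (1.0 + np.exp(-summed / excitation.size))

def square_frame(size, half_width):
    """Dark square of the given half-width centred on a bright field."""
    frame = np.ones((size, size))
    c = size // 2
    frame[c - half_width:c + half_width, c - half_width:c + half_width] = 0.0
    return frame

# An object that grows between frames (looming) excites the detector
# more than one whose apparent size stays the same.
looming = lgmd_response(square_frame(64, 8), square_frame(64, 16))
static = lgmd_response(square_frame(64, 8), square_frame(64, 8))
print(looming > static)
```

Real LGMD models add temporal dynamics and directional suppression; this stripped-down version only shows the excitation-minus-inhibition structure that makes expanding edges stand out.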
The project will bring together a world-class research team whose specialist expertise spans hardware and software systems, robotics, invertebrate visual neuroscience, invertebrate vision modelling, mixed-signal chip design, robotics platforms, and brain-inspired pattern recognition.
It builds on Professor Yue’s expertise in developing autonomous navigation of mobile robots based on the locust’s unique visual system, as well as the work carried out as part of the previous ‘Spatial-Temporal Information Processing for Collision Detection in Dynamic Environments’ (STEP2DYNA) research project, also led by the University of Lincoln.
For STEP2DYNA, the University of Lincoln worked with Hamburg University and Newcastle University, plus partners from the University of Buenos Aires in South America, Kyushu University in Japan, and the Chinese institutions Huazhong University of Science and Technology, Xi’an Jiaotong University and Tsinghua University.
Joining the consortium for ULTRACEPT are the University of Münster, Universiti Putra Malaysia, National University Corporation Tokyo University of Agriculture and Technology, the Institute of Automation of the Chinese Academy of Sciences, Lingnan Normal University, Northwestern Polytechnic University and Guizhou University, plus the SMEs Visomorphic Technology in the UK and German-based Dino Robotics.
Professor Yue added: “Most autonomous vehicles are still legally restricted to their testing arenas precisely because developers can’t be sure of their safety, and accidents involving Tesla and Uber cars with autonomous driving functionality engaged have only highlighted this issue further. Collision detection and avoidance is so important for vehicles now and in the future, yet there is no acceptable product currently available on the market to specifically meet this need – that is exactly what we hope to develop.”
The project will also involve building up a comprehensive video database of hazardous driving scenes so that the systems developed can be rigorously tested before being taken out on the road. As well as being used within autonomous vehicles, ULTRACEPT’s integrated computer vision system will be applicable to a number of other industries, including robotics, video game development and healthcare.