Bees are extremely intelligent insects with the ability to communicate with the rest of their swarm to orchestrate their collective movement in complex environments. CUHK’s engineering team has recently built an artificial intelligence (AI) navigation system that can make millions of microrobots behave like a bee swarm, autonomously reconfiguring their motion and distribution according to environmental changes, such as going around obstacles inside a human body. The findings have been reported in Nature Machine Intelligence, bringing clinical applications of microrobots a step closer.
Microrobots have been proposed as a medium for targeted drug delivery inside the body, in particular in narrow, confined spaces or hard-to-reach tissues. Because each individual microrobot has very limited capacity and functionality, thousands or even millions must be aggregated to perform such tasks, and the swarm is usually controlled by an external magnet or electromagnet. However, in the complicated and changing environments of the human body, such as fluids with contrasting properties, winding tubes and branching vessels, manually manipulating the microrobots is extremely difficult, and the chance of task failure is correspondingly high.
Professor Li Zhang from the Department of Mechanical and Automation Engineering said, ‘Collective movement in schools of fish and flocks of birds frequently involves switching shapes and structures to adapt to different situations and environments, such as avoiding terrain or obstacles, or attacking an enemy. We have yet to find a way to make microrobots themselves intelligent, but we can use an AI control system to externally manipulate their collective motion, making sure they do not get lost or stuck inside our bodies.’
Using deep learning algorithms trained on years of research data on microrobot navigation, Professor Zhang’s team, in collaboration with Professor Qi Dou’s group from CUHK’s Department of Computer Science and Engineering, has developed an AI system that automates the navigation of these swarms. The system takes images from tools such as ultrasound and X-ray fluoroscopy to identify obstacles inside the body and plan, in real time, the best possible route for delivering the microrobots. It also controls the magnets or electromagnets that steer the swarm and change its formation, increasing the chance of reaching its destination.
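The pipeline described above has two stages: identify obstacles from the images, then plan a collision-free route for the swarm. The following toy sketch illustrates only the planning step; it is not the team’s deep-learning planner. It uses a plain breadth-first search on a hypothetical 2D occupancy grid, and the names `plan_route` and `vessel` are illustrative assumptions.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search on a 2D occupancy grid.

    grid: list of rows; 0 = free space, 1 = obstacle (in practice the
    obstacle map would come from segmented ultrasound or X-ray images).
    Returns a list of (row, col) waypoints from start to goal, or None
    if no collision-free route exists.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}   # remembers each cell's predecessor
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Reconstruct the route by walking back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and nxt not in came_from):
                came_from[nxt] = cell
                queue.append(nxt)
    return None

# A toy vessel map: a wall of obstacles forces a detour.
vessel = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
]
route = plan_route(vessel, start=(0, 0), goal=(2, 0))
```

In a real system the planner would be re-run as new images arrive, so the route adapts when the environment changes; the resulting waypoints would then drive the external magnetic actuation rather than the swarm directly.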
The full text of the research paper can be found at:
Autonomous environment-adaptive microrobot swarm navigation enabled by deep learning-based real-time distribution planning https://www.nature.com/articles/s42256-022-00482-8