Researchers at ETH Zürich and the University of Bologna have recently created PULP Dronet, a 27-gram nano-size unmanned aerial vehicle (UAV) with a deep learning-based visual navigation engine.
Their mini-drone, presented in a paper pre-published on arXiv, runs an end-to-end, closed-loop visual pipeline for autonomous navigation entirely onboard, powered by a state-of-the-art deep learning algorithm.
“For six years now, ETH Zürich and the University of Bologna have been fully engaged in a joint-effort project: the parallel ultra-low power platform (PULP),” Daniele Palossi, Francesco Conti and Prof. Luca Benini, the three researchers who carried out the study at a lab led by Prof. Benini, told TechXplore via email.
“Our mission is to develop an open source, highly scalable hardware and software platform to enable energy-efficient computation where the power envelope is only a few milliwatts, such as sensor nodes for the Internet of Things and miniature robots such as nano-drones of a few tens of grams in weight.”
In large and average-size drones, the available power budget and payload enable the use of powerful, high-end computing devices, such as those developed by Intel, Nvidia and Qualcomm.
These devices are not a feasible option for miniature robots, which are limited by their size and consequent power restrictions.
To overcome these limitations, the team decided to take inspiration from nature, specifically from insects.
“In nature, tiny flying animals such as insects can perform very complex tasks while consuming only a tiny amount of energy in sensing the environment and thinking,” Palossi, Conti and Benini explained. “We wanted to exploit our energy-efficient computing technology to essentially replicate this feature.”
To replicate the energy-saving mechanisms observed in insects, the researchers initially worked on integrating high-level artificial intelligence in the ultra-tiny power envelope of a nano-drone.
This proved quite challenging, as they had to meet the drone's tight energy constraints and stringent real-time computational requirements. Their key goal was to achieve very high performance with very little power.
“Our visual navigation engine is composed of a hardware and a software soul,” Palossi, Conti and Benini said.
“The former is embodied by the parallel, ultra-low power paradigm, and the latter by the DroNet Convolutional Neural Network (CNN), previously developed by the Robotics and Perception Group at the University of Zürich for ‘resource-unconstrained’ big drones, which we adapted to meet our energy and performance requirements.”
The navigation system takes a camera frame and processes it with a state-of-the-art CNN. Subsequently, it decides how to correct the drone’s attitude so that it is positioned in the center of the current scene.
The same CNN also identifies obstacles, stopping the drone if it senses an imminent threat.
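In rough pseudocode, the closed loop looks like the sketch below. This is a Python illustration of the idea only, not the authors' implementation (which runs entirely on the drone's ultra-low-power processor); every function name and the braking threshold here are hypothetical stand-ins.

```python
import random

# Hypothetical stand-ins for the camera and flight-controller interfaces;
# nothing here is the actual PULP Dronet API.
def get_frame():
    """Grab one grey-scale camera frame (dummy data for this sketch)."""
    return [[random.random() for _ in range(200)] for _ in range(200)]

def cnn_inference(frame):
    """One forward pass of the navigation CNN (dummy outputs here)."""
    return random.random(), random.uniform(-0.5, 0.5)  # (p_collision, steering)

def send_setpoint(velocity, yaw_rate):
    """Forward the new setpoint to the low-level flight controller."""
    print(f"v = {velocity:.2f} m/s, yaw rate = {yaw_rate:.2f} rad/s")

V_MAX = 1.5            # top speed used in the field tests (m/s)
STOP_THRESHOLD = 0.7   # collision probability that triggers braking (assumed)

def navigation_loop(steps=10):
    for _ in range(steps):
        p_collision, steering = cnn_inference(get_frame())
        if p_collision > STOP_THRESHOLD:
            send_setpoint(0.0, 0.0)  # imminent threat: brake
        else:
            # slow down as collision risk grows; steer toward the scene centre
            send_setpoint((1.0 - p_collision) * V_MAX, steering)

navigation_loop()
```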
“Basically, our PULP Dronet can follow a street lane (or something that resembles it, e.g. a corridor), avoiding collisions and braking in case of unexpected obstacles,” the researchers said.
“The real leap provided by our system compared to past pocket-sized flying robots is that all operations necessary for autonomous navigation are executed directly onboard, without any need for a human operator or ad-hoc infrastructure (e.g. external cameras or signals) and, in particular, without any remote base station used for the computation (e.g., a remote laptop).”
In a series of field experiments, the researchers demonstrated that their system is highly responsive and can prevent collisions with unexpected dynamic obstacles up to a flight speed of 1.5 m/s.
They also found that their visual navigation engine is capable of fully autonomous indoor navigation along a previously unseen 113 m path.
The study carried out by Palossi and his colleagues introduces an effective method that integrates an unprecedented level of intelligence in devices with very strict power constraints.
This is in itself quite impressive, as enabling autonomous navigation in a pocket-size drone is extremely challenging and has rarely been achieved before.
“In contrast to a traditional embedded edge node, here, we are constrained not only by the available energy and power budget to perform the calculation, but we are also subject to a performance constraint,” the researchers explained.
“In other words, if the CNN ran too slowly, the drone would not be able to react in time to prevent a collision or to turn at the right moment.”
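To put that constraint in numbers: at the 1.5 m/s top speed used in the field tests, the drone covers 25 cm between consecutive frames when inference runs at 6 frames per second, so every extra millisecond of latency eats directly into the distance left for braking.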
The tiny drone developed by Palossi and his colleagues could have numerous immediate applications.
For instance, a swarm of PULP-Dronets could help to inspect collapsed buildings after an earthquake, reaching places that are inaccessible to human rescuers far more quickly and without putting operators’ lives at risk.
“Every scenario where people would benefit from a small, agile, and intelligent computational node is now closer, spanning from animal protection to elderly/child assistance, inspection of crops and vineyards, exploration of dangerous areas, rescue missions and many more,” the researchers said. “We hope our research will improve the quality of life of everyone.”
According to Palossi and his colleagues, their recent study is merely a first step towards enabling truly ‘biological-level’ onboard intelligence and there are still several challenges to overcome.
In their future work, they plan to address some of these challenges by improving the reliability and intelligence of the onboard navigation engine, targeting new sensors, more sophisticated capabilities and better performance per watt.
The researchers have publicly released all of their code, datasets and trained networks, which could inspire other research teams to develop similar systems based on their technology.
“In the long run, our goal is to achieve results similar to what we presented here on a pico-size flying robot (a few grams in weight, with the dimension of a dragonfly),” the researchers added.
“We believe that creating a strong and solid community of researchers and enthusiasts hinged on our vision will be fundamental to reach this ultimate goal. For this reason, we made all our code and hardware designs available as open-source for everyone.”
From the official presentation:
Hi everyone, here at the Integrated Systems Laboratory of ETH Zürich, we have been working on an exciting project: PULP-DroNet.
Our vision is to enable artificial intelligence-based autonomous navigation on small size flying robots, like the Crazyflie 2.0 (CF) nano-drone.
In this post, we will give you the basic ideas behind making the CF fly fully autonomously, relying only on onboard computational resources: no human operator, no ad-hoc external signals, and no remote base station!
Our prototype can follow a street or a corridor and at the same time avoid collisions with unexpected obstacles even when flying at high speed.
PULP-DroNet is based on the Parallel Ultra Low Power (PULP) project envisioned by ETH Zürich and the University of Bologna.
In the PULP project, we aim to develop an open-source, scalable hardware and software platform to enable energy-efficient complex computation where the available power envelope is only a few milliwatts, as on advanced Internet-of-Things nodes, smart sensors — and of course, nano-UAVs. In particular, we address the computational demands of applications that require flexible and advanced processing of data streams generated by sensors such as cameras, which is beyond the capabilities of typical microcontrollers. The PULP project has its roots in the RISC-V instruction set architecture, an innovative open-source architecture born in academia and research as an alternative to ARM.
The first step in making the CF autonomous was the design and development of what we called the PULP-Shield, a small form factor pluggable deck for the CF featuring two off-chip memories (Flash and RAM), a QVGA ultra-low-power grey-scale camera and the PULP GAP8 System-on-Chip (SoC). The GAP8, produced by GreenWaves Technologies, is the first commercially available embodiment of our PULP vision. This SoC features nine general-purpose RISC-V-based cores organised into an on-chip microcontroller (1 core, called Fabric Ctrl) and a cluster accelerator of 8 cores, with 64 kB of local L1 memory accessible at high bandwidth from the cluster cores. The SoC also hosts 512 kB of L2 memory.
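To illustrate why the eight-core cluster matters, here is a minimal sketch (in Python for readability; the real code targets the GAP8 directly) of the static work split such a cluster is built for, with each core taking a contiguous stripe of a layer's output rows:

```python
N_CORES = 8  # size of GAP8's cluster accelerator

def rows_for_core(core_id, n_rows):
    """Contiguous stripe of output rows assigned to one cluster core."""
    chunk = (n_rows + N_CORES - 1) // N_CORES  # ceiling division for balance
    start = core_id * chunk
    return range(start, min(start + chunk, n_rows))

# e.g. 100 output rows split over the cluster:
for core in range(N_CORES):
    rows = rows_for_core(core, 100)
    print(f"core {core}: rows {rows.start}-{rows.stop - 1}")
```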
Then, as the algorithmic heart of our autonomous navigation engine, we selected an advanced artificial intelligence algorithm based on DroNet, a Convolutional Neural Network (CNN) originally developed by our friends at the Robotics and Perception Group (RPG) of the University of Zürich.
To enable the execution of DroNet on our resource-constrained system, we developed a complete methodology to map computationally-intense deep neural networks on the PULP-Shield and the GAP8 SoC.
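To give a flavour of that mapping problem, here is a back-of-the-envelope sketch: a layer's working set must fit in the 64 kB L1, so feature maps are processed in tiles shuttled between L2 and L1. Only the L1 size below comes from the text; every other figure is illustrative.

```python
L1_BYTES = 64 * 1024  # GAP8 cluster L1 (from the text)

def working_set_bytes(h, w, c_in, c_out, k=3, elem=2):
    """Bytes for one tile: input (with halo), output and weights, 16-bit values."""
    inp = (h + k - 1) * (w + k - 1) * c_in * elem   # input tile incl. halo
    out = h * w * c_out * elem                      # output tile
    wgt = k * k * c_in * c_out * elem               # layer weights
    return inp + out + wgt

def largest_tile_height(w, c_in, c_out, k=3):
    """Tallest stripe of width w whose working set still fits in L1."""
    h = 1
    while working_set_bytes(h + 1, w, c_in, c_out, k) <= L1_BYTES:
        h += 1
    return h

# e.g. a 3x3 layer with 32 input and 32 output channels over 100-pixel-wide rows:
print(largest_tile_height(w=100, c_in=32, c_out=32))  # -> 2
```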
The network outputs two pieces of information, a probability of collision and a steering angle, which are translated into the dynamic quantities used to control the drone: forward velocity and angular yaw rate, respectively.
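As a hedged sketch of that translation, in the spirit of the low-pass smoothing described in the original DroNet work (the filter constants below are illustrative, not PULP-DroNet's actual values):

```python
V_MAX = 1.5   # m/s, the top speed reported in the experiments
ALPHA = 0.3   # velocity low-pass constant (illustrative)
BETA = 0.5    # yaw low-pass constant (illustrative)

class OutputTranslator:
    """Turn raw CNN outputs into smooth flight setpoints."""

    def __init__(self):
        self.velocity = 0.0
        self.yaw_rate = 0.0

    def update(self, p_collision, steering_angle):
        # forward velocity shrinks toward zero as collision risk grows
        target_v = (1.0 - p_collision) * V_MAX
        self.velocity = (1.0 - ALPHA) * self.velocity + ALPHA * target_v
        # the steering prediction becomes a smoothed yaw-rate command
        self.yaw_rate = (1.0 - BETA) * self.yaw_rate + BETA * steering_angle
        return self.velocity, self.yaw_rate

# e.g. an open corridor (low risk, slight right turn):
t = OutputTranslator()
print(t.update(p_collision=0.1, steering_angle=0.2))
```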
Therefore, our mission was to deploy all the required computation onboard our PULP-Shield mounted on the CF, enabling fully autonomous navigation.
To put the problem into perspective, in the original work by the RPG, the DroNet CNN enabled autonomous navigation of big-size drones (e.g., the Bebop Parrot).
In the original use case, computational power and memory were not a problem, thanks to the streaming of images to a remote base station, typically a laptop consuming 30-100 W or more.
So our mission required running a similar workload within 1/1000 of the original power.
To make this work, we combined fixed-point arithmetic (instead of “traditional” floating point), some minimal modifications to the original topology, and optimised memory and computation usage.
This allowed us to squeeze DroNet into the ultra-small power budget available onboard. Our most energy-efficient configuration delivers 6 frames per second (fps) within only 64 mW (including all the electronics on the PULP-Shield), and when we push the PULP platform to its limit, we achieve an impressive 18 fps within just 3.5% of the total CF’s power envelope — the original DroNet ran at 20 fps on an Intel i7.
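To give a feel for the fixed-point idea mentioned above, here is a minimal sketch; the Q-format and its 11 fractional bits are assumptions for illustration, not the quantization actually used on GAP8.

```python
Q = 11  # fractional bits of the illustrative Q-format (assumption)

def to_fixed(x, q=Q):
    """Encode a real number as an integer with q fractional bits."""
    return int(round(x * (1 << q)))

def to_float(a, q=Q):
    """Decode back to a real number."""
    return a / (1 << q)

def fixed_mul(a, b, q=Q):
    """Multiply two Q-format numbers; the product needs one rescale."""
    return (a * b) >> q

# 0.75 * 0.5 computed entirely with integer hardware:
print(to_float(fixed_mul(to_fixed(0.75), to_fixed(0.5))))  # ~0.375
```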
Do you want to check for yourself? All our hardware and software designs, including our code, schematics, datasets, and trained networks, have been released and made available to everyone as open source and open hardware on GitHub.
We look forward to other enthusiasts’ contributions, both in hardware enhancements and in software (e.g., smarter networks), to create a great community of people interested in working together on smart nano-drones.
Last but not least, the piece of information you have all been waiting for. Yes, soon Bitcraze will allow you to enjoy our PULP-Shield; actually, even better, you will play with its evolution! Stay tuned, as more information about the code-named “AI-deck” will be released in upcoming posts :-).
If you want to know more about our work:
- preprint arXiv article
- PULP-DroNet release on Github
- PULP Platform home page
- PULP Platform Youtube channel
Questions? Drop us an email (dpalossi at iis.ee.ethz.ch and fconti at iis.ee.ethz.ch)
More information: An open source and open hardware deep learning-powered visual navigation engine for autonomous nano-UAVs. arXiv:1905.04166 [cs.RO]. arxiv.org/abs/1905.04166
github.com/pulp-platform/pulp-dronet