Controlling Drones with the Human Brain

Panagiotis Artemiadis, professor of mechanical and aerospace engineering at Arizona State University, with two drones in his Human-Oriented Robotics and Control (HORC) Lab at the ASU Tempe campus. (Credit: ASU)

A researcher at Arizona State University wants to command machines with the human brain. Within ten years, Panagiotis Artemiadis, professor of mechanical and aerospace engineering at ASU, envisions swarms of brain-controlled drones playing a critical role in applications ranging from agriculture to search-and-rescue operations.

The essential technological component of mind-controlled UAVs, or unmanned aerial vehicles, is the interface: a combination of hardware and algorithms that maps a person’s brain activation to commands for a robotic system. Non-invasive electroencephalography (EEG) uses electrodes placed on the scalp to record electrical activity, measuring the voltage fluctuations that result from ionic currents within the brain’s neurons. The hardware records brain activation as the subject thinks about an intended motion for the machine, and the algorithms decode those activations into control commands for the robotic system.
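As a rough illustration of that pipeline, the sketch below is a hypothetical minimal decoder, not the HORC Lab’s actual algorithms: it takes one window of multi-channel EEG samples, extracts a simple band-power feature, and thresholds it into a discrete command. The sampling rate, band limits, and threshold are all placeholder values.

```python
import numpy as np

# Hypothetical sketch of an EEG-to-command pipeline: band-power
# feature extraction followed by a simple threshold rule. Real
# interfaces use far more sophisticated decoding.

FS = 256          # assumed sampling rate in Hz
WINDOW = FS * 2   # decode on 2-second windows of data

def band_power(window, fs, lo, hi):
    """Mean spectral power in the [lo, hi] Hz band, averaged over channels."""
    freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(window, axis=-1)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[..., mask].mean()

def decode(window, threshold=1.0):
    """Map one EEG window (channels x samples) to a discrete command.

    Imagining a movement suppresses power in the mu band (8-12 Hz)
    over the motor cortex, so low mu power is read here as 'the user
    is thinking about a motion'. The threshold would have to be
    calibrated per subject; the value here is arbitrary.
    """
    mu = band_power(window, FS, 8, 12)
    return "MOVE" if mu < threshold else "HOLD"

# Example: decode a window of simulated 8-channel EEG noise.
eeg_window = np.random.randn(8, WINDOW)
print(decode(eeg_window))
```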

At ASU’s Human-Oriented Robotics and Control (HORC) Lab, Artemiadis and his fellow researchers have evaluated the brain's capacity to direct non-human behaviors, including the flight of drones. In 2016, Artemiadis tested the ability of a single operator to control three UAVs, guiding each through a narrow loop. The subject used a hybrid brain-machine interface that combined EEG-recorded brain activity with input from a joystick. To control the multiple drones, the user had to think about shrinking or expanding the shape of the UAV grouping as a whole. With algorithms that extracted this collective-behavior information from the brain, the experiment demonstrated that a human can steer not just a single machine, but a swarm.
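To make that swarm-level command concrete, here is a minimal sketch of my own, not the team's code, in which a single decoded intent such as “expand” or “contract” rescales every drone’s position about the swarm’s centroid, so the operator commands the group as a whole rather than any individual vehicle. The scale factors are illustrative placeholders.

```python
import numpy as np

def apply_swarm_command(positions, command):
    """Rescale a swarm formation about its centroid.

    positions: (N, 2) array of drone x/y coordinates.
    command:   'EXPAND' or 'CONTRACT', a single decoded intent that
               acts on the whole group at once (scale factors are
               arbitrary choices for this sketch).
    """
    scale = {"EXPAND": 1.25, "CONTRACT": 0.8}[command]
    centroid = positions.mean(axis=0)
    return centroid + scale * (positions - centroid)

# Three drones in a triangle; one thought-level command moves them all.
swarm = np.array([[0.0, 1.0], [-1.0, -1.0], [1.0, -1.0]])
print(apply_swarm_command(swarm, "EXPAND"))
```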

Collective behaviors are abundant in nature, but the idea of a flock or herd has only recently been applied to robotics. A robotic “swarm” consists of a large group of interchangeable vehicles that use information, obtained via local sensing and communication, to make autonomous decisions. Artemiadis and his team tested a “human-swarm interface” that, in effect, decodes the brain and extracts information about desired collective behaviors, such as expanding a swarm’s coverage or adopting a specific formation.

The algorithms developed at the HORC Lab extract features from the recorded brain signals; the ASU team then applies machine learning and pattern-recognition techniques to translate the brain’s electrical activity into control commands that are transmitted wirelessly to the drones.
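A minimal sketch of that translation step, assuming band-power features like those above and a standard linear classifier standing in for the lab’s unpublished models (the calibration data, feature dimensions, and drone network address are all hypothetical):

```python
import socket
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical calibration data: band-power feature vectors labeled
# with the command the subject was imagining when each was recorded.
X_train = np.random.randn(200, 16)   # 200 windows x 16 features
y_train = np.random.choice(["EXPAND", "CONTRACT", "HOLD"], size=200)

# A standard classifier stands in for the lab's decoder here.
clf = LinearDiscriminantAnalysis().fit(X_train, y_train)

def send_command(feature_vector, addr=("192.168.1.50", 9000)):
    """Classify one EEG feature vector and transmit the result.

    The UDP address is a placeholder for whatever radio link the
    drones actually listen on.
    """
    command = clf.predict(feature_vector.reshape(1, -1))[0]
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(command.encode(), addr)
    sock.close()
    return command

print(send_command(np.random.randn(16)))
```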

The integration of very large teams of robots into comprehensive systems enables new tasks and missions, including search, exploration, and surveillance, according to the ASU researcher. Armed with infrared imaging equipment, a drone swarm, for example, could provide real-time tracking of a forest fire, allowing responders to adjust their plans on the fly. Teams of drones could also be called on to create topographic maps for soil analysis and irrigation planning.
