Stanford Engineers Conduct Groundbreaking In-Orbit Test of Satellite 'Swarm' Navigation

A conceptual illustration of satellite swarm autonomous navigation technology. (Image: NASA/Blue Canyon Technologies)

Someday, instead of large, expensive individual satellites, teams of smaller satellites – known by scientists as a “swarm” – will work in collaboration, enabling greater accuracy, agility, and autonomy. Among the scientists working to make these teams a reality are researchers at Stanford University’s Space Rendezvous Lab, who recently completed the first-ever in-orbit test of a prototype system able to navigate a swarm of satellites using only visual information shared through a wireless network.

“It’s a milestone paper and the culmination of 11 years of effort by my lab, which was founded with this goal of surpassing the current state of the art and practice in distributed autonomy in space,” said Simone D’Amico, associate professor of aeronautics and astronautics and senior author of the study. “Starling is the first demonstration ever made of an autonomous swarm of satellites.”

The test is known as the Starling Formation-Flying Optical Experiment, or StarFOX. In it, the team successfully navigated four small satellites working in tandem, using only visual information gathered from onboard cameras to calculate their trajectories (or orbits). The researchers presented their findings from the initial StarFOX test to an audience of swarm satellite experts at the Small Satellite Conference in Logan, Utah.

All the Angles

D’Amico described the challenge as one that has driven his team for more than a decade. “Our team has been advocating for distributed space systems since the lab’s inception. Now it has become mainstream. NASA, the Department of Defense, the U.S. Space Force – all have understood the value of multiple assets in coordination to accomplish objectives which would otherwise be impossible or very difficult to achieve by a single spacecraft,” he said. “Advantages include improved accuracy, coverage, flexibility, robustness, and potentially new objectives not yet imagined.”

The four swarm spacecraft during integration and testing at NASA Ames. (Image: NASA/Dominic Hart)

Robust navigation of the swarm presents a considerable technological challenge. Current systems rely on the Global Navigation Satellite System (GNSS), requiring frequent contact with terrestrial systems. Beyond Earth’s orbit, there is the Deep Space Network, but it is relatively slow and not easily scalable to future endeavors. What’s more, neither system can help satellites avoid what D’Amico calls “non-cooperative objects” like space debris that might knock a satellite out of commission.

The swarm needs a self-contained navigation system that allows a high degree of autonomy and robustness, D’Amico said. Such systems are made more attractive still by the minimal technical requirements and financial costs of today’s miniaturized cameras and other hardware. The cameras used in the StarFOX test are proven, relatively inexpensive 2D cameras called star trackers, which are found on virtually any satellite today.

“At its core, angles-only navigation requires no additional hardware even when used on small and inexpensive spacecraft,” D’Amico said. “And exchanging visual information between swarm members provides a new distributed optical navigation capability.”

Written in the Stars

StarFOX combines visual measurements from single cameras mounted on each satellite in a swarm. Much as a mariner of old navigated the high seas with a sextant, the system uses the field of known stars in the background as a reference to extract bearing angles to the swarming satellites. These angles are then processed onboard through accurate physics-based force models to estimate the position and velocity of the satellites with respect to the orbited planet – in this case, Earth, but the moon, Mars, or other planetary objects would work as well.
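To make the “sextant” analogy concrete, here is a minimal sketch, in Python, of how a bearing to a neighboring satellite might be recovered from a single image once the camera’s orientation has been solved against the background star field. The pinhole-camera intrinsics, frame names, and numbers are illustrative assumptions, not details of the flight software.

    import numpy as np

    def pixel_to_bearing(u, v, fx, fy, cx, cy, R_cam_to_eci):
        """Convert a detected pixel (u, v) into a unit bearing vector in the inertial frame.

        fx, fy, cx, cy -- pinhole-camera intrinsics (focal lengths and principal point, in pixels)
        R_cam_to_eci   -- 3x3 rotation from the camera frame to an Earth-centered inertial
                          frame, obtained by matching the background stars to a star catalog
        """
        # Line of sight in the camera frame (pinhole model, boresight along +z)
        los_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
        los_cam /= np.linalg.norm(los_cam)
        # Rotate into the inertial frame; this unit vector encodes the bearing angles
        return R_cam_to_eci @ los_cam

    # Example: a target detected slightly off-center on a 2048 x 2048 star tracker
    bearing = pixel_to_bearing(1030.0, 1015.0, fx=2800.0, fy=2800.0,
                               cx=1024.0, cy=1024.0, R_cam_to_eci=np.eye(3))
    print(bearing)  # unit vector pointing from the observer toward the target

Each such unit vector amounts to one angles-only measurement; the force models described above turn sequences of them into full position and velocity estimates.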

StarFOX employs the Space Rendezvous Lab’s angles-only Absolute and Relative Trajectory Measurement System – ARTMS, for short – which integrates three new space robotics algorithms. An Image Processing algorithm detects and tracks multiple targets in images and computes target-bearing angles, that is, the directions from the observing camera to other objects, including space debris. The Batch Orbit Determination algorithm then estimates each satellite’s coarse orbit from these angles. Last but not least, the Sequential Orbit Determination algorithm refines the swarm trajectories as new images are processed over time, ready to feed autonomous guidance, control, and collision avoidance algorithms onboard.
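As a rough illustration of that sequential-refinement step, the sketch below applies an extended-Kalman-style measurement update to a relative-position estimate using one new bearing observation. The state layout, noise values, and update form are simplified assumptions for illustration, not the Sequential Orbit Determination algorithm flown on Starling.

    import numpy as np

    def bearing_update(x_rel, P, z_bearing, meas_var=1e-6):
        """One Kalman-style update of a relative-position estimate from a bearing measurement.

        x_rel     -- current estimate of the target's position relative to the observer (meters)
        P         -- 3x3 covariance of that estimate
        z_bearing -- measured unit line-of-sight vector toward the target
        meas_var  -- assumed per-axis bearing noise variance (illustrative value)
        """
        rng = np.linalg.norm(x_rel)
        u = x_rel / rng                             # predicted bearing from the current estimate
        H = (np.eye(3) - np.outer(u, u)) / rng      # Jacobian of the bearing w.r.t. position
        S = H @ P @ H.T + meas_var * np.eye(3)      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
        x_new = x_rel + K @ (z_bearing - u)         # correct the state with the angle residual
        P_new = (np.eye(3) - K @ H) @ P
        return x_new, P_new

    # Example: refine a 10 km along-track separation guess with a slightly offset bearing
    x, P = np.array([10_000.0, 0.0, 0.0]), np.diag([1e6, 1e6, 1e6])
    z = np.array([0.9999, 0.01, 0.0]); z /= np.linalg.norm(z)
    x, P = bearing_update(x, P, z)

A single bearing cannot pin down range, which is why the real system leans on physics-based force models and on many images processed over time to recover that missing depth.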

These data are shared over an inter-satellite communication link (a wireless network) and used to calculate robust absolute and relative positions and velocities to a remarkable degree of accuracy, all without GNSS. Under the most challenging conditions, using just a single observer satellite, StarFOX was able to calculate relative position (the position of each satellite with respect to the others) to within 0.5% of the distance between them. When multiple observers were added in, those error rates dropped to just 0.1%.
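To put those percentages in absolute terms, here is a quick back-of-the-envelope calculation for a hypothetical 100 km inter-satellite separation (the actual Starling spacings are not quoted here):

    separation_m = 100_000.0             # hypothetical 100 km spacing
    print(0.005 * separation_m)          # single observer:    500 m relative-position error
    print(0.001 * separation_m)          # multiple observers: 100 m relative-position error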

The Starling test was deemed promising enough that NASA has extended the project, now known as StarFOX+, through 2025 to further explore these improved capabilities and pave the way for future space situational awareness and positioning technologies.
