
BYU creating new breed of drones that can navigate without GPS

Research has military and package delivery implications


Whether it’s because of jammed signals or the need to navigate caves and underground networks, drones that rely only on GPS often become worthless in war zones.

Closer to home, the budding industries of drone package delivery and infrastructure inspection need aircraft that can keep operating when GPS signals drop out or weaken (known as degradation) as UAVs fly in close proximity to buildings and other obstacles.

Fortunately, BYU research into “GPS-denied environments” is producing a new breed of fixed-wing UAVs that can navigate effectively when GPS signals are intermittent, interrupted, degraded or altogether nonexistent.

“GPS was and is a great enabler for unmanned aircraft,” said Tim McLain, BYU professor of mechanical engineering. “But GPS is not a perfect sensor and unmanned aircraft have been too dependent on it. Until there is a more robust sensing strategy, we need to find solutions to work around it.”

To that end, McLain and his fellow researchers are using onboard devices — cameras, depth sensors and other types of sensors with sophisticated algorithms — to overcome the dependence on GPS. In a recent IEEE publication, McLain, BYU grad student Gary Ellingson and Kevin Brink with the Air Force Research Lab detail a system that uses only a monocular camera, attached to the front of a fixed-wing UAV, and an inertial measurement device to successfully maneuver a drone with no GPS.

The technology enables the UAV to use a method called relative navigation: rather than estimating a global position, it estimates the aircraft's state (position, velocity and attitude) relative to its local surroundings.

Instead of trying to update an unmanned aircraft’s global position, the method tracks where the drone started, what direction it moved and how far it moved from the original spot. When the original spot moves out of the camera’s field of view, a new origin spot, or keyframe image, is then established, and the process repeats itself over and over to produce visual odometry for the aircraft.
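To make the keyframe idea concrete, here is a minimal sketch in Python of how a relative navigation loop could accumulate visual odometry. The class, method names and motion increments are illustrative assumptions for this article, not the authors' implementation; in the real system the increments would come from the monocular camera and inertial measurement device, and attitude and velocity would be tracked alongside position.

```python
import numpy as np

# Sketch of keyframe-based relative odometry (assumed structure):
# the estimator tracks motion relative to the current keyframe; when that
# keyframe leaves the camera's view, the relative estimate is folded into
# the accumulated global estimate and a new keyframe becomes the origin.

class RelativeOdometry:
    def __init__(self):
        self.global_position = np.zeros(3)  # accumulated displacement since takeoff
        self.rel_position = np.zeros(3)     # displacement relative to current keyframe

    def update(self, delta_position):
        """Integrate one motion increment, assumed here to be expressed
        in the current keyframe's coordinate frame for simplicity."""
        self.rel_position = self.rel_position + delta_position

    def declare_new_keyframe(self):
        """Called when the keyframe drops out of the field of view:
        fold the relative estimate into the global one and reset."""
        self.global_position = self.global_position + self.rel_position
        self.rel_position = np.zeros(3)

    def estimated_position(self):
        return self.global_position + self.rel_position


# Example: three motion increments (meters), with a keyframe handoff
# after the second one.
odom = RelativeOdometry()
odom.update(np.array([5.0, 0.0, 0.0]))
odom.update(np.array([4.0, 1.0, 0.0]))
odom.declare_new_keyframe()
odom.update(np.array([3.0, 0.5, -0.2]))
print(odom.estimated_position())  # estimated displacement from the starting point
```

The point of chaining keyframes this way is that each estimate is made against a nearby, recently observed reference rather than a single distant origin, which is what allows the aircraft to keep a usable position estimate when no global fix is available.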

In tests of the method, the UAV flew 1,800 meters over the course of two and a half minutes, and the team tracked its location with an error of less than 3 percent of the distance traveled (under about 54 meters over that flight). McLain said the work matters because many of the current best practices for dealing with GPS-denied environments work reasonably well, but not as well as these applications require.

“We are as accurate or more accurate than most methods in estimating the location of the UAV at any point in time, but the most important thing is we really know how accurate we are,” McLain said. “If you think you’re within a millimeter, but you’re only within 20 centimeters, then you might make a disastrous mistake.”

The ongoing research is supported by both military and industry members of the Center for Unmanned Aircraft Systems, the only National Science Foundation-funded unmanned aircraft research center in the country, headquartered at BYU.
