A team of engineers from Austria want to change the way we interface with drones.
The team from Graz University of Technology have combined a drone with Microsoft’s HoloLens augmented-reality headset, transforming it from a hovering robot that is complicated to control into a remote camera that even an untrained user can fly intuitively.
Robotics already offers many opportunities for remote sensing. Robots are ideal machines to send into disaster zones on search-and-rescue missions.
Yet, while drones can quickly navigate large structures and cover a significant amount of ground looking for survivors, the technology still relies on the expertise of its operator.
Up to now, actually controlling a drone – especially once it is out of sight – has been difficult.
The team believe that by mashing up drone and augmented-reality tech, they can allow just about anyone to navigate easily using the drone.
How the system works
Their technique uses image-based rendering to combine live video from the drone with a 3D model of the environment – allowing the user to see the whole scene independently of their own point of view.
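One way to picture that compositing step: render the 3D model from the user's current viewpoint, then blend the drone's live frame over the part of the scene the drone can see. The function below is a minimal, hypothetical per-pixel sketch – the name and the blend weight are ours, not the authors' implementation.

```python
def blend_pixel(model_rgb, live_rgb, alpha=0.6):
    """Alpha-blend a live drone-video pixel over the corresponding
    rendered 3D-model pixel. `alpha` is the weight given to the live
    video; both this function and the weight are illustrative only."""
    return tuple(
        round(alpha * live + (1.0 - alpha) * model)
        for model, live in zip(model_rgb, live_rgb)
    )

# Where the drone's camera covers the scene, the live feed dominates:
blend_pixel((100, 100, 100), (200, 50, 0))  # → (160, 70, 40)
```

In the real system the hard part is registration – warping the drone's frame so it lines up with the model from the user's viewpoint – but once pixels correspond, the overlay itself is a blend like this.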
By linking the drone’s autopilot to the direction of the user’s gaze, the whole experience of piloting a drone changes to give the illusion of X-ray vision.
The HoloLens itself works by projecting a small image into your field of vision. It tracks the direction your head is pointing and overlays the video feed from the drone on top of what you are already looking at.
The real-world effect is that walls suddenly seem transparent: when you change where you are looking, the drone uses its autopilot to reposition itself, so you get the new view you want.
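The gaze-driven repositioning amounts to casting a ray along the direction the user's head is pointing and sending the drone towards a point on that ray. A minimal sketch, with assumed names, angles in degrees, and a fixed stand-off distance – the actual planner would also account for obstacles and the 3D model:

```python
import math

def gaze_to_waypoint(user_pos, yaw_deg, pitch_deg, standoff=3.0):
    """Project the user's gaze ray a fixed stand-off distance ahead to
    get a new drone waypoint. yaw 0 = +x axis, pitch 0 = level gaze.
    All names and the stand-off heuristic are illustrative, not the
    authors' API."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    # Unit direction vector of the gaze ray.
    dx = math.cos(pitch) * math.cos(yaw)
    dy = math.cos(pitch) * math.sin(yaw)
    dz = math.sin(pitch)
    x, y, z = user_pos
    return (x + standoff * dx, y + standoff * dy, z + standoff * dz)

# Looking level along +x from head height 1.7 m sends the drone 3 m ahead:
gaze_to_waypoint((0.0, 0.0, 1.7), 0.0, 0.0)
```

Each time the head tracker reports a new orientation, the autopilot is handed a fresh waypoint like this, which is what makes the drone follow the user's gaze without any manual piloting.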
If you don’t want the drone to move by itself, you can grab the small drone icon in the HoloLens view and drag it to where you want the drone to go.
While the experimental drone used an on-board computer, autopilot, forward-facing camera and battery, it did rely on external location markers.
Now that the team have proved the concept, they are looking to reduce the drone’s reliance on external systems (used to model the environment and track the drone) so it can help solve real-world problems. X-ray vision would make a real difference to those searching buildings for survivors after a disaster, and has the potential to improve lives.
The paper “Drone-Augmented Human Vision: Exocentric Control for Drones Exploring Hidden Areas” by Okan Erat, Werner Alexander Isop, Denis Kalkofen and Dieter Schmalstieg from Graz University of Technology was presented at IEEE VR 2018 in Reutlingen, Germany.