
When people and drones merge

07/17/2018 | Planet research | FoE Information, Communication & Computing | Young Talents

By Birgit Baustädter

PhD students Okan Erat and Alexander Isop have implemented a system in TU Graz’s droneSpace that lets users control drones with body movements and allows the fields of view of human and machine to merge.

Alexander Isop (second from left) and Okan Erat (second from right) with their supervisors Denis Kalkofen (left) and Dieter Schmalstieg (right). © Baustädter – TU Graz
What TU Graz PhD candidates Okan Erat and Alexander Isop are developing sounds a bit like a futuristic cyborg: combining the human body with a drone. In their research paper “Drone-Augmented Human Vision: Exocentric Control for Drones Exploring Hidden Areas”, just published in the journal IEEE Transactions on Visualization and Computer Graphics, they present a concept that they hope will make rescue missions simpler and less dangerous for human rescuers. ‘Rescuers often have to take high risks when, for instance, they don’t know the danger zones in a collapsed building,’ explains Okan Erat.

The paper “Drone-Augmented Human Vision: Exocentric Control for Drones Exploring Hidden Areas” recently appeared in the journal IEEE Transactions on Visualization and Computer Graphics.

Non-dangerous missions in dangerous situations

The newly developed system envisages first sending drones into the danger area to reconnoitre the situation. ‘This is of course not new. But our concept goes a step further. Navigating a drone using a joystick or similar in a narrow space riddled with obstacles is very difficult and requires quite a lot of skill. On top of this, it’s not even possible to control a drone that is out of the controller’s line of sight,’ explains Okan Erat. To solve the vision problem, footage from the drone’s camera is used to build a virtual image of the actual environment, in which the user can immerse herself or himself by means of virtual reality glasses – in this case, a standard commercial HoloLens.
The experimental set-up: a HoloLens in the foreground with the specially built drone in the background.
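To picture how drone footage can become a virtual environment, here is a minimal sketch of one standard way this is done: back-projecting depth frames from the drone’s camera into a shared world frame and accumulating a point cloud that a headset could render from any viewpoint. All names, intrinsics and the data format are hypothetical illustrations, not the authors’ actual pipeline.

```python
# Minimal sketch: fuse drone depth frames into one world-frame point cloud.
# Camera intrinsics (fx, fy, cx, cy) and the frame stream are made up.
import numpy as np

def backproject_depth(depth, fx, fy, cx, cy):
    """Turn an HxW depth image into Nx3 points in the camera frame."""
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    valid = z > 0                       # skip pixels with no depth reading
    x = (us.ravel() - cx) * z / fx
    y = (vs.ravel() - cy) * z / fy
    return np.stack([x, y, z], axis=1)[valid]

def to_world(points_cam, R_wc, t_wc):
    """Transform camera-frame points into the shared world frame."""
    return points_cam @ R_wc.T + t_wc

# Synthetic single frame just to make the sketch runnable:
# a flat wall 2 m in front of a drone sitting at the world origin.
depth = np.full((480, 640), 2.0)
drone_frames = [(depth, (np.eye(3), np.zeros(3)))]

cloud = []
for depth, (R_wc, t_wc) in drone_frames:
    pts = backproject_depth(depth, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
    cloud.append(to_world(pts, R_wc, t_wc))
world_model = np.concatenate(cloud)     # what the headset would render
```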
The HoloLens also transmits the user’s current head position to the drone, which in turn flexibly adapts to the direction of view. What’s more, the drone can be controlled using body movements in this set-up – as if the user’s own body were merged with the drone. ‘This is the most natural way of controlling a drone, the easiest to learn and the fastest to put into practice. The drone doesn’t just send back a video image which first has to be looked at and analysed on a computer. The experts can immerse themselves directly in the environment by means of the headset and steer the drone through it intuitively.’
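The body-steering idea can be pictured as a simple mapping from the headset’s head pose to a velocity setpoint for the drone. The sketch below is a hypothetical illustration of such a mapping; the function name, gains and sign conventions are assumptions, not the published controller.

```python
# Minimal sketch: map head pose to a drone velocity command,
# so that looking and leaning steer the vehicle.
import numpy as np

def head_pose_to_setpoint(yaw, pitch, lean, gain_fwd=0.5, gain_yaw=1.0):
    """Map head orientation (radians) and forward lean (metres) to a
    body-frame velocity command (vx, vz, yaw_rate) for the drone."""
    vx = gain_fwd * lean               # lean forward -> fly forward
    vz = -gain_fwd * np.sin(pitch)     # tilt the head down -> descend
    yaw_rate = gain_yaw * yaw          # turn the head -> turn the drone
    return np.array([vx, vz, yaw_rate])

# Example: user leans 0.2 m forward, looking slightly down and to the left.
cmd = head_pose_to_setpoint(yaw=0.3, pitch=-0.1, lean=0.2)
print(cmd)   # velocity setpoint that would be streamed to the flight controller
```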

The two young researchers show how the drone system works in TU Graz’s droneSpace in a demonstration video on YouTube.

Virtual or mixed reality

In their lab experiments, the researchers blocked out the entire field of vision of the test subject, completely immersing her or him in a virtual image of the world. Another possibility would be to superimpose additional information over reality in a semi-transparent way. ‘This would be as if I could look through a wall – like an X-ray image,’ explains Okan Erat. A conceivable scenario: a wall has to be broken through in order to reach trapped persons, but the rescuers cannot tell whether there are dangerous tanks or power lines on the other side. With the help of the drone images, precisely these hazards could be projected directly into the drillers’ field of view by means of the headset. The rescuers would then know exactly where a passage could be made safely.

The drone was developed at TU Graz in the droneSpace – a drone lab in Graz’s Inffeldgasse – within the framework of the ‘UFO – Users Flying Organizer’ project, which deals with autonomous drone flight systems for indoor areas. ‘The drone was specially conceived for flying in narrow interior spaces and for interaction between human and machine. For this reason it’s very light, the propellers are made of soft material and the motors have comparatively low power,’ explains Alexander Isop, who designed the prototype and developed the framework for the autonomous flight system. Together with Friedrich Fraundorfer, he offers courses in the droneSpace for interested students.
The drone was specially developed in TU Graz’s droneSpace.
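The ‘X-ray view’ described in the rescue scenario above essentially amounts to projecting hazard points mapped by the drone into the rescuer’s headset view. Below is a minimal sketch using a standard pinhole camera model; the intrinsics, poses and the example hazard are made-up illustrative values, not parameters of the HoloLens or the published system.

```python
# Minimal sketch: project world-frame hazard points into the pixel
# coordinates of the rescuer's headset camera ("X-ray" overlay).
import numpy as np

def project(points_world, R_cw, t_cw, fx, fy, cx, cy):
    """Project Nx3 world points into Nx2 pixel coordinates."""
    p_cam = points_world @ R_cw.T + t_cw
    in_front = p_cam[:, 2] > 0                 # keep points ahead of the user
    p = p_cam[in_front]
    u = fx * p[:, 0] / p[:, 2] + cx
    v = fy * p[:, 1] / p[:, 2] + cy
    return np.stack([u, v], axis=1)

# Example: a tank the drone mapped 3 m behind the wall the rescuer faces.
hazard = np.array([[0.0, 0.0, 3.0]])
pixels = project(hazard, np.eye(3), np.zeros(3),
                 fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(pixels)   # where a warning marker would be drawn in the overlay
```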

Visions of the future 

The drone currently used for research is very big; the visions for the future, however, are small. Very small, in fact: ‘I can imagine that at some point drones will be as small as flies. They could then be sent out in swarms and send back far more extensive images of the danger zone. And they could work at very different places at the same time,’ explains Erat.

This, too, is a development by the TU Graz researchers: the user can simply switch between several drones and thus between different angles of view – for instance, moving through different rooms very quickly in order to gain an overall picture of the situation.

There is also plenty of room for new ideas in controlling the drone: ‘Development is heading in the direction that the user should be able to give ever simpler and more natural control commands, such as voice commands or gestures. The drone, on the other hand, must be able to solve increasingly complex tasks on its own. It is also important that the user always knows the status of the drone – what the drone is thinking, so to speak. This interaction between human and machine will remain a very interesting topic for me in the future,’ Isop adds. One example: in response to a simple voice command, the drone should navigate through narrow passages on its own or search for and recognise trapped persons.

The visualisation also needs further development: ‘I’m now going to concentrate on visualisation and contextualisation. At the moment only the current field of view can be visualised. But I’d like to create a kind of memory that keeps more of the explored space available – at least for a certain amount of time – so it can be reconstructed,’ explains Erat. But that’s just the first step: ‘Ultimately, we want to make the drone as smart as possible. It should know by itself what the user wants from it. That’s the big picture,’ says the scientist, smiling.
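Switching between several drones, as Erat describes, could be organised around a small piece of shared state: which drone currently feeds the headset, plus the status readout Isop says the user should always see. The sketch below is a hypothetical stand-in for such a mechanism, not an API from the project.

```python
# Minimal sketch: hop between drones in a swarm while keeping the
# active drone's status visible to the user.
from dataclasses import dataclass

@dataclass
class Drone:
    name: str
    room: str            # where the drone currently is
    battery: float       # state the user should always be able to see

class SwarmView:
    def __init__(self, drones):
        self.drones = {d.name: d for d in drones}
        self.active = next(iter(self.drones))   # start with the first drone

    def switch_to(self, name):
        """Hand the headset's video feed and controls over to another drone."""
        if name in self.drones:
            self.active = name
        return self.drones[self.active]

    def status(self):
        """A readout of 'what the drone is thinking' for the active drone."""
        d = self.drones[self.active]
        return f"{d.name}: in {d.room}, battery {d.battery:.0%}"

view = SwarmView([Drone("fly-1", "stairwell", 0.8),
                  Drone("fly-2", "basement", 0.6)])
view.switch_to("fly-2")
print(view.status())   # -> fly-2: in basement, battery 60%
```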

This research project is attributed to the Field of Expertise ‘Information, Communication & Computing’, one of TU Graz’s five strategic areas of research.
Visit Planet research for more research-related news.

Information

Okan Erat also presented his research in an episode of AirCampus Graz, the podcast service of the four Graz universities. The episode is available in English on the AirCampus Graz website.

Contact

Okan ERAT
M.Sc.
Institute of Computer Graphics and Vision
Inffeldgasse 16/II | 8010 Graz
Phone: +43 316 873 5032
okan.erat@icg.tugraz.at

Alexander ISOP
BSc MSc
Institute of Computer Graphics and Vision
Inffeldgasse 16/II | 8010 Graz
Phone: +43 316 873 5055
isop@icg.tugraz.at