
Towards The Internet of Senses

The Internet of Senses (IoS) aims to provide comprehensive multisensory experiences that are nearly inseparable from reality and improve intelligent human-machine interaction.

This joint project between Aalto University and TII (Technology Innovation Institute), Abu Dhabi, UAE, presents a human-machine interaction testbed for 6-DoF real-time immersion with haptic feedback, working towards realizing the Internet of Senses. In this project, we designed a real-life system that allows remote control of a UAV with maximum immersion (6 degrees of freedom) while providing optimal and reliable control of the UAV.

The testbed included the following components:

Remote UAV

The remote UAV is equipped with an embedded computer, IoT sensors, a 360° camera and a 5G modem.

VR users

Upon receiving the 360° video stream, the user views the real-time stream through a web platform and controls the UAV remotely with their body movements on the VR treadmill platform.

Edge Server

The edge server comprises the streaming module, the control and monitoring module, and the web server.

Streaming module

This module transmits the real-time video stream to the web application with the lowest possible latency, delivering 360° video to any device with web access.
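One common low-latency tactic consistent with this goal is to drop frames that have aged past a latency budget instead of buffering them. A minimal sketch, assuming a tunable budget; the function name and the 500 ms default are illustrative, not the project's actual code:

```python
def prune_stale_frames(frames, now_s, budget_s=0.5):
    """Keep only frames young enough to meet the latency budget.

    frames: list of (capture_timestamp_s, payload) tuples, oldest first.
    Stale frames are discarded rather than queued, so the viewer stays
    close to real time at the cost of occasionally skipped frames.
    """
    return [(ts, payload) for ts, payload in frames if now_s - ts <= budget_s]

# Usage: at t = 1.0 s, any frame captured more than 0.5 s ago is dropped.
queue = [(0.0, b"f0"), (0.6, b"f1"), (0.9, b"f2")]
fresh = prune_stale_frames(queue, now_s=1.0)
```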

Web server

The web server serves the WebVR application. It is the user's interface for viewing the UAV's status information and the 360° video stream through an HTML5 video player adapted to play 360° video. It manages the video stream from the streaming module and synchronizes the video inputs (UAV video streams) with the outputs (video players requesting a given stream). The 360° video can be viewed on any device with a web browser. WebVR was chosen mainly because it brings an immersive view to any device with a web browser, from a simple cardboard viewer to an HMD.
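The input/output synchronization described above can be pictured as a small publish/subscribe registry that fans each UAV stream out to the players requesting it. A hypothetical sketch; the class and method names are assumptions, not the testbed's actual API:

```python
from collections import defaultdict

class StreamRegistry:
    """Maps each UAV stream name to the player callbacks watching it."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, stream_name, player_callback):
        # A player requests a given stream; remember where to push frames.
        self._subscribers[stream_name].append(player_callback)

    def publish(self, stream_name, frame):
        # A new 360° frame arrives from a UAV; fan it out to every player.
        for callback in self._subscribers[stream_name]:
            callback(frame)

# Usage: two players watching the same UAV stream receive the same frame.
registry = StreamRegistry()
received = []
registry.subscribe("uav-1", received.append)
registry.subscribe("uav-1", received.append)
registry.publish("uav-1", b"frame-0001")
```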

Control and monitoring module

It is in charge of two functions: i) forwarding the user's control commands from the web application to the flight controller module, and ii) updating the user with the UAV's sensor information, such as altitude, latitude, longitude and speed, as well as LTE- and 5G-related information from the modem connected to the UAV. This information is integrated into the 360° immersive view of the HMD and visualized by clicking on virtual elements within the immersive view.
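The telemetry half of this module can be sketched as bundling the sensor and modem readings into a single status record for the overlay. A minimal illustration; the function and field names, and the example values, are invented for clarity:

```python
def build_status_overlay(altitude_m, latitude, longitude, speed_mps, link_info):
    """Bundle UAV sensor readings and modem stats into one overlay record.

    link_info holds whatever LTE/5G metrics the modem reports, passed
    through untouched (e.g. {"tech": "5G", "rsrp_dbm": -85}).
    """
    return {
        "altitude_m": round(altitude_m, 1),
        "latitude": latitude,
        "longitude": longitude,
        "speed_mps": round(speed_mps, 1),
        "link": link_info,
    }

# Usage: one record per telemetry update, rendered inside the HMD view.
status = build_status_overlay(12.34, 24.4539, 54.3773, 3.46, {"tech": "5G"})
```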

VR Treadmill

The treadmill tracks the user's movements (walking speed, heading, height) and translates them into drone maneuvers. We implemented an algorithm that reads the sensor measurements from the treadmill platform and translates them into drone commands. We also exploited the drone's sensor feedback to implement haptic feedback: the model translates the drone's movements into vibration frequencies that the VR user senses while flying the drone. Different maneuvers produce different torques at the copter's motors, which the algorithm translates into different vibration frequencies and amplitudes at the treadmill, giving the user a distinct sensation for each maneuver.
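The two mappings described above (treadmill motion to drone command, motor torque spread to vibration) could look roughly like this. All names, gains and frequency ranges here are illustrative assumptions rather than the implemented algorithm:

```python
import math

def treadmill_to_velocity(speed_mps, heading_deg, height_delta_m, gain=1.0):
    """Turn treadmill walking speed/heading/height into a velocity setpoint."""
    heading = math.radians(heading_deg)
    return {
        "vx": gain * speed_mps * math.cos(heading),  # forward component
        "vy": gain * speed_mps * math.sin(heading),  # lateral component
        "vz": height_delta_m,                        # climb / descend
    }

def torque_to_vibration(motor_torques, max_spread, f_min=40.0, f_max=120.0):
    """Map the spread of per-motor torques to a vibration frequency/amplitude.

    A steady hover (equal torques) gives no vibration; an aggressive maneuver
    with a large torque imbalance gives a higher frequency and amplitude.
    """
    spread = max(motor_torques) - min(motor_torques)
    level = min(spread / max_spread, 1.0)
    return {"freq_hz": f_min + level * (f_max - f_min), "amplitude": level}

# Usage: walking straight ahead at 1 m/s; a gentle turn shakes the treadmill.
cmd = treadmill_to_velocity(1.0, 0.0, 0.0)
vib = torque_to_vibration([1.0, 1.2, 1.0, 1.2], max_spread=0.8)
```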

We demonstrated the system at GITEX, the world's largest tech event, in Dubai. A user controlled a drone with their body movements based on 4K 360° video feedback in Dubai while the drone was in Abu Dhabi, around 100 km away, with an end-to-end video delay under 500 ms.

We also tested the setup from Aalto University while the drone was at TII in Abu Dhabi, UAE; there, a user controlled the drone using VR joysticks, and we measured an end-to-end delay of 900 ms.

For more information, please contact Nassim Sehad.

