Towards The Internet of Senses
This joint project between Aalto University and TII (Technology Innovation Institute), Abu Dhabi UAE, presents a human-machine interaction testbed for 6 DoF real-time immersion with haptic feedback towards realizing the Internet of Senses. In this project, we have worked on designing a real-life system that allows the remote control of a UAV for maximum immersion (6 Degrees of Freedom) while allowing optimal and reliable control of UAVs.
The testbed included:
The remote UAV is equipped with an embedded computer, IoT sensors, 360◦ camera and a 5G modem.
Upon receiving the 360◦ video stream, the user views the real-time stream through a Web platform and controls the UAV remotely using his body movements on the VR Treadmill platform.
The Edge server comprises the streaming module, the control and monitoring module, and the web server.
Streaming module
The streaming module transmits the real-time video stream to the web application with the lowest latency possible, delivering 360◦ video to any device able to access the web.
Web server
The web server serves the WebVR application and is the user's interface for viewing the UAV status information and the 360◦ video stream through an HTML5 video player adapted to 360◦ playback. It manages the video streams coming from the streaming module and synchronizes the inputs (UAV video streams) with the outputs (the video players requesting a given stream). WebVR was chosen mainly because it provides an immersive view on any device with a web browser, from a simple cardboard viewer to a full HMD.
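The synchronization of stream inputs with requesting players can be sketched as a small publish/subscribe registry. This is only an illustration of the idea; the class and method names below are assumptions, not the project's actual implementation.

```python
# Minimal sketch of the stream-synchronization idea: the web server keeps a
# registry mapping each UAV stream to the players requesting it, so a frame
# arriving from the streaming module is fanned out to every subscriber.
# All names here are illustrative, not from the real codebase.

class StreamRegistry:
    def __init__(self):
        self._subscribers = {}  # stream_id -> set of player callbacks

    def subscribe(self, stream_id, player):
        """Register a video player (a callable taking one frame)."""
        self._subscribers.setdefault(stream_id, set()).add(player)

    def publish(self, stream_id, frame):
        """Deliver one video frame to every player watching this stream."""
        for player in self._subscribers.get(stream_id, ()):
            player(frame)
```

A player that joins later simply subscribes to the same stream ID and starts receiving frames from that point on.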
Control and monitoring module
It is in charge of two functions: i) forwarding the user control commands from the web application to the flight controller module, and ii) updating the user with the UAV's sensor information, such as altitude, latitude, longitude, and speed, as well as LTE- and 5G-related information from the modem connected to the UAV. This information is integrated within the 360◦ immersive view of the HMD and is visualized by clicking on virtual elements within the immersive view.
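The two functions above can be sketched as follows. The telemetry field names and the flight-controller message format are assumptions for illustration; the text does not specify the actual schemas.

```python
import json

# Hypothetical field names; the actual telemetry schema is not given.
TELEMETRY_FIELDS = ("altitude", "latitude", "longitude", "speed", "rsrp", "sinr")

def pack_telemetry(raw: dict) -> str:
    """Filter raw UAV/modem readings down to the fields shown in the
    immersive view and serialize them for the web application."""
    return json.dumps({k: raw[k] for k in TELEMETRY_FIELDS if k in raw})

def forward_command(cmd: dict) -> dict:
    """Translate a user command from the web application into the message
    format expected by the flight controller (illustrative names only)."""
    return {
        "type": "setpoint",
        "vx": cmd.get("forward", 0.0),   # m/s, from walking speed
        "yaw": cmd.get("heading", 0.0),  # rad, from body heading
        "vz": cmd.get("climb", 0.0),     # m/s, from height change
    }
```

In the testbed these two paths run continuously: commands flow from the web application toward the flight controller, and telemetry flows back into the HMD overlay.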
VR Treadmill platform
The treadmill tracks the user's movements (walking speed, heading, height) and translates them into drone maneuvers: an algorithm reads the sensor measurements from the treadmill platform and converts them into drone commands. We also exploited the drone's sensor feedback to implement haptic feedback. Different maneuvers produce different torques at the copter motors, and the algorithm maps these torques one-to-one into different vibration frequencies and amplitudes at the treadmill, so the user feels a distinct sensation for each maneuver while flying the drone.
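The treadmill-to-command mapping and the torque-to-vibration model can be sketched as below. The scale factors, torque normalization, and frequency range are assumptions for illustration; the text does not give the actual values.

```python
# Sketch of the treadmill-to-drone mapping and the haptic feedback model.
# All numeric constants are assumed, not taken from the real system.

def treadmill_to_command(walk_speed: float, heading: float, height: float,
                         max_speed: float = 5.0) -> dict:
    """Map treadmill measurements (m/s, rad, m) to a drone setpoint,
    capping the commanded speed at an assumed safe maximum."""
    return {
        "forward": min(walk_speed, max_speed),
        "heading": heading,
        "climb": height,
    }

def torque_to_vibration(torques: list,
                        f_min: float = 20.0, f_max: float = 80.0) -> tuple:
    """Map per-motor torques to one (frequency, amplitude) vibration pair.
    Larger torques yield a higher frequency and amplitude, so different
    maneuvers are felt differently on the treadmill."""
    peak = max(abs(t) for t in torques)
    t_max = 1.0  # assumed normalization: torque at full thrust
    level = min(peak / t_max, 1.0)
    freq = f_min + level * (f_max - f_min)
    return freq, level
```

For example, a hover (near-zero torques) maps to the low end of the frequency range, while an aggressive maneuver saturates at the high end.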
A demo was presented at GITEX, the world's largest tech event, in Dubai: a user controlled the drone using his body movements, based on the 4K 360◦ video feedback, while the drone flew in Abu Dhabi, around 100 km away, with an end-to-end video delay below 500 ms.
We also tested the setup from Aalto University while the drone was at TII in Abu Dhabi, UAE; there, a user controlled the drone using VR joysticks, and we measured an end-to-end delay of 900 ms.