Virtual Reality and haptic feedback, or how to revolutionise the automotive sector

26 September 2019
At AUSY’s innovation hub, several projects are chosen each year to be developed by the Augmented Reality/Virtual Reality team.

2019 is the year of the car at AUSY. Embedded electronics and new technologies are becoming increasingly important in vehicle design. The proof? Connected vehicles are all the rage: there will be 220 million by 2025. Totally driverless vehicles (i.e. level 5 vehicles that require no human intervention or special road conditions) are extremely popular among industry professionals and the general public, although none are currently on the roads. However, we expect there to be around 80 million by 2030. Their arrival is imminent! Virtual Reality (which plunges the user into a virtual world based on a model of a real environment) and Augmented Reality (which displays virtual information in 2D or 3D in a real environment) are two technologies that can effectively meet the current and future challenges of automotive development.

It is with this in mind that we chose to design a driving simulator with a dual objective:

  • to demonstrate our expertise in Virtual Reality by modelling a real environment in 3D, and
  • to use existing hardware to create haptic feedback, which deepens the driver’s immersion (force feedback from the steering wheel, vibrations in the bucket seat, and sensations of acceleration and braking from the pedals).

 

A simulated and immersive driving experience

We had a very specific goal when we embarked on this large-scale venture: to enable inexperienced drivers to understand vehicle handling and the rules of safe driving, while also introducing them to Virtual Reality.

The aim of this simulation is to plunge the user into an immaterial world that is both realistic and immersive. First and foremost, it demonstrates that just about anything is possible with Virtual Reality. We wanted to recreate a wide range of driving sensations, engaging sight, touch and hearing, to maximise the user’s sense of being part of this virtual environment.

 

But what does “immersive simulation” actually mean?

Simulation allows us to reproduce a real environment in a virtual world thanks to cutting-edge modelling and virtualisation techniques.

We use the term “immersion” because everything is designed to make the user “forget” that he/she is actually inhabiting a world that is not real, thanks to autonomous entities, haptic feedback, spatialised audio, etc.

We will discuss these topics later in this article.

Visuals


The first step is to create a virtual décor for the subject to inhabit. It must resemble a realistic physical environment, i.e. one with which we interact daily. So it comprises buildings, streets, decorative elements and entities that move around thanks to Artificial Intelligence.

In order to transform our real environment into a virtual one, we use modelling techniques that rely on two main software programmes:

  • Blender, which sculpts, textures and animates elements in 3D, and
  • Unity 3D, which makes it possible to assemble all the different elements required to recreate the world; the various interactions and functionalities are programmed in C#.

To achieve maximum realism, we also consider lighting, textures, visual effects and post-processing effects to obtain a high quality image within the constraints of Virtual Reality.

 


To enter this virtual world, the driver must don a headset that completely covers his/her field of vision. The headset displays a slightly different image to each eye, which the brain fuses into a single 3D image. Sensors enable the software to adapt the image so that it tracks the movements of the user’s head. To maximise the driver’s immersion, the images must be as fluid as possible, so the software renders between 90 and 120 images per second. By comparison, a television displays around 30 images per second. The higher the number of images per second, the more fluid and pleasant the experience will be for the user.
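Those frame rates translate directly into a time budget for rendering each image. A quick sketch of the arithmetic (the figures are the ones quoted above):

```python
def frame_budget_ms(fps: int) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

# At 90 fps the renderer has roughly 11 ms per frame, and at 120 fps
# barely 8 ms, versus about 33 ms for a 30 fps television picture.
for fps in (30, 90, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

This is why lighting, textures and post-processing must be tuned so carefully: every effect has to fit inside a budget three to four times tighter than television’s.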

Autonomous entities


To better mimic our world, we have developed autonomous entities that reproduce behaviours we would normally see in reality. These entities help raise the driver’s awareness and instil good driving reflexes. To define their trajectories, we used Artificial Intelligence and, more specifically, the A* pathfinding algorithm. It is widely employed thanks to its ease of use: with a suitable heuristic, the first route (or path) it finds is the shortest between two points, and it is impressively fast, although relaxing the heuristic trades accuracy for speed. In our solution, calculation speed mattered most, so that the entities could react rapidly to the driver’s range of behaviours. To complement our algorithm, we used Unity’s NavMesh system. This tool is to our entities what a road map is to humans: it helps the AI define the best itinerary for different situations, and it enables us to define priority rules in the event of a conflict.
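Unity’s NavMesh handles the pathfinding for us, but the core of A* can be sketched in a few lines. The grid-based Python below is an illustration of the algorithm, not the code used in the simulator:

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid; 0 = free cell, 1 = obstacle.

    Returns the shortest path from start to goal as a list of
    (row, col) cells, or None if no path exists.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan distance: an admissible grid heuristic
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]   # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}

    while open_heap:
        f, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            # Walk the parent links back to the start.
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        if g > best_g.get(cell, float("inf")):
            continue  # stale heap entry, a shorter route was found since
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cell
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
path = a_star(grid, (0, 0), (2, 0))  # routes around the obstacle row
```

The heuristic is what makes A* fast: it steers the search towards the goal instead of exploring in every direction, which is exactly the speed-first trade-off described above.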

Audio

Hearing is an incredibly important sense: if the user hears noises from the real world, the immersion is broken. To mitigate this, we created a spatialised audio environment, recreating a soundscape for the virtual world in which sounds are heard from all directions.

To do this, we used Google’s Resonance Audio development tool, which is compatible with Unity.

Resonance Audio takes into account all the parameters, including the materials of the objects in the 3D world, in order to propagate sound in a hyper-realistic way.

 


Source: resonance-audio.github.io

But how does this actually work?

In the real world, our brain works out the distance and direction of a sound from the differences in timing and intensity between the sound waves reaching each ear. Google’s Resonance Audio tool simulates this natural phenomenon. Once immersed in the simulation, the driver hears sounds coming from specific positions nearby, and the software adapts to the user’s behaviour: if a sound comes from the left and the driver moves to the right, the sound gets quieter in proportion to the distance travelled.
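The distance effect alone can be illustrated with the classic inverse-distance law for a point source. This is a simplified sketch of the principle, not Resonance Audio’s actual propagation model:

```python
import math

def attenuated_gain(source_pos, listener_pos, min_distance=1.0):
    """Inverse-distance attenuation for a point source.

    Gain is 1.0 inside min_distance and falls off as 1/d beyond it,
    a common simplification in spatial audio engines.
    """
    d = math.dist(source_pos, listener_pos)
    return min(1.0, min_distance / max(d, 1e-9))

# A source two metres to the driver's left:
near = attenuated_gain((-2.0, 0.0), (0.0, 0.0))  # d = 2 -> gain 0.5
# The driver moves two metres to the right, doubling the distance:
far = attenuated_gain((-2.0, 0.0), (2.0, 0.0))   # d = 4 -> gain 0.25
```

Each doubling of the distance halves the gain, which is the behaviour the driver perceives when moving away from a sound source in the simulation.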

We generally source our sounds from online audio banks. If necessary, we can record our own sounds to perfectly recreate reality.

 

Haptic feedback

AUSY attaches great importance to haptic feedback and touch in its Virtual Reality simulations. Haptic feedback refers to the vibrations or sensations we feel when we touch something. A good example is the smartphone and, more specifically, Apple’s phones: when a user presses and holds the screen, he/she feels a slight vibration. Haptic feedback has grown exponentially in recent years. It is particularly popular among video game developers as it gives players a better immersive experience. Car manufacturers are also very keen on this technology: BMW recently equipped its in-car screens with haptic feedback (the sensation is the same as on a smartphone screen), while Renault sends perceptible vibrations into the cabin to warn the driver that he/she is moving off without indicating.

 


In our driving simulation, we included a steering wheel, pedals and a bucket seat with force feedback so that the user really feels like he/she is in a car. We coded the simulation with a complete set of vibrations that we can activate according to the situation. So, if the user skids or makes an emergency stop, he/she will feel vibrations in the steering wheel thanks to an integrated motor.

Other devices are compatible with Virtual Reality, such as gloves (Glove One, Manus VR, etc.), which convey touch (textures, heat, cold, etc.), and shoes (currently under development) that let the driver feel the texture of the terrain being crossed (hard, granular, soft, sandy, etc.).

Each simulation suits some haptic devices better than others. In our case, a steering wheel with force feedback is enough to create a rich and immersive experience. Thanks to Logitech’s SDK, we have been able to recreate the vibrations and resistance forces felt when braking, skidding or spinning.
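In outline, the simulation maps each driving situation to a vibration effect that is then played on the wheel’s integrated motor. The Python sketch below is purely illustrative: the effect names and parameters are hypothetical, and the real triggering goes through Logitech’s SDK:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VibrationEffect:
    """Parameters for one haptic effect (illustrative, not SDK types)."""
    magnitude: float   # 0.0 (off) .. 1.0 (full strength)
    duration_ms: int

# Dispatch table: driving situation -> effect to play on the wheel.
# Situations and values here are examples, not the simulator's own data.
EFFECTS = {
    "skid":           VibrationEffect(magnitude=0.8, duration_ms=400),
    "emergency_stop": VibrationEffect(magnitude=1.0, duration_ms=250),
    "rumble_strip":   VibrationEffect(magnitude=0.4, duration_ms=150),
}

def effect_for(situation: str) -> Optional[VibrationEffect]:
    """Look up the vibration to play, or None for normal driving."""
    return EFFECTS.get(situation)
```

Keeping the mapping in one table makes it easy to tune each sensation independently, or to add new situations without touching the driving logic.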

Conclusion

Through this Virtual Reality driving simulation project, AUSY has clearly demonstrated its expertise in 3D development and haptic feedback. Immersive simulations are also great tools for automotive industry professionals: they make it possible to improve vehicle ergonomics (lighting, layout, interior design, etc.), test new technologies during development, and evaluate driver assistance systems safely and at low cost.

The next step will be to use these technological building blocks to develop algorithms for autonomous cars. Testing them in a virtual world will enable us to check their performance before conducting full-scale trials.

 

 

Passionate about new technologies and innovation, Anthony specialises in developing immersive Virtual Reality/Augmented Reality simulations using Unity 3D. He loves to learn and share his knowledge, and has published several books on developing real-time 3D solutions, for example Developing innovative apps with Unity (Editions d-booker, 13 April 2017), in which he explains how to use Virtual Reality to create six innovative applications. Anthony is head of AUSY’s 3D/AR/VR Hub in Sophia Antipolis, where he and the project team help customers develop their innovative projects.

 

Do not hesitate to check out our other page on Smart Innovation and Automotive

Let's have a chat about your projects
