Public Speaking Training and Virtual Worlds

According to experts from universities, consultancies and public authorities, Immersive Technologies and Virtual Worlds (VWs) offer huge opportunities to transform society by enhancing education. A qualified workforce is indeed key to long-term socio-economic development, and VWs can help reach this objective. Public Speaking Skills (PSS) are particularly sought after, since they are required in numerous management contexts (pitching a product to investors, interacting with customers and collaborators…). Despite this societal impact, PSS training using Virtual Reality (VR) has received little attention in the literature.
Our project aims to fill this gap by covering the entire development process of efficient PSS training modules: from theory to practice, with a complete field-tested solution. Tailor-made, state-of-the-art environments are developed. Although focused on PSS training using VR, the project will contribute to broader research by addressing unanswered questions about the use of immersive technologies. The creation of virtual crowds is an essential step, since crowds are crucial in any VW and even more so in the context of PSS. Such environments should display a virtual audience that interacts appropriately with users by adopting realistic social behaviors.
...


Public Speaking Training

We have developed several environments to meet the specific needs of different disciplines. Some features of our VR environments:

  • Specific training scenarios developed by experts.
  • Easy access and use: developed for Meta Quest VR headsets in standalone mode. No computer needed.
    Hand tracking: no need for controllers or joysticks.
  • Multi-user. You can be the trainee or, for richer interactions or more complex scenarios, join as a member of the audience...from anywhere in the world.
  • High-quality immersion and a high level of presence. Truly immersive environments with many advanced features such as photo-realistic avatars based on real people, libraries of validated virtual agent attitudes, artificial intelligence for richer interactions...
  • Replay. After your performance, relive it in 3D, this time as a member of the audience.
  • Advanced feedback. A companion website where you (or your professor) can upload your slides and notes, watch a 2D replay and review several key performance indicators (see the sketch after this list).
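The indicators actually computed by the companion website are not detailed above. As a purely hypothetical illustration of what such key performance indicators could look like, the Python sketch below derives speaking rate, pause ratio and filler-word rate from a timestamped transcript; the input format, filler list and metric choices are assumptions, not the implemented pipeline.

    # Hypothetical illustration only: the input format and metrics below are
    # assumptions, not the indicators actually computed by the website.

    def speaking_indicators(words, fillers=("uh", "um", "euh")):
        """Compute simple delivery metrics from a timestamped transcript.

        `words` is a list of (word, start_seconds, end_seconds) tuples,
        assumed to come from an upstream speech-to-text step (not shown).
        """
        if not words:
            return {}
        total_time = words[-1][2] - words[0][1]          # whole performance span
        spoken_time = sum(end - start for _, start, end in words)
        filler_count = sum(1 for w, _, _ in words if w.lower() in fillers)
        return {
            "speaking_rate_wpm": 60.0 * len(words) / total_time,
            "pause_ratio": 1.0 - spoken_time / total_time,
            "filler_rate_per_min": 60.0 * filler_count / total_time,
        }

    # Example with made-up timings:
    demo = [("hello", 0.0, 0.4), ("um", 0.9, 1.1), ("welcome", 1.5, 2.0)]
    print(speaking_indicators(demo))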


High School classroom

Partners: M-N. Hindryckx, A. Huby, C. Cravatte (Unit for Science Awakening and Biology Didactics)

This virtual reality environment was created for our colleagues from the Faculty of Science who are responsible for training future secondary school biology teachers. The trainee teacher is immersed in a virtual secondary school classroom facing around twenty adolescents. The behavior of these students depends on scenarios defined by our colleagues. This is a multi-user environment. Some of the virtual teenagers can be embodied by members of the teaching staff or by other students wearing VR headsets, while still appearing as teenage students. This allows the audience's behavior to go beyond predefined scenarios, creating a highly realistic situation for the trainee teacher. It also eliminates the typical bias found in role-playing exercises where audience members are known to the participant and are not perceived as actual secondary school students.


Board room and Meeting room

Whether delivering a progress report, presenting a proposal to a board, pitching a product or service to clients or investors, or leading a lecture or workshop, effective public speaking is a core professional skill. Yet for many, public speaking remains a source of significant anxiety. While traditional training often relies on in-person coaching, VR offers a promising alternative by providing realistic, interactive, and repeatable scenarios within a controlled environment. Our system offers, by default, a range of possible audience attitudes: from supportive and engaged to negative or passive. Artificial intelligence techniques are integrated to allow the audience’s attitude to be influenced in real time based on the speaker’s verbal and non-verbal behavior.
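The AI techniques themselves are not described here. As a minimal sketch of the general idea, and assuming hypothetical cue names and weights, the snippet below nudges a scalar audience-attitude value toward the speaker's current verbal and non-verbal cues.

    # Illustrative sketch only: cue names, weights and the update rule are
    # assumptions, not the system's actual model.

    def update_audience_attitude(attitude, cues, dt, decay=0.1):
        """Move a scalar attitude in [-1, 1] toward the speaker's current cues.

        attitude: current value (-1 = negative/passive, +1 = supportive/engaged).
        cues:     dict of normalized observations in [0, 1], e.g. eye contact,
                  vocal energy, filler-word rate (higher = more fillers).
        dt:       seconds elapsed since the last update.
        """
        weights = {"eye_contact": 0.5, "vocal_energy": 0.3, "filler_rate": -0.4}
        target = sum(weights.get(name, 0.0) * value for name, value in cues.items())
        target = max(-1.0, min(1.0, target))
        # Exponential smoothing: the audience reacts gradually, not instantly.
        attitude += decay * dt * (target - attitude)
        return max(-1.0, min(1.0, attitude))

    # Example: ten 0.1 s updates for a speaker with good eye contact and few fillers.
    a = 0.0
    for _ in range(10):
        a = update_audience_attitude(a, {"eye_contact": 0.9, "vocal_energy": 0.6, "filler_rate": 0.1}, dt=0.1)
    print(round(a, 3))

In the actual environments, such a value could drive the selection of validated virtual agent attitudes; the mapping shown here is illustrative only.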


Courtroom

Partners: Y-H. Leleu (ULiège Faculty of Law) - J.W. Cho, T. Jung (Manchester Metropolitan University)


Office - recruitment process

Partners:


Auditorium

Partners:


Master's thesis and entrepreneurship student presentations

Partners:


EVE: Emotional Voice Expressions, an acted audiovisual corpus

We present a project aimed at creating a high-quality, phonetically balanced emotional database (audio and video, in French and English) containing Ekman's six basic emotions and other emotions common to public speaking. The database is intended to support the automatic detection of an individual's emotions using artificial intelligence. For each language, we have designed two studies. The first involves recording phonetically balanced sentences with the 10 emotions (each at 2 levels of intensity) produced by 12 actors. The second involves the validation of each recording by 25 different judges. We believe that our project will be particularly useful for public speaking training and for the creation of virtual reality training environments.
In addition, our database will be public and open to the scientific community, and can be used for training purposes.
Database available soon.
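Since the corpus is not yet released, the sketch below only illustrates how its metadata could be organized for later AI work; every field name, the label coding and the agreement threshold are assumptions, not the published format.

    # Hypothetical sketch: the released layout, field names and label coding
    # are assumptions made for illustration only.
    from dataclasses import dataclass, field

    @dataclass
    class Recording:
        actor_id: int          # one of the 12 actors per language
        language: str          # "fr" or "en"
        sentence_id: int       # index of the phonetically balanced sentence
        emotion: str           # one of the 10 emotion labels
        intensity: int         # 1 = moderate, 2 = strong (assumed coding)
        audio_path: str
        video_path: str
        judge_labels: list = field(default_factory=list)  # labels chosen by the 25 judges

    def validated(recordings, min_agreement=0.7):
        """Keep recordings whose judged label matches the intended emotion often enough."""
        kept = []
        for rec in recordings:
            if not rec.judge_labels:
                continue
            agreement = sum(1 for lab in rec.judge_labels if lab == rec.emotion) / len(rec.judge_labels)
            if agreement >= min_agreement:
                kept.append(rec)
        return kept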