Special track 1:
Exploiting Artificial Intelligence to improve eXtended Reality capabilities (AI4XR)
Roberto Pierdicca, Emanuele Frontoni, and Marina Paolanti
(Polytechnic University of Marche, Italy)
Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) are nowadays recognized under the umbrella of eXtended Reality (XR), which describes applications designed to enhance human perception through digitally generated content. Their use has proved valuable across a wide range of disciplines, including medicine, industry, gaming, and cultural heritage, to mention only a few.
These technologies have reached an unprecedented degree of maturity. Devices are powerful, infrastructural limitations have been overcome, user interfaces and user experiences are increasingly engaging, and tracking accuracy is ever more reliable. At the same time, Artificial Intelligence (AI) has revolutionized how scientists design and conduct their experiments. AI is outperforming conventional approaches to data processing and interpretation, and, thanks to the availability of large-scale datasets and faster computation, deep learning approaches are becoming pervasive among scientists and developers. Nevertheless, there is still a gap to be filled between XR and AI, since the convergence of these disciplines remains only partially explored.
The purpose of this special track is to invite researchers to share innovative and original works in which XR applications are combined with AI-based approaches.
Papers dealing with, but not limited to, the following areas are welcome:
AI technologies for VR/AR/XR
- Content creation and modeling
- Generation of immersive environments and virtual worlds
- AI-based platforms for XR, cloud-based platforms
- Tracking, physical environment mapping, and registration
- Standards and theoretical models for AI and/or VR
- Applications and use cases
- Ethical and societal aspects of AI and XR
AI technologies for interactive and responsive environments
- Multimodal interaction and experiences in XR
- Machine learning for multimodal interaction
- Human-virtual user/agent interaction
- Human-to-human communication and collaboration in virtual environments
- Dialogue modeling and generation, conversational and natural language interfaces, speech interaction for XR
- Navigation and spatial orientation in VR
Special track 2:
eXtended Learning (XL)
Roberto Pierdicca, Emanuele Frontoni, and Marina Paolanti
(Polytechnic University of Marche, Italy)
Nowadays, the new opportunities offered by ICT in educational environments encourage teachers to adopt new methods to improve the quality of learning. Technology has proved to be a helpful aid in several domains, including education: it eases teaching and improves performance by providing affordable and reliable means of conveying digital content. Studies have revealed that Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR), nowadays recognized under the umbrella of eXtended Reality (XR), have great potential to help two kinds of users: on the one hand students, by improving their knowledge and skills; on the other, teachers, by widening their teaching methods. Besides the educational aspect, which can benefit from a more efficient and stimulating experience, XR can be a valuable aid for distance learning, which is fundamental in emergency situations.
The aim of this special track is to bring together practitioners and researchers studying eXtended Reality for learning, including methodologies, technologies, software, learning outcome evaluation, and educational data analysis. It will serve as a forum to exchange good practices and points of view about the future of eXtended Reality technologies for the teaching-learning process. Additionally, given the great impact that the Covid-19 pandemic has had on education, case studies exploiting eXtended Reality for distance learning will be more than welcome.
Papers dealing with, but not limited to, the following areas are welcome:
- Educational platforms for eXtended Learning
- XR applications for education
- Learning analytics
- Evaluation for XR-based education
- Distance and e-learning with XR
- Security aspects
- Standards and interoperability
- Ontologies and metadata standards
- Learning assessment methodologies
- Knowledge retention
- Collaborative learning
- New teaching strategies based on XR
Special track 3:
Integrating eXtended Reality and Brain-Computer Interfaces (XR-BCI)
Nicola Moccaldi (University of Naples, Italy), and Antonio Esposito (Polytechnic University of Turin, Italy)
In spite of their high level of maturity, continuous technological advancements keep revealing the unexploited potential of Augmented Reality (AR) and Virtual Reality (VR). At the same time, Brain-Computer Interfaces (BCIs) are also becoming increasingly accessible for everyday applications. Indeed, thanks to the availability of new hardware and software solutions, BCI systems are now more affordable and more robust in terms of performance. A recent and challenging research trend is to combine eXtended Reality (XR) with BCI, so as to exploit the specific advantages of both technologies: while XR could increase system wearability and user engagement, BCI could provide a novel interaction modality (e.g. hands-free interaction). Starting from these considerations, this special track is open to research and review contributions on the most recent advancements in the multidisciplinary approach to XR for BCI applications.
Papers dealing with, but not limited to, the following areas are welcome:
- Instrumental solutions and measurement principles for enhancing the accuracy and robustness of AR/VR and BCI systems
- Display technologies and human vision
- Wearable sensors
- User experience, perception, and interactions in AR/VR and BCI
- Multisensory experiences and improved immersion
- Applications and case studies
- Accuracy and latency versus wearability
- BCI paradigms (passive, reactive, active)
- Psychophysical condition monitoring
- Deep learning-based classification
- VR-supported mindfulness based on EEG signals
- Immersive user experience with VR/AR-BCI