Field of activity

Research by the IRA2 team (Interaction, Virtual & Augmented Reality, Ambient Robotics) aims to improve interactions between people and complex artificial systems (virtual, augmented, robotized, computer applications). The diversity, complexity and unpredictability of the tasks to be carried out, as well as technological heterogeneity, make it necessary to design, produce, and evaluate appropriate digital assistance tools.

To achieve this objective, activities are organized around two research axes:

  • Axis 1: Perception, Interpretation & Decision (PID)

  • Axis 2: Human-System Interaction (HSI)

Axis 1: Perception, Interpretation & Decision (PID)

This axis aims to provide these artificial systems with the capabilities to:

1) perceive the environment (real or virtual) and its users in real time, using a variety of data sources from on-board sensors, communicating objects, and VR/AR interfaces;

2) interpret, and even understand, the perceived data, and provide algorithms for dynamic adaptation to an evolving context and for decision support.

Perception of the real or virtual environment:

  • Objectives: To enable artificial systems to enhance, complement, restore, and predict reality by providing sensory assistance to a user (or group of users) or to a robot in its real world (in the case of augmented reality) or virtual world (in the case of virtual reality simulation).
  • Problem: Overcome the barriers to: 1) real-time consistency between the real and virtual worlds; 2) detection, localization, and mapping of environments, and robust multi-sensor tracking of targets (objects, users, or robots).
  • Approaches: Descriptor-based approaches, model-based image analysis/synthesis, SLAM based on bundle adjustment and planar modeling (see the pose-estimation sketch after this list).
  • Keywords: Augmented reality, virtual reality, sensor calibration, 3D modeling and registration, prediction, 2D/3D matching, robust hybrid tracking, localization.
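
To make the descriptor-based registration and localization problem more concrete, here is a minimal Python sketch using OpenCV: it matches ORB descriptors between a live camera frame and a reference view whose keypoints have known 3D coordinates, then recovers the camera pose with a RANSAC PnP solver. The inputs (reference descriptors, associated 3D points, camera intrinsics) are hypothetical placeholders, not the team's actual pipeline.

```python
# Minimal sketch: descriptor-based 2D/3D matching and robust camera pose
# estimation with OpenCV. The reference descriptors, their associated 3D
# points and the camera intrinsics K are hypothetical inputs.
import numpy as np
import cv2

def estimate_pose(frame_gray, ref_desc, ref_points_3d, K):
    """Match ORB descriptors against a reference view, then solve PnP."""
    orb = cv2.ORB_create(nfeatures=1000)
    keypoints, desc = orb.detectAndCompute(frame_gray, None)
    if desc is None:
        return None

    # Brute-force Hamming matching with cross-check to discard weak matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(ref_desc, desc)
    if len(matches) < 6:
        return None

    # Build 3D-2D correspondences: reference descriptor i <-> known 3D point i.
    obj_pts = np.float32([ref_points_3d[m.queryIdx] for m in matches])
    img_pts = np.float32([keypoints[m.trainIdx].pt for m in matches])

    # Robust pose estimation: RANSAC rejects remaining outlier matches.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        obj_pts, img_pts, K, None, reprojectionError=3.0)
    return (rvec, tvec) if ok else None
```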

Interpretation & decision:

  • Objectives: To enable an artificial system to analyze and interpret the data it receives, and to build decision-support algorithms that facilitate the use of the data by a user (or group of users) or by a robot.
  • Challenge: Overcome the challenges of: 1) context-dependent adaptability of Human-Computer Interaction; 2) gesture and emotion recognition by a robot; 3) context adaptation for ambient and personal assistance.
  • Approaches: Heterogeneous data processing and fusion, classification, multi-agent cooperation, coalition and negotiation mechanisms, presence models (see the fusion and classification sketch after this list).
  • Keywords: Augmented reality, virtual reality, machine learning, context awareness, multi-agent systems, cobotics.
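
As an illustration of the fusion-then-classification approach, the sketch below concatenates feature vectors from heterogeneous sources (inertial, audio, object presence) into a single sample and trains a classifier to recognize a context label. The sensor sources, feature names, labels, and the use of scikit-learn on synthetic data are all hypothetical; this shows the principle only, not the team's models.

```python
# Minimal sketch of heterogeneous data fusion followed by classification.
# All sources, features and labels are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def fuse_features(imu_window, audio_features, object_presence):
    """Early fusion: concatenate per-sensor feature vectors into one sample."""
    return np.concatenate([
        imu_window.mean(axis=0),          # summary of inertial data
        imu_window.std(axis=0),
        audio_features,                   # e.g. precomputed audio statistics
        object_presence.astype(float),    # binary presence of tagged objects
    ])

# Synthetic data: 200 samples, each fused from three heterogeneous sources.
rng = np.random.default_rng(0)
X = np.stack([
    fuse_features(rng.normal(size=(50, 3)), rng.normal(size=8),
                  rng.integers(0, 2, size=4))
    for _ in range(200)
])
y = rng.integers(0, 3, size=200)          # hypothetical context labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```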

Axis 2: Human-System Interaction (HSI)

This axis aims to formalize concepts and develop multimodal interaction and communication techniques, in virtual reality, augmented reality, or robotics, that are adapted to humans in their environment (real or virtual), but also to an assistant robot operating in its environment, whether indoor or outdoor.

Interaction in real, augmented or virtual environments:

  • Objectives: Offer the user (or group of users) natural, multimodal 3D interaction techniques (navigation, selection, manipulation, application control) to improve task and user performance (notably by reducing learning curves and ensuring multisensory coherence).
  • Challenge: To overcome the challenges of: 1) maintaining sensory coherence in space and time between a user's action and the reaction of AR/VR systems; 2) the characteristic limitations of AR/VR interfaces and task constraints; 3) improving interface usability and patient motivation; 4) the fidelity of 3D interactions in virtual simulators; 5) rapid prototyping of mobile AR applications.
  • Approaches: User-centered design, approaches based on engagement/motivation mechanisms (motivation theories), virtual guide mechanisms (see the sketch after this list), the functional clover model of groupware, presence models, and component-based architecture.
  • Keywords: Augmented reality, virtual reality, 3D interaction, natural interaction, tangible interfaces, human factors, collaborative work, software architecture.
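
To give a concrete sense of a virtual guide mechanism, the minimal sketch below snaps a tracked hand position onto a guide segment whenever it comes close enough, so the manipulated object follows the guide rather than the raw input. The geometry and thresholds are illustrative assumptions, not the team's implementation.

```python
# Minimal sketch of a "virtual guide" for 3D manipulation: the tracked hand
# position is projected onto a guide segment when it is within a snap radius.
import numpy as np

def apply_virtual_guide(hand_pos, guide_a, guide_b, snap_radius=0.05):
    """Project hand_pos onto segment [guide_a, guide_b] if within snap_radius."""
    hand_pos, guide_a, guide_b = map(np.asarray, (hand_pos, guide_a, guide_b))
    ab = guide_b - guide_a
    t = np.clip(np.dot(hand_pos - guide_a, ab) / np.dot(ab, ab), 0.0, 1.0)
    closest = guide_a + t * ab            # nearest point on the guide
    if np.linalg.norm(hand_pos - closest) <= snap_radius:
        return closest                    # snapped: follow the guide
    return hand_pos                       # free manipulation otherwise

# Usage: a hand hovering 3 cm from the guide is snapped onto it.
print(apply_virtual_guide([0.10, 0.03, 0.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]))
```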

Human-Robot Interaction & Cobotics:

  • Objectives: Enable natural, multimodal human-robot interactions (remote or physical).
  • Issue: Overcome the obstacles to: 1) teleoperation in the presence of delays (see the sketch after this list); 2) recognition of gestures and emotions by a robot; 3) malleability of human-robot collaborative systems; 4) human-robot co-perception (sharing of senses and knowledge between human and robot to build a mutual and enhanced perception).
  • Approaches: human-robot interaction modeling, co-presence model, learning and adaptation to changes (environment/user).
  • Keywords: Human-robot interaction, augmented reality, virtual reality, teleoperation, learning, cobotics.
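
As a simple illustration of the delayed-teleoperation problem, the sketch below applies one classic mitigation: extrapolating the remote robot's last reported state so the operator's display shows a predicted pose rather than a stale one. The constant-velocity assumption, the 300 ms delay, and the function name are hypothetical; this is an illustration of the problem, not the team's method.

```python
# Minimal sketch of delay compensation for teleoperation: predict the remote
# robot's current position from its last delayed report, assuming a known
# delay and roughly constant velocity over that delay.
import numpy as np

def predict_remote_state(last_pos, last_vel, delay_s):
    """Constant-velocity extrapolation of the delayed robot position."""
    return np.asarray(last_pos) + np.asarray(last_vel) * delay_s

# The operator's display shows the predicted pose instead of the stale one,
# here with a hypothetical 300 ms round-trip delay.
delayed_pos = [0.40, 0.10, 0.25]      # last position received (metres)
delayed_vel = [0.05, 0.00, -0.02]     # last velocity received (m/s)
print(predict_remote_state(delayed_pos, delayed_vel, delay_s=0.3))
```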

Application fields: Health & Industry

  • Personalized medicine
  • Intelligent, sustainable, and cooperative mobility

 
