Hello there, I am
Matheus D. Negrão

PhD Student in Virtual Reality and Mixed Reality

Curriculum Vitae Publications

about me

PhD Student and Researcher in Virtual and Mixed Reality, focused on health applications.

He is pursuing his PhD in the Postgraduate Program in Computing (PPGC) at the Federal University of Rio Grande do Sul (UFRGS), Brazil, researching Mixed Reality approaches to multi-user collaboration in health education and training environments, advised by Prof. Luciana Nedel and Prof. Anderson Maciel. Matheus holds a Master's in Computer Science (2021-2023) from UFRGS, advised by Prof. Anderson Maciel, in the same research area, with a focus on Augmented Reality interfaces. He also holds a Bachelor's in Computer Science from the Federal University of Fronteira Sul, Brazil.
Over the last two years, Matheus has also worked as Head of Innovation at MetaHealth, leading the development team, connecting with medical schools to deploy the solution, and giving presentations and demos of the project to potential new users. This related R&D project focuses on immersive environments for training medical students in clinical skills and patient interviewing before their first direct contact with patients.

email

mdnegrao@inf.ufrgs.br

education

2024 - Current

PhD in computer science

Federal University of Rio Grande do Sul (UFRGS)

Porto Alegre, Brazil

The initial focus is on a one-on-one guidance approach for first contact with training environments: how to handle virtual medical instruments and how to interact with the virtual patient. The plan is to expand this approach so that a large team can participate in an immersive environment through asymmetric collaboration, ultimately delivering an application for multidisciplinary team collaboration in a virtual reality environment that addresses the particular challenges of these conditions.

2021 - 2023

Master of computer science

Federal University of Rio Grande do Sul (UFRGS)

Porto Alegre, Brazil

My master's thesis focused on characterizing multiple affordances using AR for laparoscopy and on planning a mixed-reality application for Remote Collaboration in Minimally Invasive Abdominal Surgery (MIAS). An augmented reality interface was developed for novice surgeons to place annotations and train procedures, along with the planned connection, through a mixed reality application, to a remote expert surgeon for collaboration.

2017 - 2021

Bachelor of computer science

Federal University of Fronteira Sul (UFFS)

Chapecó, Brazil

My undergraduate thesis developed a methodology to streamline and facilitate the modeling of non-trivial behavior in general simulation software. The work presents the process of building a configurable component that represents the behavior of events that may occur during passengers' access procedures to the departure lounge at airport terminals.

Publications

  • Characterizing head-gaze and hand affordances using AR for laparoscopy

    doi.org/10.1016/j.cag.2024.103936


    2024

    Computers & Graphics - Special Section on XR for Health

    Laparoscopic surgery techniques are complex and impose postural and communication constraints on the surgeon that may affect surgery outcomes. This paper explores the possibilities of designing intraoperative AR interfaces for laparoscopic surgery. We suggest that the laparoscopic video be displayed on an AR headset and that surgeons consult preoperative image data on that display. Interaction with these elements is necessary. Thus, we propose a head-gaze and hand clicker approach that is effective and minimalist, as well as the implementation of a prototype. We conduct a user study to evaluate the prototype and to understand the impact and improvements that head-gaze average filtering and the scale method can bring when performing annotations in the laparoscopy video feed on a virtual monitor positioned straight in front of the user. The user experiment was performed in a between-subject protocol with 32 volunteers from the Institute of Informatics, and the proposed task involved communication through the interface by drawing annotations with the proposed interaction approaches: a hand device for input confirmations and the head-gaze stabilization methods for pointing and selection. The study found that the users were confident about their performance and demonstrated low physical and temporal demand. The proposed head-gaze methods showed that, independent of the stabilization applied, the difference in error sensitivity between the axes of the head-gaze in annotation positioning is significant. The vertical axis presented a higher error rate than the horizontal axis. When comparing other variables, we found some differences in specific circumstances, but overall, the interaction with HL1 is very evenly distributed.

  • Design and think-aloud study of an immersive interface for training health professionals in clinical skills

    doi.org/10.1145/3625008.3625037


    2023

    Proceedings of the 25th Symposium on Virtual and Augmented Reality

    Virtual simulation has been used for decades to improve the practice of technical skills by physicians. However, an important aspect of medical education is learning and practicing non-technical skills, which are commonly taught in role-playing sessions. Using VR simulation for such kind of training would optimize supervision time and allow for training in multiple scenarios. In this paper, we introduce a framework and application to support healthcare professionals in learning multi-step procedures and assess students' and candidates' readiness. The framework is an immersive system that allows users to perform general healthcare tasks in a natural environment. It does not require constant monitoring by an expert, making it accessible to larger groups of students and candidates with reduced time and personnel resources. The design also considers individual adaptation to virtual reality by providing in-game ergonomics and interaction accuracy settings. The system was tested through a user study with a pediatrician scenario and underwent a preliminary evaluation in a major hospital.


  • Exploring affordances for AR in laparoscopy

    doi.org/10.1109/VRW58643.2023.00037


    2023

    IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)

    This paper explores the possibilities of designing AR interfaces to be used during laparoscopic surgery. It suggests that the laparoscopic video be displayed on AR headsets and that surgeons can consult preoperative image data on that display. Interaction with these elements is necessary, and no patterns exist to design them. Thus, the paper proposes a head-gaze and clicker approach that is effective and minimalist. Finally, a prototype is presented, and an evaluation protocol is briefly discussed.

contact me

Matheus D. Negrão

PhD Student and Researcher


email

mdnegrao@inf.ufrgs.br

website

manegrao.github.io

Address

Porto Alegre, Brazil