Email:
Edgar Rojas Muñoz
Early-career faculty working in Mixed
Reality and Human-Computer Interaction
ABOUT ME
Born and raised in tropical Costa Rica, my journey now has me exploring the thrilling corn mazes of Indiana. I am a recent PhD graduate in Industrial Engineering working on Human-Computer Interaction under the supervision of Dr. Juan Wachs. I received my Licenciatura (Lic.) degree in Computer Engineering at Instituto Tecnológico de Costa Rica. I am interested in how human collaboration, understanding, and learning can be enhanced and assessed by leveraging Virtual and Augmented Reality techniques and non-verbal communication cues (e.g., gestures).
A Licenciatura is a degree technically higher than a Bachelor's degree, but lower than a Master's degree.
As a member of the Intelligent Systems and Assistive Technologies Laboratory (ISAT Lab), I developed theories and technology to aid people, primarily in the healthcare domain. I have recently been working on a project for the United States Department of Defense to create the System for Telementoring with Augmented Reality (STAR). With STAR, surgeons in medical facilities or austere environments can receive surgical instruction from a remotely located specialist, directly in their field of view.
EDUCATION
Instituto Tecnológico de Costa Rica, Cartago. Costa Rica
Licenciatura in Computer Engineering
GPA: 83.5/100, Class Rank 3
Thesis Title: Research, Design and Construction of a Telementoring Interface for the Recognition and Capture of Real-Time Touch Gestures.
2010 - 2016
Licenciatura Degree
Purdue University, West Lafayette, IN. United States
PhD in Industrial Engineering
Current GPA: 3.75/4.00
Dissertation Title: Assessing Collaborative Physical Tasks via Gestural Analysis using the "MAGIC" Architecture.
Final Examination to be held in July 2020
2016 - 2020
Doctoral Degree
When I am not burrowed in the lab, I spend my time hiking, playing board games, and working as an in-progress novelist. If you want to learn more about this, feel free to check my Beyond Research section. You may also contact me if you have any questions; I would love to hear from you.
EXPERIENCE
2016 - 2020
Research Assistant
Purdue University, West Lafayette, IN. United States
Responsible for a variety of tasks, including my doctoral thesis project, which explores how to evaluate the collaboration between agents by analyzing the gestures they use. In addition, I am part of a surgical telementoring project sponsored by the Department of Defense.
2015 - 2016
Undergraduate Research Assistant
Purdue University, West Lafayette, IN. United States
Researched and developed a large-scale interactive display based on touch input. The device was one of the main components of a telementoring system used by expert surgeons to deliver surgical instruction remotely.
2014 - 2015
Lecturer Assistant
Instituto Tecnológico de Costa Rica, Cartago, Costa Rica
The assistantship included designing, lecturing, grading, and mentoring students for two undergraduate-level courses with an average of 40 students each: "Introduction to Coding" and "Coding Workshop".
2014 - 2015
Undergraduate Research Assistant
Instituto Tecnológico de Costa Rica, Cartago, Costa Rica
Lead researcher in the development and use of a Display Wall with autostereoscopic screens. In addition, the work included the maintenance of the laboratory's visualization clusters.
NOTABLE PROJECTS
STAR
System for Telementoring With Augmented Reality
The goal of this multidisciplinary project is to provide combat medics with real-time assistance using augmented reality. The local medic wears an augmented reality head-mounted display that shows surgical instructions authored by a remotely located mentor. These annotations are displayed onto the patient's body, at the correct position and depth (Video).
MAGIC
Multi-Agent Gestural Instruction Comparer
This project developed an architecture capable of assessing task performance through gesture-related metrics. The architecture leverages a data structure capable of abstracting a gesture's morphology, meaning, and context, which allows gestures performed by different agents (e.g., mentor and mentee) to be compared.
AI-Medic
Artificial Intelligent Mentor for Trauma Surgery
This project develops artificial intelligence to assist surgeons through medical procedures. The system uses a tablet to acquire a view of the operating field. These images are used as input to a deep neural network, which predicts text descriptions of the instructions to perform in the current step of the procedure. These instructions are then conveyed to the surgeons via text-to-speech (Video).