AR Gestural Controls for SPHERES Satellites - Work with HSL-AeroAstro-MIT



description


Worked with the Human Systems Lab (HSL) in MIT AeroAstro to develop gestural controls for the MIT-developed SPHERES (Synchronized Position Hold, Engage, Reorient, Experimental Satellites). Using a Microsoft HoloLens augmented reality interface built in Unity, users can interface with and command the satellites directly, supporting an investigation of how immersive technology can improve an astronaut's spatial awareness when performing extravehicular activities. The work also included designing a training program to onboard users with the SPHERES/HoloLens technology.

skills used/developed


- Unity/C#
- Augmented Reality
- Microsoft HoloLens
- Experiment Design
- Human Factors Engineering
- Human-centered Studies
- Training Program Design

documentation


Current methods for completing exterior inspection tasks and EVAs (extravehicular activities) rely on ISS exterior cameras, the Canadarm, and astronauts themselves. This project proposes a more efficient way to complete these tasks, using the SPHERES robots as a testbed for future free-flying inspectors. One goal, for example, is to transform the traditional 2D view produced by a camera into a 3D model that a human can interpret faster and use to make better-informed decisions. To understand how humans will interact with the SPHERES robots, the project examines different modes of communicating with them, including gestural control. My project focused on developing gestural controls for commanding SPHERES motion through an augmented reality interface, the HoloLens, used to navigate a simulated virtual environment. The research was carried out with ground-based SPHERES: unlike the on-orbit SPHERES, which have six degrees of freedom, the ground units operate with only three, gliding on a cushion of air much like a hovercraft. These simulated tests help us understand and improve the interaction between humans and the SPHERES robots, refining both the control technology and the virtual environment.
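Because the ground-based units glide only in the floor plane, any full-pose command coming out of the AR interface has to be projected down to those three degrees of freedom. Below is a minimal Unity C# sketch of that idea; the SpheresCommand type and its fields are hypothetical, introduced purely for illustration, not the project's actual command interface:

```csharp
using UnityEngine;

// Hypothetical command type, for illustration only; the real SPHERES
// command interface is not shown in this write-up.
public struct SpheresCommand
{
    public Vector3 position;     // metres, station/test-volume frame
    public Quaternion attitude;
}

public static class PlanarConstraint
{
    // Project a full 6-DOF command onto the 3 DOF available to the
    // ground-based SPHERES: planar translation plus yaw about the
    // vertical axis (Unity's y-axis).
    public static SpheresCommand ToPlanar(SpheresCommand cmd, float floorHeight)
    {
        cmd.position.y = floorHeight; // the satellite rides on the air cushion

        // Keep only the heading (yaw) component of the commanded attitude.
        Vector3 forward = cmd.attitude * Vector3.forward;
        forward.y = 0f;
        if (forward.sqrMagnitude > 1e-6f)
            cmd.attitude = Quaternion.LookRotation(forward.normalized, Vector3.up);

        return cmd;
    }
}
```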

The overall aim of this project was to develop a method by which semi-autonomous robotic systems (retaining some degree of human control) can take over human roles in tasks such as EVAs and exterior inspections, for example checking for micrometeorite damage. Using SPHERES in this research helps us better understand and model what an ideal autonomous robotic system would entail and how humans could interact with it efficiently.

Personal Contribution:
My work on this project included developing and implementing gestural controls for commanding SPHERES motion using Unity and the Microsoft HoloLens. Gesturally controlled augmented reality had not previously been applied to commanding free-flying robots on-orbit. The work combined gaze tracking with tap, pinch, and two-handed gestures (for rotating, scaling, and moving objects) to set waypoints for path planning, so that anomalies around the simulated space station could be inspected (tested with the SPHERES). This work supported Jessica Eve Todd's MIT Master's Thesis, 'Commanding small satellites for simulated spacecraft inspections using augmented reality'.
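As a rough sketch of how gaze-plus-tap waypoint placement can be wired up with the HoloLens (1st gen)-era Unity API (UnityEngine.XR.WSA.Input, available in Unity 2017-2019); the waypoint prefab and the raycast distance are placeholders, not the project's actual values:

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input;

public class GazeWaypointPlacer : MonoBehaviour
{
    public GameObject waypointPrefab; // placeholder marker shown at each commanded point

    private GestureRecognizer recognizer;

    void Start()
    {
        // Recognize the HoloLens air-tap gesture.
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.Tapped += OnTapped;
        recognizer.StartCapturingGestures();
    }

    private void OnTapped(TappedEventArgs args)
    {
        // Gaze ray: head position and view direction from the main camera.
        Ray gaze = new Ray(Camera.main.transform.position,
                           Camera.main.transform.forward);

        // Drop a waypoint where the gaze ray hits the holographic scene.
        RaycastHit hit;
        if (Physics.Raycast(gaze, out hit, 10f))
            Instantiate(waypointPrefab, hit.point, Quaternion.identity);
    }

    void OnDestroy()
    {
        recognizer.Tapped -= OnTapped;
        recognizer.Dispose();
    }
}
```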

Available HoloLens interactions (all images from the linked paper):


[Image: HoloLens interactions]

Two-Handed Gestural Controls:


[Image: two-handed gestural controls]
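A minimal sketch of the math behind a two-handed rotate/scale/move manipulation, assuming hand positions are already being tracked and fed in each frame (e.g. from HoloLens interaction events, omitted here); all names are illustrative:

```csharp
using UnityEngine;

// Two-handed manipulation sketch: the target's translation, yaw, and
// scale are driven by how the two tracked hand positions move relative
// to where the gesture began.
public class TwoHandManipulator : MonoBehaviour
{
    public Transform target;

    private Vector3 startMidpoint;
    private Vector3 startAxis;
    private float startSeparation;
    private Vector3 startPosition;
    private Quaternion startRotation;
    private Vector3 startScale;

    public void BeginManipulation(Vector3 leftHand, Vector3 rightHand)
    {
        startMidpoint = (leftHand + rightHand) * 0.5f;
        startAxis = rightHand - leftHand;
        startSeparation = startAxis.magnitude;
        startPosition = target.position;
        startRotation = target.rotation;
        startScale = target.localScale;
    }

    public void UpdateHands(Vector3 leftHand, Vector3 rightHand)
    {
        Vector3 midpoint = (leftHand + rightHand) * 0.5f;
        Vector3 axis = rightHand - leftHand;

        // Move: follow the midpoint between the hands.
        target.position = startPosition + (midpoint - startMidpoint);

        // Rotate: yaw by the angle the hand-to-hand axis has swung.
        float yaw = Vector3.SignedAngle(
            Vector3.ProjectOnPlane(startAxis, Vector3.up),
            Vector3.ProjectOnPlane(axis, Vector3.up),
            Vector3.up);
        target.rotation = Quaternion.AngleAxis(yaw, Vector3.up) * startRotation;

        // Scale: proportional to how far apart the hands have spread.
        target.localScale = startScale * (axis.magnitude / startSeparation);
    }
}
```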

Waypoint Controls:


[Image: waypoint controls]

Controls in Simulation:


[Image: waypoint controls in simulation]
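In simulation, the queued waypoints can be consumed by a simple follower that glides in the floor plane and yaws toward its direction of travel, mimicking the 3-DOF air-bearing testbed. The sketch below is illustrative only, with placeholder gains and tolerances rather than the controller actually used in the study:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative waypoint follower for the simulated SPHERES.
public class WaypointFollower : MonoBehaviour
{
    public float speed = 0.2f;             // m/s, placeholder value
    public float turnRate = 45f;           // deg/s, placeholder value
    public float arrivalTolerance = 0.05f; // m, placeholder value

    private readonly Queue<Vector3> waypoints = new Queue<Vector3>();

    public void EnqueueWaypoint(Vector3 point) => waypoints.Enqueue(point);

    void Update()
    {
        if (waypoints.Count == 0) return;

        Vector3 goal = waypoints.Peek();
        goal.y = transform.position.y; // stay in the floor plane (3 DOF)

        Vector3 toGoal = goal - transform.position;
        if (toGoal.magnitude < arrivalTolerance)
        {
            waypoints.Dequeue(); // reached this waypoint; move to the next
            return;
        }

        // Yaw toward the goal, then translate along the heading.
        Quaternion heading = Quaternion.LookRotation(toGoal.normalized, Vector3.up);
        transform.rotation = Quaternion.RotateTowards(
            transform.rotation, heading, turnRate * Time.deltaTime);
        transform.position = Vector3.MoveTowards(
            transform.position, goal, speed * Time.deltaTime);
    }
}
```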

Proposed SPHERES Experimental Setup:


[Image: proposed SPHERES experimental setup]