Animation

Property of DreamWorks Animation

Layout, camera, and animation by Eric Hayes. Created with How to Train Your Dragon 2 assets in Premo. Music by John Powell.

Mars VR

Princeton University Thesis. Advisor: Szymon Rusinkiewicz

With input from NASA JPL, I used photogrammetry and machine learning to build an immersive reconstruction of the Mars surface from images taken by the Curiosity rover. The experience is a prototype for astronaut training and mission operations applications.

An image-colorization convolutional neural network was trained on full-color rover images. Grayscale navigation images (which are more plentiful and cover more of the Martian surface) were then colorized and reconstructed into 3D models using structure-from-motion and multi-view stereo, and the resulting environments were stitched together into an immersive VR training experience.
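
For illustration only, here is a minimal sketch of the colorization step, not the project's actual model: a small encoder-decoder CNN in PyTorch that predicts the a/b chrominance channels of Lab-space images from the L lightness channel. The architecture, image size, and training loop below are assumptions.

```python
# Hypothetical colorization sketch (not the MarsVR training code):
# a small encoder-decoder CNN that predicts Lab a/b channels from the L channel.
import torch
import torch.nn as nn

class ColorizationNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),   # downsample
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1), nn.Tanh(),              # a/b in [-1, 1]
        )

    def forward(self, L):
        # L: (batch, 1, H, W) lightness channel scaled to [0, 1]
        return self.net(L)

model = ColorizationNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy batch standing in for color rover images converted to Lab and split
# into L (input) and normalized a/b (target) channels.
L_batch = torch.rand(4, 1, 128, 128)
ab_batch = torch.rand(4, 2, 128, 128) * 2 - 1

for step in range(10):
    pred_ab = model(L_batch)
    loss = loss_fn(pred_ab, ab_batch)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

At inference time, a grayscale navigation frame would be fed in as the L channel, and the predicted a/b channels recombined and converted back to RGB before the structure-from-motion reconstruction.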

github.com/hayesman/MarsVR

VR Preview (Unreal Engine)

Above: Original Curiosity rover navigation images (Courtesy of NASA/JPL).

Below: Colors predicted using a convolutional neural network trained with real color images of Mars.

Meshed and textured 3D model of the Mars surface.

Motion Capture

A motion capture dance project by Eric Hayes and Jeff Snyder. Motion-capture data was recorded with a Vicon system and processed in Python; the visuals were created in Max/MSP using OpenGL, controlled via MIDI, and rendered in real time.
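
As a rough sketch of the Python processing stage (the export file name, column layout, and smoothing choice are assumptions, not the project's actual pipeline), marker trajectories from a Vicon CSV export could be loaded and smoothed like this:

```python
# Hypothetical sketch: read marker positions from a Vicon CSV export and
# smooth each coordinate track with a moving average before visualization.
import csv
import numpy as np

def load_markers(path):
    """Return the header and an (n_frames, n_columns) float array of coordinates."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)                      # column names, e.g. hip_x, hip_y, ...
        frames = [[float(v) for v in row] for row in reader if row]
    return header, np.array(frames)

def smooth(frames, window=5):
    """Moving-average filter applied independently to each coordinate column."""
    kernel = np.ones(window) / window
    return np.column_stack(
        [np.convolve(frames[:, i], kernel, mode="same") for i in range(frames.shape[1])]
    )

if __name__ == "__main__":
    header, frames = load_markers("vicon_export.csv")   # hypothetical file name
    smoothed = smooth(frames)
    print(f"{frames.shape[0]} frames, {len(header)} channels smoothed")
```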

Compositing