Metagenomic AR Visualization – Atlas In Silico

Worked on a team to realize a 60-monitor stereoscopic VR experience that gave users multimodal interaction with the millions of new protein sequences from Craig Venter's CAMERA project, which followed Darwin's path, sampling and sequencing the ocean at various points.

My role was to enable user interaction with the data without any physical device, just by moving hands (this was in 2007, before Kinect, Leap Motion, etc.).

In three weeks I built a functional hand-tracked cursor running at greater than 60 fps, as well as a driver to feed its input into the existing VR framework (COVISE).

Seven years later, when the project moved to UNT in Denton, TX, I set it all back up and mentored students who updated the interaction to use a new camera and a new display to support new exhibitions.

Project Site

Project Paper

Tools: C/C++, OpenCV, Computer Vision, OpenGL