Event Type: 
Section Meeting
Event Date: 
Thursday, May 23, 2013 - 18:30 to 21:30
2 Jack London Square
Oakland, CA 94607
United States
Event Details: 

I think it was ABC that coined the phrase ‘Up Close and Personal’ for their Olympic Games broadcasts, a phrase that aptly describes the viewer’s experience of watching a live event on TV. By the nature of the television screen’s size and the single point of focus of a lens, live event coverage offers a point of view rather than the immersive experience of attending the event in person.


Our meeting this month will cover methods and insights from a lab/business-unit collaborative project in multi-view imaging and display. The project seeks a better understanding of the potential of new immersive technologies by developing demonstrations and experiments within the context of corporate customer interests. These demonstrations center on using multi-viewpoint capture together with binocular and multiscopic displays for immersive -- very large scale -- 3D entertainment experiences.


Much of what will be described are lessons learned from setbacks as well as successes, as gear has been taken into the field for life-sized 3D capture and presentation to large audiences -- a dangerous place for unproven technologies. Working in an area where much is yet to be learned, and believing that a good way to change this is simply to go boldly, the experimenters have gathered insights and accolades in equal measure. Still, the technology is progressing, and its developers are confident that the goal is a worthwhile pursuit, if not an eventuality, with success promising huge commercial potential.


Our presenter Harlyn Baker has a long history in multiple-image computer vision, from early 3D modeling in Edinburgh, to stereo work in his PhD at Urbana-Champaign and at the Stanford AI Lab, a dozen years at SRI where he co-developed Epipolar Plane Image (EPI) Analysis, four years at Interval Research, and a dozen more at HP Labs.

Harlyn's EPI work, widely regarded as seminal, has been instrumental in most later developments in light field analysis, including ray space and hogel formulations. On leaving Interval Research, he co-founded TYZX. He joined HP Labs in 2000, where he designed and developed camera systems to support multi-view studies and demonstrated automultiscopic imaging and display systems for 3D interaction and immersive experiences; most recently he has been exploring how 3D capture can impact print photography.


All SMPTE section meetings are free to members and guests.


Please register (no charge) at Eventbrite.