
Doctor Q — Administrator, Staff member, Original poster
The School of Engineering at the University of California, San Diego has a cool new technology that provides live 3D walkthroughs of a remote location by stitching together, in real time, feeds from multiple still and/or video cameras at the scene. The more the system knows about the orientation of the various cameras (which can be camera phones, webcams, or helmet-mounted cameras), the better the results.
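To get a feel for why knowing camera orientation helps, here's a minimal sketch (not the UCSD system, just an illustration using OpenCV and made-up camera parameters): if you know each camera's intrinsics K and its rotation R relative to a reference camera, you can warp its frame into the reference view with the homography H = K·R·K⁻¹ and blend the results, instead of having to estimate that alignment from the images themselves.

```python
# Illustrative sketch only: warp one camera's frame into a reference camera's
# view using a known relative rotation, then do a naive blend.
# Assumes a purely rotational offset (or a distant scene); K and R are made up.
import numpy as np
import cv2


def rotation_homography(K: np.ndarray, R: np.ndarray) -> np.ndarray:
    """Homography mapping the second camera's pixels into the reference view.

    R maps ray directions in the second camera's frame into the reference
    camera's frame; K is the shared 3x3 intrinsic matrix.
    """
    return K @ R @ np.linalg.inv(K)


def warp_to_reference(image: np.ndarray, K: np.ndarray, R: np.ndarray,
                      out_size: tuple) -> np.ndarray:
    """Warp one camera's frame onto the reference camera's image plane."""
    H = rotation_homography(K, R)
    return cv2.warpPerspective(image, H, out_size)


if __name__ == "__main__":
    # Hypothetical setup: two 640x480 cameras, the second yawed 15 degrees.
    K = np.array([[500.0, 0.0, 320.0],
                  [0.0, 500.0, 240.0],
                  [0.0, 0.0, 1.0]])
    yaw = np.deg2rad(15.0)
    R = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                  [0.0, 1.0, 0.0],
                  [-np.sin(yaw), 0.0, np.cos(yaw)]])

    ref = np.zeros((480, 640, 3), np.uint8)         # stand-in for camera 0
    other = np.full((480, 640, 3), 128, np.uint8)   # stand-in for camera 1

    warped = warp_to_reference(other, K, R, (640, 480))

    # Naive blend: use warped pixels wherever they land, reference elsewhere.
    mask = warped.any(axis=2)
    panorama = ref.copy()
    panorama[mask] = warped[mask]
    cv2.imwrite("stitched.png", panorama)
```

With unknown orientations, a stitcher has to recover that homography from feature matches between views, which is slower and more fragile; that's presumably why the researchers get better results the more they know about each camera's pose.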

Here is a research news release.

The video (6:53 in length) at the bottom of the page is fun to watch (sorry, RealPlayer, not QuickTime), especially if (like me) you're familiar with the Price Center patio area at UCSD, where they did their experiment.