archive // 2006.08.03 08:44:23 [hh]
3D visualization: Microsoft shows first "Photosynth" preview
On its web site, Microsoft Live Labs is presenting an interesting new technology that automatically arranges photos so that a panoramic, three-dimensional view emerges.
From the developers' web site:
What is it?
With Photosynth you can:
- Walk or fly through a scene to see photos from any angle.
- Seamlessly zoom in or out of a photograph whether it's megapixels or gigapixels in size.
- See where pictures were taken in relation to one another.
- Find similar photos to the one you're currently viewing.
- Explore a custom tour.
- Send a collection to a friend.
We plan on releasing some collections for you to see and experience yourself in the very near future, so add us to your RSS reader and check back often for updates. We'd love to hear your ideas and suggestions to help us shape Photosynth over the months and years to come. In the meantime, take a look at the videos and learn a little bit more about how Photosynth works.
Curious about who we are? We're a team of Microsoft scientists and engineers on a mission to fundamentally rethink the internet experience, experiment with entirely new paradigms and change the world through ideas and software. Our founder, Dr. Gary Flake, started Live Labs in February 2006, and since then many members of the team have been working around the clock to develop their concepts into reality. Photosynth is just the start of what we hope will be a series of technologies that will change the way you use the internet.
How do you do it?
A Photosynth experience begins with nothing more than a bunch of digital photos. They might all have been taken by you, or they might be a mixture of images from many different cameras, shooting conditions, dates, times of day, resolutions, and so on.
Each photo is processed by computer vision algorithms to extract hundreds of distinctive features, like the corner of a window frame or a door handle. Then, photos that share features are linked together in a web. When a feature is found in multiple images, its 3D position can be calculated. It's similar to depth perception: what your brain does to perceive the 3D positions of things in your field of view based on their images in both of your eyes. Photosynth's 3D model is just the cloud of points showing where those features are in space.
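The triangulation step described above can be sketched in a few lines. This is an illustrative simplification, not Photosynth's actual code: it assumes the camera centers and the viewing rays toward one matched feature are already known, and recovers the feature's 3D position as the least-squares intersection of those rays.

```python
import numpy as np

def triangulate(centers, directions):
    """Least-squares intersection of rays.

    Each ray is a camera center plus a viewing direction toward the
    same matched feature.  The returned point minimizes the summed
    squared distance to all rays: the 3D position that best explains
    every camera's view of that feature.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centers, directions):
        d = np.asarray(d, float)
        d /= np.linalg.norm(d)
        # Projector onto the plane perpendicular to the ray; the distance
        # from a point x to the ray is ||P @ (x - c)||.
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ np.asarray(c, float)
    return np.linalg.solve(A, b)

# Two "eyes" two units apart, both looking at a feature at (0, 0, 5):
feature = np.array([0.0, 0.0, 5.0])
centers = [np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]
directions = [feature - c for c in centers]
print(triangulate(centers, directions))  # close to [0. 0. 5.]
```

With noisy real-world rays the same formula still works: the rays no longer intersect exactly, and the solver returns the point nearest to all of them.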
Your brain knows that your eyes are about two inches apart. But when Photosynth does its magic, it doesn't know where the cameras were, or which way they were pointing. Fortunately, when there are many cameras, and many features in common, the algorithms behind Photosynth can figure out not only where the features are in 3D, but where all of the cameras would have to have been, and which way they were aimed, consistent with the features they "saw".
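The camera-recovery idea can be seen in a toy setting. The sketch below is built on simplifying assumptions of my own (a 2D world with a 1D image, a camera facing a fixed direction, focal length 1); Photosynth's real algorithms solve the full 3D problem jointly for all cameras and features. The point it illustrates: once feature positions are known, each observation gives a constraint on where the camera must have been, and a handful of observations pin it down.

```python
import numpy as np

def resect(points, us):
    """Recover the camera position from known 2D feature positions and
    their observed 1D image coordinates.

    Toy model: the camera sits at (cx, cy), faces +y, and has focal
    length 1, so a feature at (px, py) appears at u = (px - cx) / (py - cy).
    Rearranging gives cx - u*cy = px - u*py, which is linear in the
    unknown camera position, so least squares solves it.
    """
    A = np.array([[1.0, -u] for u in us])
    b = np.array([px - u * py for (px, py), u in zip(points, us)])
    (cx, cy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return cx, cy

# A camera at (1, 0) looking along +y at three known features:
points = [(2.0, 3.0), (0.0, 4.0), (5.0, 2.0)]
us = [(px - 1.0) / py for px, py in points]  # simulated observations
print(resect(points, us))  # recovers approximately (1.0, 0.0)
```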
The Photosynth client shows you the 3D point cloud, but more importantly, it also shows you the original pictures overlaid on the model. Imagine a slide projector placed at each camera position, aimed where the camera was aiming, and projecting the picture that camera took. A screen is placed in the 3D environment at an appropriate distance from the projector. As you move around in the Photosynth environment, projectors turn on and off, giving you a changing perspective on a world built entirely out of the original photos.
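The "projectors turn on and off" behavior amounts to choosing, for the current virtual viewpoint, the photo whose camera was aimed most nearly the same way. A minimal sketch of that selection rule (the cosine-similarity criterion and the data layout are assumptions for illustration, not the actual client logic):

```python
import math

def best_photo(view_dir, camera_dirs):
    """Pick the photo whose camera direction best matches the viewer's.

    view_dir: the virtual viewer's current gaze direction (3-vector).
    camera_dirs: dict mapping photo id -> the direction that camera was
    aimed when the photo was taken (as recovered by the reconstruction
    described above).  Returns the id with the highest cosine
    similarity; as the viewer turns, the winner changes, i.e. one
    "projector" switches off and another switches on.
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.hypot(*a) * math.hypot(*b))
    return max(camera_dirs, key=lambda pid: cosine(view_dir, camera_dirs[pid]))

shots = {"front": (0.0, 0.0, 1.0), "left": (-1.0, 0.0, 0.2)}
print(best_photo((0.1, 0.0, 1.0), shots))  # prints "front"
```

A real client would blend between neighboring photos rather than switch hard, but the nearest-direction rule is the core of the effect.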
More information: labs.live.com/photosynth