== Extended View Toolkit ==

Authors: [[User:PeterV|Peter Venus]], Marian Weger, Cyrille Henry, Winfried Ritsch

Download full paper: [[Media:Extended View Toolkit.pdf]]
'''Project Site''': [http://extendedview.mur.at extendedview.mur.at], where the toolkit can be downloaded.

The '''Extended View Toolkit''' is a set of abstractions developed using Pd/GEM and the openGL capabilities of GEM.
The motivation for creating systems that can produce media with a wider projection area lies in our own visual and aural capabilities: nature gave us the ability to gather visual information over an angle of almost 180 degrees and to detect acoustic events in 360 degrees around us. Although relevant information is filtered so that there is a center of focus we can concentrate on, these abilities form our basic perception of our surroundings. Moreover, we tend to "turn our heads" towards an event that attracts our attention.
The source for the image-merging (stitching) part is generated with a multiple-camera system, where the cameras are horizontally aligned on a circle to cover a panoramic viewing area.
With the help of the toolkit, those sources can then be combined into one continuous image or video stream. The abstractions of the toolkit take care of edge-blending, as well as correcting the lens distortion caused by the optics and the arrangement of the cameras.
These features are implemented with openGL shaders in order to keep CPU usage to a minimum.
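The toolkit performs edge-blending on the GPU; the paper and patches define the exact blend curves it uses. As an illustration of the general idea only (not code from the toolkit), the following hypothetical Python sketch computes a gamma-corrected alpha ramp across the overlap zone between two projections, so that the summed light output of both projectors stays roughly constant:

```python
def blend_ramp(x, overlap=0.2, gamma=2.2):
    """Alpha for a horizontal position x in [0, 1], measured from the
    overlapping edge of one projection.

    The alpha ramps linearly from 0 to 1 across the overlap zone, then
    is raised to 1/gamma to compensate for the projector's gamma curve.
    Both `overlap` and `gamma` are illustrative values, not toolkit
    defaults.
    """
    t = min(max(x / overlap, 0.0), 1.0)   # linear ramp inside the overlap
    return t ** (1.0 / gamma)             # gamma-corrected blend factor
```

In a shader implementation the same function would run per fragment, with the ramp applied symmetrically on both projectors so their blended intensities sum to full brightness across the seam.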
The second part of this set of abstractions aims at the creation of multi-screen and multi-projector environments, not only to display a correct representation of the generated panoramic material, but also to enable the easy creation of immersive visual media environments and video mapping.
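Video mapping of this kind typically rests on a projective ("corner-pin") transform: four corners of a source image are dragged onto four points of the projection surface, and a 3x3 homography warps everything in between. The toolkit does this with GEM's openGL rendering; as a hypothetical illustration of the underlying math (not toolkit code), the transform can be solved with the standard direct linear transform (DLT) method:

```python
import numpy as np

def homography(src, dst):
    """Solve the 3x3 projective transform mapping four source corners
    (x, y) to four destination corners (u, v) via the DLT method."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null space of this 8x9 system: take the
    # right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(np.array(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so H[2, 2] == 1

def warp_point(H, x, y):
    """Apply the homography to one point (homogeneous divide)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

With four point pairs the solution is exact, so each source corner lands precisely on its destination; in openGL the same warp is obtained by drawing the texture on a quad under a matching projective transform.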
The Extended View Toolkit was originally developed at the IEM for the Comedia project "Extended View Streamed".
See also [[PDCON:Workshops/Gem|Extended View Toolkit workshop]]

<videoflash type="vimeo">37005750|700|400</videoflash>

{{Template:PdCon11}}