The source material for the image-merging (stitching) part is generated with a multiple-camera system in which the cameras are horizontally aligned on a circle to cover a panoramic viewing area.
With the help of the toolkit, those sources can then be combined into one continuous image or video stream. The toolkit's abstractions take care of the edge blending, as well as the correction of the lens distortion caused by the optics and the arrangement of the cameras.
Those features are implemented with OpenGL shaders in order to keep CPU usage at a minimum.
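As a rough illustration of what such a shader computes per pixel, the cross-fade inside the overlap band between two adjacent cameras and a one-coefficient radial lens-distortion correction can be sketched as follows. This is a minimal sketch only: the function names and the single-coefficient distortion model are illustrative assumptions, not the toolkit's actual shader code.

```python
def blend_weight(x, overlap_start, overlap_end):
    """Cross-fade weight for the left camera inside the overlap band:
    1.0 before the band, 0.0 after it, a linear ramp in between."""
    if x < overlap_start:
        return 1.0
    if x >= overlap_end:
        return 0.0
    t = (x - overlap_start) / (overlap_end - overlap_start)
    return 1.0 - t

def blend_pixels(left, right, x, overlap_start, overlap_end):
    """Merge one pixel value from each camera using the ramp above."""
    w = blend_weight(x, overlap_start, overlap_end)
    return w * left + (1.0 - w) * right

def undistort(x, y, k1):
    """One-coefficient radial (barrel/pincushion) correction around the
    image centre (0, 0); an assumed simplification of the usual
    multi-coefficient radial model."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale
```

For example, halfway through an overlap band from x = 0 to x = 10, `blend_pixels(1.0, 0.0, 5, 0, 10)` mixes the two cameras equally and returns 0.5. On the GPU, the same arithmetic runs per fragment, which is why the CPU load stays minimal.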
The second part of this set of abstractions aims at the creation of multi-screen and multi-projector environments, not only to display a correct representation of the generated panoramic material, but also to enable the easy creation of immersive visual media environments and video mapping.
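One way to picture the multi-projector side is as slicing the panoramic texture into overlapping horizontal spans, one per projector. A minimal sketch, assuming n identical projectors in a row with a fixed fractional overlap (the function name and this layout model are hypothetical, not taken from the toolkit):

```python
def projector_span(i, n, overlap):
    """Horizontal texture span (u0, u1), in 0..1, shown by projector i of n,
    where `overlap` is the fraction of each projector's width shared with
    its neighbour (0.0 = hard edges)."""
    # Each projector covers width w; adjacent spans share overlap * w,
    # so n * w - (n - 1) * overlap * w must equal 1.
    w = 1.0 / (n - (n - 1) * overlap)
    u0 = i * w * (1.0 - overlap)
    return u0, u0 + w
```

For two projectors with 50 % overlap this yields the spans [0, 2/3] and [1/3, 1]: the shared middle third is where an edge-blending ramp would keep the projected brightness even.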