3D Extruded Live Video
Project Summary

This study explored processing a live video stream to simulate three dimensions. The luminosity of each video pixel was analyzed and mapped to the depth of individual cells in an array of extruded rectilinear volumes. This mapping of luminosity to depth is analogous to the height-map (displacement) technique most 3D rendering engines use to render physical surface texture.
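The core mapping can be sketched in plain Java (the original was a Processing sketch; the cell size, depth range, and Rec. 601 luminance weights here are illustrative assumptions, not values from the original project):

```java
// Sketch of luminosity-to-depth mapping: average the brightness of each
// cell of a video frame and convert it to an extrusion depth.
public class LuminosityDepth {
    // Perceptual luminance of a packed 0xRRGGBB pixel (Rec. 601 weights), 0..255.
    static double luminance(int rgb) {
        int r = (rgb >> 16) & 0xFF;
        int g = (rgb >> 8) & 0xFF;
        int b = rgb & 0xFF;
        return 0.299 * r + 0.587 * g + 0.114 * b;
    }

    // Map each cell's average luminance to an extrusion depth in [0, maxDepth].
    // pixels is a flat row-major RGB frame; width must be divisible by cellSize.
    static double[] cellDepths(int[] pixels, int width, int cellSize, double maxDepth) {
        int height = pixels.length / width;
        int cols = width / cellSize, rows = height / cellSize;
        double[] depths = new double[cols * rows];
        for (int cy = 0; cy < rows; cy++) {
            for (int cx = 0; cx < cols; cx++) {
                double sum = 0;
                for (int y = 0; y < cellSize; y++)
                    for (int x = 0; x < cellSize; x++)
                        sum += luminance(pixels[(cy * cellSize + y) * width
                                                + cx * cellSize + x]);
                double avg = sum / (cellSize * cellSize);      // 0..255
                depths[cy * cols + cx] = (avg / 255.0) * maxDepth;
            }
        }
        return depths;
    }

    public static void main(String[] args) {
        // Tiny 2x2 "frame": one white pixel, three black pixels.
        int[] frame = {0xFFFFFF, 0x000000, 0x000000, 0x000000};
        double[] d = cellDepths(frame, 2, 1, 100.0);
        System.out.println(d[0] + " " + d[3]); // bright cell extrudes, dark cell stays flat
    }
}
```

In the live sketch this mapping would run once per captured frame, with each depth value driving the length of one extruded box in the grid.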

The concept and code are original, written in Processing with its built-in video library. Note that only a standard webcam was used, and that the study took place more than a year before the Microsoft Kinect was released.