Project Summary

This study explored processing a live video stream to simulate three dimensions. The luminosity of each video pixel was analyzed and mapped to the depth of an individual cell in an array of extruded rectilinear volumes. This technique of mapping luminosity to depth closely parallels how physical textures are rendered in most 3D rendering engines.

Concept and code are original, using Processing and its built-in video library. Note that only a standard webcam was employed in this project, and that the study took place more than a year before the Microsoft Kinect was released.
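The core luminosity-to-depth mapping can be sketched in plain Java (Processing's underlying language). The names, weights, and maximum depth below are illustrative assumptions, not the project's actual code: it computes a perceptual luminance for each packed RGB pixel and scales it linearly onto a cell's extrusion depth.

```java
public class LuminosityDepth {
    // Hypothetical maximum extrusion depth for a cell, in arbitrary units.
    static final float MAX_DEPTH = 100.0f;

    // Perceptual luminance (0-255) from a packed 0xRRGGBB pixel,
    // using the common Rec. 601 luma weights.
    static float luminance(int rgb) {
        int r = (rgb >> 16) & 0xFF;
        int g = (rgb >> 8) & 0xFF;
        int b = rgb & 0xFF;
        return 0.299f * r + 0.587f * g + 0.114f * b;
    }

    // Map a pixel's luminance linearly onto cell depth [0, MAX_DEPTH]:
    // brighter pixels extrude further toward the viewer.
    static float depthForPixel(int rgb) {
        return (luminance(rgb) / 255.0f) * MAX_DEPTH;
    }

    public static void main(String[] args) {
        System.out.println(depthForPixel(0xFFFFFF)); // white: full depth
        System.out.println(depthForPixel(0x000000)); // black: zero depth
    }
}
```

In a Processing sketch, the same function would run once per grid cell each frame, sampling the downscaled video pixel behind that cell.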

Project Overview

Is there a simple way to visualize a massive dataset, such as 80+ years of stock market data? This programming project investigated that question by visualizing Dow Jones Industrial Average data for instantaneous, at-a-glance analysis.

Link to the LIVE INTERACTIVE PROGRAM.
Concept and Code by Digital Noah. Programmed with Processing.
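One prerequisite for plotting decades of prices in a single view is rescaling the series to fit the screen. The sketch below is an assumed, minimal approach in plain Java, not the project's actual code: it linearly normalizes a series of closing values to the range [0, 1], which can then be multiplied by the plot height.

```java
import java.util.Arrays;

public class SeriesNormalizer {
    // Linearly rescale a price series to [0, 1] so that decades of
    // data fit any plot height. Names and approach are illustrative.
    static float[] normalize(float[] closes) {
        float min = Float.POSITIVE_INFINITY;
        float max = Float.NEGATIVE_INFINITY;
        for (float c : closes) {
            min = Math.min(min, c);
            max = Math.max(max, c);
        }
        // Guard against a flat series to avoid dividing by zero.
        float range = (max > min) ? (max - min) : 1.0f;
        float[] out = new float[closes.length];
        for (int i = 0; i < closes.length; i++) {
            out[i] = (closes[i] - min) / range;
        }
        return out;
    }

    public static void main(String[] args) {
        // Illustrative values spanning the DJIA's historical scale.
        float[] demo = {41.22f, 381.17f, 10787.0f};
        System.out.println(Arrays.toString(normalize(demo)));
    }
}
```

Because long-running index data spans several orders of magnitude, a real visualization might normalize on a logarithmic scale instead; the linear version is shown for brevity.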

Video Demonstration