Currently, capturing 3D footage requires two lenses that record a subject from slightly different perspectives. A technique developed by Kenneth Crozier and Anthony Orth at SEAS achieves the same result using software: an algorithm creates a 3D movie from two pictures taken with a stationary camera at different focus depths.
A mathematical model calculates the angle at which light strikes each pixel. It does this by comparing the slight differences between two images taken from the same position but focused at different depths. The two images can then be stitched together into an animation that gives the impression of a stereo view.
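As a rough illustration of how such an algorithm might work, the sketch below (Python with NumPy) estimates a per-pixel average ray angle from two images focused at different depths, assuming a transport-of-intensity-style continuity relation solved as a Poisson equation in Fourier space, and then resamples the image along those angles to synthesize a slightly shifted viewpoint. The function names, the continuity assumption, and the nearest-neighbor resampling are illustrative choices, not details taken from the researchers' published method.

```python
import numpy as np

def estimate_ray_angles(img_a, img_b, dz, eps=1e-6):
    """Estimate per-pixel average ray angles from two photos of the same
    scene focused at depths separated by dz.

    Illustrative sketch only: assumes the focal-stack difference obeys a
    continuity relation dI/dz = -div(I * M), solved for the moment field M
    via an FFT-based Poisson solver. 'eps' is an arbitrary smoothing term.
    """
    I = 0.5 * (img_a + img_b)           # mean intensity of the two shots
    dI_dz = (img_b - img_a) / dz        # finite-difference focal derivative

    # Solve laplacian(U) = -dI/dz for a scalar potential U with I*M = grad(U)
    ny, nx = I.shape
    ky = np.fft.fftfreq(ny)[:, None] * 2 * np.pi
    kx = np.fft.fftfreq(nx)[None, :] * 2 * np.pi
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                      # avoid division by zero at DC
    U = np.real(np.fft.ifft2(np.fft.fft2(-dI_dz) / (-k2)))

    gy, gx = np.gradient(U)             # grad(U) = I * M
    Mx = gx / (I + eps)                 # mean ray angle per pixel, x component
    My = gy / (I + eps)                 # mean ray angle per pixel, y component
    return Mx, My

def shifted_view(img, Mx, My, angle_x, angle_y):
    """Resample the image as if seen from a slightly different angle by
    shifting each pixel along its estimated mean ray direction."""
    ny, nx = img.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    src_x = np.clip(xx + angle_x * Mx, 0, nx - 1).astype(int)
    src_y = np.clip(yy + angle_y * My, 0, ny - 1).astype(int)
    return img[src_y, src_x]
```

Sweeping the viewing angles back and forth over a small range and rendering a shifted view for each step would produce the kind of wobbling parallax animation described above.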
Dubbed "light-field moment imaging," the technique allows single-lens cameras to produce 3D images. At the moment, the camera aperture must be wide enough to let in light from a wide range of angles. This rules out a smartphone lens, but a standard 50 mm lens on a single-lens reflex (SLR) camera does the job.