Touch-Holographic interaction
A new generation of autostereoscopic (glasses-free 3D) displays is hitting the market. Like their predecessors, these displays send a separate image to each eye, but the new generation adds a built-in eye-tracking camera, so the 3D imagery can respond to the viewer's position fast enough to create the illusion of a hologram.
Several of these displays look like normal 3D monitors, but I think the best one is the Sony Spatial Reality Display (SRD), because its design forces the illusion of a volumetric box that the 3D content sits inside.
Sony provides plugins for Unreal Engine and Unity, but our team found them cumbersome for prototyping with rapidly changing content and for integrating different sensors and data sources.
Process: I wanted to drive the SRD from TouchDesigner to enable faster development and easier integration with other technologies. I contacted Sony to request API access and put together a small C++ bridge application that interfaces with the Sony API and sends the head and eye positions to TouchDesigner via OSC.
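To give a feel for the bridge's output side, here is a minimal sketch of hand-encoding an OSC message carrying one eye's x/y/z position. The address pattern `/srd/eyeL` is a hypothetical name chosen for illustration, not Sony's or the real bridge's naming; the encoding itself (NUL-padded strings, a `,fff` type tag, big-endian floats) follows the OSC 1.0 wire format.

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Append a string plus NUL terminator, padded to a multiple of 4 bytes
// as the OSC wire format requires.
static void pushPadded(std::vector<uint8_t>& buf, const std::string& s) {
    buf.insert(buf.end(), s.begin(), s.end());
    buf.push_back(0);
    while (buf.size() % 4 != 0) buf.push_back(0);
}

// Append a 32-bit float in big-endian byte order (OSC wire format).
static void pushFloat(std::vector<uint8_t>& buf, float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof(bits));
    buf.push_back(uint8_t(bits >> 24));
    buf.push_back(uint8_t(bits >> 16));
    buf.push_back(uint8_t(bits >> 8));
    buf.push_back(uint8_t(bits));
}

// Build a complete OSC message holding one eye position.
std::vector<uint8_t> buildEyeMessage(const std::string& address,
                                     float x, float y, float z) {
    std::vector<uint8_t> buf;
    pushPadded(buf, address);  // address pattern, e.g. "/srd/eyeL"
    pushPadded(buf, ",fff");   // type tag string: three float32 arguments
    pushFloat(buf, x);
    pushFloat(buf, y);
    pushFloat(buf, z);
    return buf;                // ready to send as a single UDP datagram
}
```

In practice each packet would go out over a plain UDP socket to the port that a TouchDesigner OSC In CHOP is listening on, once per tracking update.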
TouchDesigner handles the perspective-correct rendering. To make the illusion accurate, I modeled the physical display at the correct scale in 3D and created virtual cameras whose positions track the viewer's left and right eyes. Because the display is angled upward, the rendering must account for the angle between the screen plane and the viewer's line of sight, which means each eye gets an off-axis (asymmetric) projection. The rendered left- and right-eye images are interleaved by the SRD to create the 3D illusion.
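The off-axis projection can be sketched as follows, following the standard generalized-perspective-projection construction: three physical screen corners and the tracked eye position, all in the same scene units, determine an asymmetric view frustum. Because the corners can sit anywhere in space, a tilted panel like the SRD's needs no special casing. The corner coordinates in the comments are illustrative, not the SRD's real dimensions.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static Vec3 normalize(Vec3 v) {
    double len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Near-plane extents of the asymmetric frustum, usable with any
// glFrustum-style projection matrix.
struct Frustum { double left, right, bottom, top; };

// pa = lower-left, pb = lower-right, pc = upper-left screen corner;
// pe = tracked eye position; nearZ = near clip distance.
Frustum offAxisFrustum(Vec3 pa, Vec3 pb, Vec3 pc, Vec3 pe, double nearZ) {
    Vec3 vr = normalize(sub(pb, pa));    // screen right axis
    Vec3 vu = normalize(sub(pc, pa));    // screen up axis
    Vec3 vn = normalize(cross(vr, vu));  // screen normal, toward the viewer
    Vec3 va = sub(pa, pe);               // eye to each corner
    Vec3 vb = sub(pb, pe);
    Vec3 vc = sub(pc, pe);
    double d = -dot(va, vn);             // eye-to-screen-plane distance
    // Project the corner offsets onto the screen axes, scaled to the
    // near plane: this is what makes the frustum shift as the eye moves.
    return { dot(vr, va) * nearZ / d,
             dot(vr, vb) * nearZ / d,
             dot(vu, va) * nearZ / d,
             dot(vu, vc) * nearZ / d };
}
```

With an eye centered in front of the screen the frustum comes out symmetric; as the tracked eye moves off-center the extents skew, which is exactly what keeps the virtual box locked to the physical display. Running this once per eye, per frame, with the tracked positions yields the two views the SRD interleaves.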