With natural binocular vision, our eyes view objects from slightly different angles, producing a pair of distinct perspectives that our brains then fuse into a single 3-D image. The Mono-glass system, developed by a team from the University of Yamanashi, replicates this process using commercially available components as stand-ins for the non-functional fleshy bits.
The current design iteration of Mono-glass relies on Wrap 920AR augmented reality glasses, normally used for working in Autodesk 3ds Max, to act as artificial eyes, generating images with a pair of integrated cameras. The team's custom software then processes this information to calculate the relative distance of each item in the field of view and synthesize the data into a single image. That image is then displayed to the user's functioning eye, with close objects appearing in focus while progressively distant items grow increasingly blurry.
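The core trick described above, rendering a scene so that blur increases with distance, can be sketched in a few lines. The snippet below is a minimal illustration, not the team's actual software: it assumes a per-pixel depth map (which Mono-glass derives from its stereo camera pair) has already been computed and normalized to [0, 1], and it simply applies a stronger box blur to farther depth bands before compositing. The function names, band thresholds, and kernel sizes are all hypothetical choices for the example.

```python
import numpy as np

def box_blur(img, k):
    """Naive separable box blur with odd window size k; k=1 returns a copy."""
    if k <= 1:
        return img.copy()
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    kernel = np.ones(k) / k
    # Blur along rows, then along columns.
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, out)
    return out

def depth_dependent_blur(img, depth, bands=(0.33, 0.66), kernels=(1, 5, 9)):
    """Composite the image from blur layers: nearer pixels stay sharp,
    farther pixels (higher depth values) get progressively larger blurs.

    img:   2-D grayscale array.
    depth: array of the same shape, normalized so 0 = near, 1 = far.
    """
    out = np.empty_like(img, dtype=float)
    edges = (0.0,) + bands + (1.0001,)  # upper bound slightly >1 to include depth == 1
    for i, k in enumerate(kernels):
        mask = (depth >= edges[i]) & (depth < edges[i + 1])
        out[mask] = box_blur(img, k)[mask]
    return out

# Example: a noisy test image whose right half is "far away".
rng = np.random.default_rng(0)
img = rng.random((20, 20))
depth = np.zeros((20, 20))
depth[:, 10:] = 0.9
result = depth_dependent_blur(img, depth)
```

In this toy run, the left (near) half of `result` is untouched while the right (far) half is smoothed, which is exactly the monocular depth cue the article describes: the viewer's single eye infers distance from how defocused each object looks.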