Currently, Project Glass uses a touch pad that runs down the side of one of its arms. Trouble is, that means reaching up every time you need to adjust a setting. This idea, though, would use a laser projector to throw a control pad onto any surface that you're looking at: wall, desk, arm, whatever. Then, a small camera would interpret finger movements in the region of those buttons and turn them into commands. Simple.
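Google hasn't published any implementation details, but the core idea, a camera watching for a fingertip inside projected button regions and translating hits into commands, is easy to sketch. The snippet below is a minimal, hypothetical Python illustration: the button names, coordinates, and the `fingertip_to_command` helper are all assumptions made for the example, and it presumes the fingertip position has already been mapped from camera pixels into the projected pad's coordinate frame (e.g. via a homography).

```python
# Hypothetical sketch: map a camera-detected fingertip position onto a
# projected "control pad" and emit a command. Everything here (button
# layout, names, coordinates) is an illustrative assumption, not anything
# taken from Google's patent or code.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Button:
    name: str   # command to emit when this button is "pressed"
    x: float    # left edge, in projected-surface coordinates (0..1)
    y: float    # top edge
    w: float    # width
    h: float    # height

    def contains(self, px: float, py: float) -> bool:
        # Simple axis-aligned hit test against the projected rectangle.
        return (self.x <= px <= self.x + self.w
                and self.y <= py <= self.y + self.h)


# A toy control pad thrown onto whatever surface the projector targets.
CONTROL_PAD = [
    Button("volume_up",   0.0, 0.0, 0.3, 0.3),
    Button("volume_down", 0.0, 0.4, 0.3, 0.3),
    Button("take_photo",  0.4, 0.0, 0.3, 0.7),
]


def fingertip_to_command(px: float, py: float) -> Optional[str]:
    """Return the command under the fingertip, or None if it missed.

    (px, py) is assumed to already be transformed from camera pixels
    into the projector's surface coordinate frame.
    """
    for button in CONTROL_PAD:
        if button.contains(px, py):
            return button.name
    return None


if __name__ == "__main__":
    # Simulated fingertip detections from the camera pipeline.
    for touch in [(0.1, 0.1), (0.1, 0.5), (0.9, 0.9)]:
        print(touch, "->", fingertip_to_command(*touch))
```

The real trick, of course, is everything the sketch assumes away: finding the fingertip in the camera feed and keeping the projected pad registered to a surface that moves with your head.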
This is well-trodden ground, of course: there have been oh-so-many laser projectors designed to throw a keyboard onto a desk in front of you. But they were static; Google's offering would be much trickier to pull off and use.