Exploring Optical Projection as a Metaphor for Multi-Device Interaction
Portable projectors in mobile devices provide a promising way to overcome screen-space limitations on handhelds, navigate information, or augment reality. One of their appeals is the simplicity of interaction: Aiming at an appropriate surface projects the image, and changing posture and direction adjusts the image's position and orientation. This behavior is purely based on optics, allowing us to intuitively grasp it based on our own experience with the physical world. However, strict adherence to the laws of physics also has its drawbacks: The intensity of light varies with the projector's distance to the surface, and the projected image is tightly coupled to the projector's movement.
In this paper, we apply the metaphor of optical projection to digital surfaces in the environment. We use a handheld device, tracked in 6 DOF, to support Virtual Projection (VP) on one or more displays. The simulated nature of VP allows us to address some of the limitations of optical projection, avoiding unwanted distortions, jitter, and intensity variations, and eliminating the need to continually point the projector at the surface on which it is projecting. This also frees the frustum so that it can be used for selecting areas, either for navigation or for applying filters.
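The geometric core of this metaphor is intersecting the handheld's view frustum with a planar display to obtain the projected quadrilateral. As a minimal sketch (not the paper's implementation; all function and parameter names here are illustrative), the four corner rays of a symmetric frustum, defined by a 6-DOF pose and a field of view, can be intersected with the display plane:

```python
import math

# Small 3D vector helpers (plain lists, no external dependencies).
def sub(a, b): return [a[i] - b[i] for i in range(3)]
def add(a, b): return [a[i] + b[i] for i in range(3)]
def scale(v, s): return [x * s for x in v]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def norm(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def frustum_quad_on_plane(pos, forward, up, fov_deg, aspect, plane_p, plane_n):
    """Intersect the four corner rays of a projector frustum with a planar
    display. pos/forward/up give the 6-DOF pose; plane_p is any point on the
    display plane and plane_n its normal. Returns the 3D corners of the
    projected quadrilateral. Illustrative sketch only; assumes the frustum
    actually faces the plane (no divide-by-zero or behind-plane handling)."""
    f = norm(forward)
    right = norm(cross(f, up))
    true_up = cross(right, f)
    h = math.tan(math.radians(fov_deg) / 2)  # half-height at unit distance
    w = h * aspect                           # half-width at unit distance
    quad = []
    for sx, sy in [(-1, -1), (1, -1), (1, 1), (-1, 1)]:
        ray = add(f, add(scale(right, sx * w), scale(true_up, sy * h)))
        # Ray-plane intersection: pos + t * ray lies on the display plane.
        t = dot(sub(plane_p, pos), plane_n) / dot(ray, plane_n)
        quad.append(add(pos, scale(ray, t)))
    return quad
```

Because the quad is computed from the tracked pose rather than by real light transport, the mapping can be frozen (decoupling the projection from the projector) or modified, e.g., to suppress jitter or keystone distortion, before the content is rendered on the target display.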
Our work makes several contributions: (1) We explore the implications of VP as an interaction technique and show how decoupling the projection from the projector and adjusting transformations can improve interaction. (2) We describe relevant characteristics of VP. (3) We present an implemented software framework for creating VP applications for consumer smartphones that does not require external tracking, and show exemplary use cases. (4) We report on a user study comparing VP with both absolute and relative techniques for content placement using a handheld device. Our findings suggest that VP is especially suitable for complex (i.e., translate-scale-rotate) projections.